Automating SQL and PL/SQL Deployments using Liquibase

You’ll have heard me banging on about automation, but one subject that’s been conspicuous by its absence is the automation of SQL and PL/SQL deployments…

I had heard of some products that might work for me, like Flyway and Liquibase, but couldn’t really make up my mind or find the time to start learning them. Next thing I knew, SQLcl got Liquibase built in, so I figured that was probably the decision made for me in terms of product. This also coincided with discussions about making a deployment pipeline for APEX applications, which kind-of focused me. It’s sometimes hard to find the time to learn something when there is not a pressing demand for it…

Despite thinking I would probably be using the SQLcl implementation, I started playing with the regular Liquibase client first. Kind of like starting at the grass roots. If you are working in a mixed environment, you might prefer to use the regular client, as it will work with multiple database engines.

Once I had found my feet with that, I essentially rewrote the article to use the SQLcl implementation of Liquibase. If you are focused on Oracle, I think this is better than using the standard client.

Both of these articles were written more than 3 months ago, but I held off publishing them for a couple of reasons.

  1. I’m pretty new to this, and I realise some of the ways I’m suggesting to use these tools don’t fall in line with how many Liquibase users would want to use them. I’m not trying to make out I know better, but I do know what will suit me. I don’t like defining all objects as XML, and the formatted SQL changelogs don’t feel like a natural way to work. I want the developer to do their job in their normal way as much as possible. That means using DDL, DML and PL/SQL scripts, along the lines of the changelog sketch after this list.
  2. I thought there was a bug in one aspect of the SQLcl implementation, but thanks to Jeff Smith, I found out it was a problem between my keyboard and seat. 🙂
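
To give you a feel for what I mean by that last point about scripts, here is a minimal sketch of the sort of changelog I have in mind. The changeset IDs, author and file names are all made up for the example, but the point is the changelog just points at regular scripts, which the developer writes in the normal way.

    <?xml version="1.0" encoding="UTF-8"?>
    <databaseChangeLog
      xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
                          http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">

      <!-- A regular DDL script, written the way the developer normally would. -->
      <changeSet id="jira-1234-orders-table" author="tim">
        <sqlFile path="ddl/orders_table.sql" relativeToChangelogFile="true"/>
      </changeSet>

      <!-- A PL/SQL package. splitStatements is turned off so the whole file is
           sent as a single statement, which means the script must contain just
           one statement and must not end with a "/" on its own line. -->
      <changeSet id="jira-1234-order-api" author="tim" runOnChange="true">
        <sqlFile path="plsql/order_api.sql" relativeToChangelogFile="true"
                 splitStatements="false"/>
      </changeSet>

    </databaseChangeLog>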

With a little cajoling from Jeff, I finally released them last night, then found a bunch of typos that quickly got corrected. Why are those never visible until you hit the publish button? 🙂

The biggest shock for most people will probably be that it’s not magic! I’m semi-joking there, but I figure a lot of people assume these products will solve their problems for them, and they don’t. Both Flyway and Liquibase provide a tool set to help you, but ultimately you are going to need to modify the way you work. If you are doing random-ass stuff with no discipline, automation is never going to work for you, regardless of the products you use. If you are prepared to work with some discipline, then tools like Liquibase can help you build the type of automated deployment pipelines you see all the time with other languages and tech stacks.

The ultimate goal is to be able to progress code through environments in a sensible way, making sure all environments are left in the same state, and allowing someone to do that promotion of code without having to give them loads of passwords etc. You would probably want a commit to a branch of your source control to trigger this.

So looking back at the APEX deployments, we might think of something like this.

  • A developer finishes their work and exports the current application using APEXExport. It doesn’t have to be that tool, but humans have a way of screwing things up, so having a guaranteed export mechanism makes sense.
  • Code gets checked into your source control. This includes any DDL, DML, packages, and of course the APEX application script.
  • A new changelog is created for the piece of work, pulling in any necessary scripts, including DDL and DML, as well as the APEX script, all in the correct order. That new changelog is then added to the master changelog, and both are committed to source control (see the master changelog sketch after this list).
  • That commit of the changelog, or more likely a merge into a branch, triggers the deployment automation.
  • A build agent pulls down the latest source, which will probably include stuff from multiple repositories, then applies it with Liquibase, using the changelog to tell it what to do.
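
As a rough sketch of what the changelog side of that might look like, the master changelog can simply include one changelog per piece of work, in the order they should be applied. The file names here are just made up for the example, and each included file would be a per-work changelog along the lines of the sketch earlier in the post.

    <!-- master.changelog.xml -->
    <databaseChangeLog
      xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
                          http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">

      <include file="changes/jira-1230.changelog.xml" relativeToChangelogFile="true"/>
      <include file="changes/jira-1234.changelog.xml" relativeToChangelogFile="true"/>

    </databaseChangeLog>

The build agent would then connect to the target database and run an update against the master changelog, using either the standard Liquibase client or the liquibase (lb) command in SQLcl. The exact command line flags differ between versions, so check the docs for the client you are using.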

That sounds pretty simple, but depending on your company and how you work, that might be kind-of hard.

  • The master changelog effectively serialises the application of changes to the database, so it has to be managed carefully. If changes are applied out of order, or clash with another developer’s work, that has to be resolved, and it’s not always a simple process.
  • You will need something to react to commits and merges in source control. In my company we use TeamCity, and I’ve also used GitLab Pipelines to do this type of thing, but if you don’t have any background in these automation tools, then that part of the automation is going to be a learning curve.
  • We also have to consider how we handle actions from privileged accounts. Not all changes in the database are made by the same user (see the sketch after this list).
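
To make that last point a bit more concrete, the sketch below shows the kind of thing a build agent step might end up doing. The connection details, environment variables and changelog names are all made up for the example, and the exact SQLcl Liquibase flags vary a little between versions, so treat it as an outline rather than something to copy verbatim.

    #!/bin/bash
    # Assumes the agent has already checked out the repo and cd'd into it,
    # and that the CI tool injects the passwords as secret variables.

    # 1) Privileged changes (grants, directories etc.) live in their own
    #    changelog and are applied with an admin connection.
    printf 'lb update -changelog admin.changelog.xml\nexit\n' | \
      sql -s admin_user/"${ADMIN_PASS}"@//db-server:1521/pdb1

    # 2) Everything else is applied as the application schema owner.
    printf 'lb update -changelog master.changelog.xml\nexit\n' | \
      sql -s app_owner/"${APP_PASS}"@//db-server:1521/pdb1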

Probably the biggest factor is the level of commitment you need as a team. It’s a culture change and everyone has to be on board with this. One person manually pushing their stuff into an environment can break all your hard work.

I’m toying with the idea of doing a series of posts to demonstrate such a pipeline, but it’s kind-of difficult to know how to pitch it without making it too specific, or too long and boring. 🙂

Cheers

Tim…


5 thoughts on “Automating SQL and PL/SQL Deployments using Liquibase”

  1. I for one would love to see a demonstration. I want to set something up like this, but I’m not sure how to start. We already put our changes into install scripts that call and compile packages. The DML and DDL are just statements in a single file, and they are not versioned. Then the deployment comes to me, the DBA, to run manually in production. I want to automate that part, as long as the changes have been approved.

  2. Hi, did you ever get this working with APEX and Liquibase? The APEX export from the APEX dev environment is not compatible with Liquibase. It is full of SQL*Plus commands.

  3. Krisztian: This is still a work in progress. Options at the moment are:

    – Split the files up manually and create changelogs. This seems like a bad idea.
    – Add in a unique separator between blocks and separate on that. This is still a manual edit of the files so I don’t like it, but it does work.
    – Wait for the next release of Liquibase. It will allegedly deal with normal SQL & PL/SQL scripts typically used by Oracle people. I’m not sure if this will make it into the SQLcl version, or just stay in the enterprise Liquibase client.

    I’m still hoping one of two things will happen.

    – SQLcl will get a custom tag for Oracle scripts. There does seem to be one in the docs, but nobody can tell me how to use it.
    – The APEX team produce an export format that is Liquibase compatible. A single anonymous block.

    Cheers

    Tim…

  4. Thanks Tim.

    There is a widely used separator regexp to split blocks, so that is not the problem. To deploy code we use endDelimiter="\n/\s*\n|\n/\s*$" and splitStatements="true", which solves the problem of having the package body and spec in the same .sql file.
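
    In an XML changelog the changeset looks something like this (the id, author and file name are just placeholders):

        <changeSet id="order_api" author="krisztian" runOnChange="true">
          <sqlFile path="plsql/order_api.sql" relativeToChangelogFile="true"
                   splitStatements="true"
                   endDelimiter="\n/\s*\n|\n/\s*$"/>
        </changeSet>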

    I think that would work for the APEX export file too. I will try to remove the SQL*Plus commands manually and give it a go. I have put this question to the Oracle APEX forum, so maybe someone will have an idea.

    Cheers
