
CI/CD for Database Deployment


Hello Community Members,

Looking for some help with setting up CI/CD deployment for Exasol database schemas. I'm trying to set up something similar to SQL Server SSDT, where we have the option to compare the project and the database to work out the differences. I have a database deployed to production, and I also have the code in a repo which I'm deploying to Docker, so I have the latest version in Docker and the live version in production.


I'm looking for a script or tool which can compare these two versions on the two servers and return the differences, so that I can use the diff script to deploy. How are others doing CI/CD with an Exasol database? Any input is appreciated.

I've been struggling to get the schemas deployed via a pipeline.




We have CREATE <OBJECT> SQL stored as plain-text files and a small custom script.

The script parses the SQL text and compares it with the objects actually present in the database. Views, functions, and scripts are compared as text. Tables are compared on a per-column basis.

The script produces SQL for ALTERs and UPDATEs, which can be applied automatically during the release process. However, we still prefer to alter big tables manually to double-check the operation and avoid mistakes.
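For anyone wanting to build something similar, here is a rough sketch of what such a per-column comparison could look like. This is my own illustration in Python, not the PHP script mentioned here; in a real setup the live column types would be read from Exasol's EXA_ALL_COLUMNS system view rather than hard-coded, and the schema/table/column names below are invented:

```python
# Sketch: diff a table definition from the repo against the live database
# and emit ALTER TABLE statements to bring the database in line.
# Both inputs map column name -> SQL type string.

def diff_table(schema, table, project_cols, db_cols):
    """Return ALTER TABLE statements turning db_cols into project_cols."""
    stmts = []
    for name, sql_type in project_cols.items():
        if name not in db_cols:
            # Column exists in the project but not in the database.
            stmts.append(
                f'ALTER TABLE "{schema}"."{table}" ADD COLUMN "{name}" {sql_type};')
        elif db_cols[name] != sql_type:
            # Column exists in both, but the type differs.
            stmts.append(
                f'ALTER TABLE "{schema}"."{table}" MODIFY COLUMN "{name}" {sql_type};')
    for name in db_cols:
        if name not in project_cols:
            # Column exists only in the database; big tables like this are
            # the ones we would still review manually before dropping.
            stmts.append(
                f'ALTER TABLE "{schema}"."{table}" DROP COLUMN "{name}";')
    return stmts

# Hypothetical repo-side and live-side definitions:
project = {"ID": "DECIMAL(18,0)", "NAME": "VARCHAR(200)", "CREATED": "TIMESTAMP"}
live    = {"ID": "DECIMAL(18,0)", "NAME": "VARCHAR(100)"}

for stmt in diff_table("ANALYTICS", "CUSTOMERS", project, live):
    print(stmt)
```

The resulting statements can then be collected into a release script and either applied automatically or reviewed first for the risky cases.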

In theory, I could publish this script, but it is implemented in PHP and very heavily customised for our use cases and database structure. For example, we use custom CONNECTION objects and Java UDFs a lot. I am not sure it would help.

Team Exasol

Maybe Sqitch is something you might want to try out?
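To give a feel for the Sqitch workflow: it versions each schema change as a deploy/revert/verify trio of SQL files and tracks what has been applied per target database. A rough sketch (project and change names are made up; this assumes Sqitch is installed with its Exasol engine, and the connection URI would need your real host and credentials):

```shell
# Start a Sqitch project targeting Exasol
sqitch init reporting --engine exasol

# Register a change; Sqitch creates deploy/revert/verify SQL stubs to fill in
sqitch add create_reporting_schema -n 'Add the REPORTING schema'

# Apply pending changes to a target, then run the verify scripts
sqitch deploy db:exasol://user:password@exasol-host/
sqitch verify db:exasol://user:password@exasol-host/
```

This takes care of the "what has already been deployed where" bookkeeping, though it works from ordered change scripts rather than the state-based compare that SSDT does.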

Team Exasol

Another tool to keep in mind is DbSchema. They have some tools you may be interested in. For example, their Schema Synchronization and Automated Schema Deployment & Tasks.

They support Exasol. If you send them an email, I'm quite sure they would be more than happy to discuss your use case with you and give you a tour of their solutions.


When I was at Zalando, we implemented CI/CD in a very simple way:

- developers had write access only in DEV (specific schemas in Prod were dedicated to the analysts who needed to add new data, e.g. copying data from an Excel spreadsheet or importing some CSV files)
- all the DDLs were stored in GitHub
- an internal CI/CD pipeline was triggered by every new PR (running checks in an ephemeral Exasol in a Docker container, and deploying into Test) and by every merge (deploying into Prod)
- each deployment consisted of a deployment file containing:

  1. SQL files to be executed (full DDLs for new tables, views, or scripts, or files with ALTER statements or deployment-specific actions like DROP statements)
  2. A list of the objects required for the deployment (e.g. the table needed to deploy a view). The plan was to have a pre-built Docker container with all the objects as per GitHub.
  3. Test queries to be executed against the Docker Exasol to ensure everything works as expected
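To make the three parts concrete, a deployment file along those lines could be as simple as the following. This is a hypothetical YAML layout I am sketching for illustration, not the original format, and all file and object names are invented:

```yaml
# Hypothetical deployment file: one per release, checked into the repo
deployment:
  sql:                                     # 1. SQL files to execute, in order
    - ddl/reporting/orders_table.sql       #    full DDL for a new table
    - migrations/add_order_status.sql      #    ALTER / DROP statements
  requires:                                # 2. objects the deployment depends on
    - REPORTING.ORDERS
  tests:                                   # 3. queries run against the Docker Exasol
    - tests/check_orders_view.sql
```

The pipeline then only has to read this file: run the `sql` entries, check the `requires` objects exist, and execute the `tests` queries before promoting the release.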

This was quite raw, but it was rolled out to tens of BI developers and worked pretty well.

When I left I had plans to improve and automate this even further, but you know how life is 🙂