We use dbup[0]. Its philosophy[1] is that you should treat your DB changes like code changes in source control, and only perform up migrations. I agree with this. We previously spent a lot of time and effort writing down-migrations that were never used. If you need a down migration, take a backup before running your up migrations.
We're a C# shop. Our DB migration scripts are simply named with a datestamp and a change description. The scripts are added to a console exe project as embedded resources, and everything is committed to source control. These exe projects are very thin: they contain a single line of C# code that passes the command line arguments directly into an in-house DB deployment library, which is installed via NuGet. This library handles command line parsing and executes dbup in various ways, using the scripts embedded in the calling assembly.
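A minimal sketch of that setup (the `DbDeployer` library name is an assumption; the fluent calls it would wrap are dbup's real API):

```csharp
using System.Reflection;
using DbUp;

// The entire console exe: hand the args to the shared deployment library.
// static int Main(string[] args) => DbDeployer.Run(args, Assembly.GetExecutingAssembly());

// Inside the library, the "upgrade" code path drives dbup roughly like this:
static int RunUpgrade(string connectionString, Assembly scriptsAssembly)
{
    var upgrader = DeployChanges.To
        .SqlDatabase(connectionString)
        // Picks up the datestamp-named .sql files embedded in the calling exe.
        // dbup runs them in name order, so the datestamp prefix fixes the sequence.
        .WithScriptsEmbeddedInAssembly(scriptsAssembly)
        .LogToConsole()
        .Build();

    var result = upgrader.PerformUpgrade();
    return result.Successful ? 0 : 1;
}
```

dbup records each executed script in a journal table, so re-running the exe is a no-op for scripts that have already been applied.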
The result is a simple executable file that, depending on its command line, can upgrade the DB directly, preview which scripts would run against a particular DB instance, test the DB deployment against a clean SQL Server LocalDB instance, or generate SQL scripts for DBAs.
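The preview mode can be built on dbup's own engine; `GetScriptsToExecute()` returns the pending scripts without running them. A sketch, assuming the same embedded-scripts setup:

```csharp
using System;
using System.Reflection;
using DbUp;

// Preview mode: list the scripts dbup would run against this DB instance,
// without executing any of them.
static void PreviewPendingScripts(string connectionString, Assembly scriptsAssembly)
{
    var upgrader = DeployChanges.To
        .SqlDatabase(connectionString)
        .WithScriptsEmbeddedInAssembly(scriptsAssembly)
        .Build();

    foreach (var script in upgrader.GetScriptsToExecute())
        Console.WriteLine(script.Name);
}
```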
One nice feature is that the exe can also return the connection string of the SQL Server LocalDB test instance to external code as a test fixture. We can use this to directly unit test our repository types against a real, clean database. When the test fixture is disposed by XUnit, the temporary DB is cleaned up and removed.
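Wired into xUnit, such a fixture might look like this (the `DbDeployer` helpers and the repository type are illustrative; `IClassFixture<T>` and disposal at end-of-class are real xUnit behavior):

```csharp
using System;
using Xunit;

// Hypothetical fixture: creates a clean LocalDB database, runs every embedded
// migration script against it, and exposes the connection string to the tests.
public sealed class MigratedDbFixture : IDisposable
{
    public string ConnectionString { get; }

    public MigratedDbFixture()
    {
        // Assumed helper on the in-house deployment library.
        ConnectionString = DbDeployer.CreateTestDatabase();
    }

    // xUnit disposes the fixture when the test class finishes;
    // drop the temporary database here.
    public void Dispose() => DbDeployer.DropTestDatabase(ConnectionString);
}

public class CustomerRepositoryTests : IClassFixture<MigratedDbFixture>
{
    private readonly MigratedDbFixture _db;

    public CustomerRepositoryTests(MigratedDbFixture db) => _db = db;

    [Fact]
    public void InsertedCustomerCanBeReadBack()
    {
        var repo = new CustomerRepository(_db.ConnectionString);
        // ...exercise the repository against the real, freshly migrated schema...
    }
}
```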
The console projects are built and tested as part of our CI builds, then pushed to Octopus Deploy[2] as packages. The Octopus deployment process simply runs the executable and passes it the connection string of the DB to update.
> If you need a down migration, take a backup before running your up migrations.
Aren't you missing some steps? Or does your DB back up schema structure and data separately?
1. Make sure nobody can insert/update/delete anything
2. Take a backup
3. Run your up migration
4. Thoroughly test everything
5. Allow insert/update/delete again
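Steps 1 and 2 can be sketched with plain T-SQL from C# (database and backup path names are made up; a full backup captures schema and data together, so a restore rolls both back at once):

```csharp
using Microsoft.Data.SqlClient;

static void Exec(SqlConnection conn, string sql)
{
    using var cmd = new SqlCommand(sql, conn);
    cmd.ExecuteNonQuery();
}

// Illustrative pre-migration freeze-and-backup.
static void FreezeAndBackup(string connectionString)
{
    using var conn = new SqlConnection(connectionString);
    conn.Open();

    // 1. Kick out other sessions so nobody can insert/update/delete.
    Exec(conn, "ALTER DATABASE MyAppDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;");

    // 2. Full backup to a single .bak file.
    Exec(conn, @"BACKUP DATABASE MyAppDb TO DISK = 'C:\Backups\MyAppDb_premigration.bak';");

    // 3-5: run the up migration, test, then
    // ALTER DATABASE MyAppDb SET MULTI_USER; to reopen the database.
}
```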
I generally like the up/down approach better, and both the up and the down are usually trivial to write. The downs are useful during development, when you might change your mind about the DB structure. I've never used a down in production.
We used to do this but found it error-prone: the down migrations would have bugs and wouldn't always catch everything. We have dedicated SQL devs who write the migrations by hand, so moving to generated code is a slow and painful process. We now encourage developers working on the DB to thrash out their design in a local SQL Server Express instance before baking their changes into a migration script that will be run automatically against our dev environment by Octopus. If the scripts don't run, or the tests against the temporary DB fail during build, they never reach our shared dev DB.
We never have problems with it. It's all local; we don't have a shared dev DB. I think that's asking for trouble.
We have test stacks and a production stack. Those DBs are terabytes in size. Locally we have a conditioned DB that's much smaller. The rule is that locally, you only change your DB via migrations.
[0] https://dbup.github.io/
[1] https://dbup.readthedocs.io/en/latest/philosophy-behind-dbup...
[2] https://octopus.com/