Database Migrations in v3

rdhatch
User
Posts: 198
Joined: 03-Nov-2007
# Posted on: 23-Jun-2009 17:40:11   

Hi Frans -

Since v3 will support Model->DB, it already naturally abstracts the DDL specifics for creating tables, columns, data types, etc.

Have you thought about including Database Migrations in v3? This would GREATLY help many people, including myself. Tracking Database changes is a significant challenge.

If we had a simple LLBLGen API for DDL, such as Tables.Add(...), it would be extremely helpful. Then we could simply create POCO classes with .Up() and .Down() methods.

Your expert reply is greatly appreciated. Thank you -

Ryan

PS. I spent $600+ on Red Gate SQL Compare, only to find out that stupid simple stuff doesn't work: it doesn't even support column renaming!! It will drop the column (and all your data) and create a new column. They claim that because their project files are "stateless", this is intended behavior. Infuriating.

I pointed them to LLBLGen's Designer, since it already perfectly detects column renames, and everything else.

Otis
LLBLGen Pro Team
Posts: 39749
Joined: 17-Aug-2003
# Posted on: 23-Jun-2009 17:57:59   

The designer tracks changes to the meta-data, whether made by the user or by mapping code, and will emit those changes as DDL SQL to modify an existing schema.

So if you rename a field in a table, it will create an ALTER TABLE statement that drops the column and re-adds it, if there's no proper way to do a rename in the SQL of the target database. SQL Server has sp_rename, so there a rename will use that stored proc instead. :)
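
For illustration (hypothetical table and column names, SQL Server syntax), the difference looks roughly like this:

-- SQL Server: sp_rename keeps the column's data
EXEC sp_rename 'dbo.Customer.CompanyName', 'OrgName', 'COLUMN';

-- fallback for a database without a rename facility: drop/re-add (the column's data is lost)
ALTER TABLE Customer DROP COLUMN CompanyName;
ALTER TABLE Customer ADD OrgName NVARCHAR(100) NULL;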

For types etc., as long as there's an ALTER statement, a DROP/ADD or a stored proc, we'll use it, so changing a column's type uses an ALTER COLUMN statement. However, this can still require you to step in and add migration code to convert the data first.

The changes are emitted into change scripts, and as only the changes are emitted (or you can opt for a complete new schema), it's not hard to rework them into a change script which migrates your schema WITH data.

We won't emit data migration code, as that's impossible in a lot of cases: migrating data often requires special code written by the DBA, and migrating databases with data requires tests on copies of the live data before anything is run on the real database. For example, data which requires a function call to be migrated properly due to a type change on a column needs custom code we can't provide for you. So in that light it's not that odd that Red Gate says it's not really possible.
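
A hypothetical example of the kind of hand-written step we can't generate: a varchar column holding dates in 'dd-mm-yyyy' text form that has to become a real datetime column.

-- custom data migration, written and tested by the DBA (hypothetical column):
-- normalize the text dates to an unambiguous yyyymmdd form first
UPDATE Orders SET OrderDate = CONVERT(varchar(8), CONVERT(datetime, OrderDate, 105), 112);
-- only after the data has been converted can the generated type change be applied:
ALTER TABLE Orders ALTER COLUMN OrderDate DATETIME NULL;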

So in many cases the changes to the meta-data are performed through ALTER statements, DROP/ADD statements or proc calls, and the data is migrated by the RDBMS. In other cases you need to write some extra SQL; however, that's not hard to do, as only the changes are emitted to the change script, so in general it's not that much SQL. For example, if you rename a field, add a field and change a field's type, you get three ALTER COLUMN / ALTER TABLE statements, roughly as sketched below.
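
To make that concrete (again with hypothetical names, SQL Server syntax), such a change script would contain little more than:

-- rename a field: EXEC sp_rename, as shown above
-- add a field:
ALTER TABLE Customer ADD Fax NVARCHAR(30) NULL;
-- change a field's type:
ALTER TABLE Customer ALTER COLUMN PostalCode NVARCHAR(20) NULL;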

Of course, the DDL SQL is generated using the code generation pipeline with its own preset and templates, run on the project model, so you can alter the templates, add your own code if you want to, or generate something other than DDL SQL.

The goal is that a developer can manage the database schema from the designer, by modifying the entity model.

Frans Bouma | Lead developer LLBLGen Pro
rdhatch
User
Posts: 198
Joined: 03-Nov-2007
# Posted on: 23-Jun-2009 18:13:38   

Hi Frans,

Thank you for your reply.

In v3, how would one upgrade a TestDB/ProductionDB from a previous version of the model to the new version of the model being used in the DevelopmentDB?

To know which SQL statements to apply, we would need to know which version of the model the TestDB/ProductionDB is currently on.

Does v3 support any kind of model versioning? Is every model change a new version? Or do you group multiple changes into a model version number?

What's your recommended way to do this?

Ryan

Otis
LLBLGen Pro Team
Posts: 39749
Joined: 17-Aug-2003
# Posted on: 23-Jun-2009 19:09:40   

Versioning is indeed a problem, and there's no real solution for that other than planning things properly, as with every versioning system.

It's not really a problem unless multiple developers modify the same schema. The designer exports DDL SQL when you validate and update the meta-data (one action), or when you decide to export DDL SQL yourself. This also happens when you want to generate code.

So you're in charge of when DDL SQL is exported and, after that, of what you do with it. We generate header info (when the DDL SQL was generated, etc.) so you have a way to match things up, and this info is also available in the project. Not every change is a new version; the project itself doesn't work that way. You have the active model, and per element it contains additional info about changes (e.g. if a field's type has changed, the field gets a flag set that its type has been changed; if you change it again, it won't matter, no 'new' version is created). Changing an element multiple times in between DDL SQL exports doesn't create new versions either; only when you export DDL SQL are the changes seen as permanent.

However, if multiple developers export DDL SQL scripts which have to be applied to the same schema, the DBA has to merge them together, as every exported script is for example 'start version +1', since they were exported in parallel. :)

We could add a trick, but it's intrusive to the DB schema, so I don't think we'll add it by default: a separate 'version' table is added with one column whose name is the version of the schema and another column whose name is the date/time the schema was created/updated.

A new DDL SQL script will then alter that table and modify the column names (there's no data in the table anyway). This still doesn't solve the problem of multiple changes made in parallel on the same schema, though. A rough sketch of the idea is below.
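
A minimal sketch of that trick, with hypothetical names:

-- initial export: the column names encode the schema version and the date it was created
CREATE TABLE SchemaVersionInfo (v1_0 INT, updated_2009_06_23 INT);

-- a later export bumps the version by renaming the columns (the table never holds rows)
EXEC sp_rename 'dbo.SchemaVersionInfo.v1_0', 'v1_1', 'COLUMN';
EXEC sp_rename 'dbo.SchemaVersionInfo.updated_2009_06_23', 'updated_2009_07_15', 'COLUMN';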

Frans Bouma | Lead developer LLBLGen Pro