In-memory database for testing

Erichero
User
Posts: 6
Joined: 26-May-2011
# Posted on: 26-May-2011 14:40:12   

With the Habanero ORM framework, there was a feature where you could persist your entities to an in-memory database, which acted as a full persistence mechanism but obviously lost its contents when the app finished.

This is a terrific feature for unit testing, and I'm wondering how I can accomplish the same thing with LLBLGen, now that I've moved to a company that uses the framework.

Thanks for your help.

MTrinder
User
Posts: 1461
Joined: 08-Oct-2008
# Posted on: 26-May-2011 19:19:24   

Not easily, I'm afraid. All of the databases that LLBLGen works with are essentially disk based. Apart possibly from speed, is there any reason why you can't just run a "DELETE FROM" on each table at the start and end of the test run?
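
Something along these lines, for example - the connection string and table names below are just placeholders, and the delete order has to respect your foreign keys (children before parents):

    using System.Data.SqlClient;
    using NUnit.Framework;

    [TestFixture]
    public class DatabaseTests
    {
        // Placeholder connection string and table names; children are deleted before
        // parents so foreign key constraints are not violated.
        private const string ConnectionString =
            "Data Source=.;Initial Catalog=UnitTests;Integrated Security=SSPI;";
        private static readonly string[] TablesInDeleteOrder = { "OrderLine", "Order", "Customer" };

        [TearDown]
        public void ClearTables()
        {
            using (var connection = new SqlConnection(ConnectionString))
            {
                connection.Open();
                foreach (var table in TablesInDeleteOrder)
                {
                    using (var command = new SqlCommand("DELETE FROM [" + table + "]", connection))
                    {
                        command.ExecuteNonQuery();
                    }
                }
            }
        }
    }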

Thanks

Matt

Erichero
User
Posts: 6
Joined: 26-May-2011
# Posted on: 27-May-2011 07:55:40   

Thanks for the feedback.

The motivations are:

- Speed is a massive issue when you're doing agile or TDD development with thousands of tests
- Foreign key constraints are a continuous problem for setup and cleanup
- If you have one memory store, you just dump it and start again, simplifying cleanup
- In-memory relies on your data model being correct rather than your db model, so you don't have to synchronise database structures (i.e. test vs dev DBs)

If you say not easily, how would you see it being done?

Erichero
User
Posts: 6
Joined: 26-May-2011
# Posted on: 01-Jun-2011 12:33:37   

Just bumping this thread because I think it can be a big topic for LLBLGen.

How could an in-memory database be implemented with LLBLGen? Even if we did it ourselves?

Erichero
User
Posts: 6
Joined: 26-May-2011
# Posted on: 01-Jun-2011 15:39:19   

Perhaps, as an alternative, there is support for a DB vendor that provides an in-memory option? I'm trying to read through the options on Wikipedia, but maybe someone knows more about this?

Otis
LLBLGen Pro Team
Posts: 39749
Joined: 17-Aug-2003
# Posted on: 01-Jun-2011 16:44:09   

IMHO, testing against databases should always be done on real databases. The main point is that testing of code should take into account the full environment in which the code operates. This avoids running into issues where the code is run in the actual environment and suddenly fails.

So I always suggest using a database on the same RDBMS that is used in production: use a script to set up the catalog if you want to, run the tests, then drop the catalog.

We ourselves use different databases for different parts of the system: one for reads, one for writes and one for inheritance scenarios (read and write). The one for writes (which reads back the data written) clears the data with a truncating stored proc called after the tests are done (in the teardown). This requires little setup, and you can run the tests every time, knowing they actually hit the database and do the full roundtrip, on the same database they have to work with in production.
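
To sketch that teardown pattern (just an illustration, not our actual code - the connection string and proc name are placeholders, and the proc is assumed to clear the test tables in an FK-safe order):

    using System.Data;
    using System.Data.SqlClient;
    using NUnit.Framework;

    [TestFixture]
    public class WriteTests
    {
        // Placeholder connection string; the write-test catalog.
        private const string ConnectionString =
            "Data Source=.;Initial Catalog=UnitTestsWrites;Integrated Security=SSPI;";

        [TearDown]
        public void CleanUpWrittenData()
        {
            // Placeholder proc name: a stored proc that truncates/deletes the test tables
            // in the right order after each test.
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand("pr_CleanupTestTables", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }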

IMHO it's really a small price to pay for great benefits: you know things ran OK in a real setup as well, so you won't run into surprises when you move your code to production.

If you really, really want a different setup, you can use Firebird embedded: create a mapping on Firebird as well, create a Firebird database from that (it's a single file), then use the Firebird embedded DLL (with a small adjustment to the connection string). Before testing, simply copy the (empty) database file to the test folder, run the tests, and delete the file afterwards. But IMHO a proc which truncates the tables in the right order is just as easy.
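
A rough sketch of that file-copy approach (the paths are placeholders, and the exact connection string parameters depend on your Firebird ADO.NET provider version):

    using System.IO;
    using NUnit.Framework;

    [TestFixture]
    public class FirebirdEmbeddedTests
    {
        // Placeholder paths; the template file is the empty Firebird database created
        // from the Firebird mapping.
        private const string TemplateDb = @"C:\Tests\Template\TESTS.FDB";
        private const string WorkingDb = @"C:\Tests\Run\TESTS.FDB";

        [TestFixtureSetUp]
        public void CopyEmptyDatabase()
        {
            File.Copy(TemplateDb, WorkingDb, true);
            // The tests then use a connection string pointing the embedded engine at the copy,
            // something like: "ServerType=1;User=SYSDBA;Password=masterkey;Database=C:\Tests\Run\TESTS.FDB"
        }

        [TestFixtureTearDown]
        public void DeleteDatabaseFile()
        {
            File.Delete(WorkingDb);
        }
    }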

Frans Bouma | Lead developer LLBLGen Pro
Erichero
User
Posts: 6
Joined: 26-May-2011
# Posted on: 01-Jun-2011 17:03:55   

Thanks for the feedback Frans.

I understand what you're saying, but if you're concerned about variations in the data coming back from the database, then perhaps you should have a section of your test code dedicated to ensuring that your database behaves as expected.

Perhaps the real issue here is how you feel about writing mocks, and personally my experience hasn't been fantastic - I prefer to work with a persistence layer that behaves like it does in production code.

If you're then writing thousands of tests that use what comes out of your database, the speed of your test execution and the cost of maintaining the integrity of your test data become heavy, and it gets worse if you're sharing the same test database with several developers. On top of that, TDD and highly agile environments require a lot of test runs.

My experience with in-memory databases has been exceptionally good - you dump your entire database per test or test fixture, create your test pack, and get a guarantee of speed and data integrity, without having to re-align your schema every time there's been a change.

Firebird might be an option. On further research I see SQLite also supports an in-memory mode. I'm keen to look into it.
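
For example, with the System.Data.SQLite provider (plain ADO.NET here, independent of whether LLBLGen can actually target SQLite), an in-memory database is just a connection string:

    using System.Data.SQLite;

    class SQLiteInMemoryDemo
    {
        static void Main()
        {
            // ":memory:" gives a database that exists only while this connection stays open.
            using (var connection = new SQLiteConnection("Data Source=:memory:"))
            {
                connection.Open();
                using (var command = connection.CreateCommand())
                {
                    command.CommandText = "CREATE TABLE Customer (Id INTEGER PRIMARY KEY, Name TEXT)";
                    command.ExecuteNonQuery();

                    command.CommandText = "INSERT INTO Customer (Name) VALUES ('test')";
                    command.ExecuteNonQuery();
                }
            } // connection closed: the in-memory database is gone.
        }
    }

The catch is that the database only exists for the lifetime of that one connection, so it would have to be kept open and shared across the whole test fixture.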