Bulk insert takes too much time

Posts   
 
    
KsV
User
Posts: 5
Joined: 20-Sep-2018
# Posted on: 20-Sep-2018 13:46:07   

Hi,

I have a functionality where I have to save a large number of records, around 800. We observed that LLBLGen Pro takes a lot of time to complete it.

I am using UnitOfWork.AddCollectionForSave, then UnitOfWork.Commit.

When we do the same insert using plain SQL it is done in less than 10 seconds, but LLBLGen takes around 3-4 minutes. Is there any way to make the insert faster?

Also, what is the use of CommitAsync?

Any help as early as possible will be appreciated.

Thanks

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39613
Joined: 17-Aug-2003
# Posted on: 20-Sep-2018 16:07:24   

Please use a profiler (a .NET profiler) to see where things are slow. Inserting 800 rows takes very little time, even with the non-batched inserts the current builds use (inserting 1000 rows with individual queries takes ~500ms here).

It's likely not the DB; perhaps other things are slow, e.g. events are triggered and the UI is updated after every insert. A profiler can give insight into that. Visual Studio comes with a profiler if you don't have dotTrace or ANTS, though it's more arcane than e.g. dotTrace, which can easily show you where the most time is lost.

Without that I don't know what takes a lot of time in your situation. 3-4 minutes is ridiculously slow, so something else is not right. Be aware that saving 800 entities which have references to many other entities will result in those other entities also being saved, if they're changed.

You don't give any more specifics (e.g. LLBLGen Pro version, adapter or selfservicing, database type, etc.), so I'm guessing here.
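A quick way to narrow it down without a full profiler is to time only the persistence step with Stopwatch. This is just a sketch: `uow` and the generated `DataAccessAdapter` class, and the `Commit(adapter, true)` overload, are assumptions and may differ from your generated code:

```csharp
using System;
using System.Diagnostics;

// Sketch: time only the commit, to separate it from entity
// construction and web service overhead.
var sw = Stopwatch.StartNew();
using(var adapter = new DataAccessAdapter())
{
    uow.Commit(adapter, true);   // true: auto-commit the transaction
}
sw.Stop();
Console.WriteLine("Commit took {0}ms", sw.ElapsedMilliseconds);
```

If this number is small while the overall request is slow, the time is lost outside the save itself.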

Frans Bouma | Lead developer LLBLGen Pro
KsV
User
Posts: 5
Joined: 20-Sep-2018
# Posted on: 20-Sep-2018 16:25:55   

Hello,

Thank you for replying.

We have checked this in SQL Profiler; what we observed is that it generates separate INSERT statements for all the records and saves them one by one (correct me if I am wrong here), which might be the reason for the slowness.

We do not have any additional triggers on our DB tables, nor are there any other UI events or refreshes.

We have a simple web service whose only job is to save these records. It simply has two **UnitOfWork.AddCollectionForSave** calls (around 300-400 entities each) and a few other entities added via UnitOfWork.AddForSave. The LLBLGen Pro version is 5.1 and the DB used is SQL Server 2014.

Following is the code that we have used:

```csharp
UnitOfWork.AddForSave(costDiffOtheCostX);            // single entity
UnitOfWork.AddForSave(costDiffOtherCostY);           // single entity
UnitOfWork.AddCollectionForSave(lst1EntitiesToSave); // entity collection, around 300
UnitOfWork.AddCollectionForSave(Lst2EntitiesToSave); // entity collection, around 400
using(var adapter = ServiceHelper.ServiceAdapter)
{
    UnitOfWork.Commit(adapter); // the final commit
}
```

Note: ServiceAdapter is derived from DataAccessAdapterBase

It would be helpful if you could provide any insight into how such an insert should be handled, in case there is another way of saving these.

Any reply will be helpful.

Thanks

Walaa avatar
Walaa
Support Team
Posts: 14950
Joined: 21-Aug-2005
# Posted on: 20-Sep-2018 18:11:33   

KsV wrote:

We have checked this in SQL Profiler; what we observed is that it generates separate INSERT statements for all the records and saves them one by one (correct me if I am wrong here), which might be the reason for the slowness.

I'll repeat Otis: you need to profile your code. A few hundred inserts should not take that much time.

Are you sure the single entities and collections you are saving are not referencing other entities?

KsV
User
Posts: 5
Joined: 20-Sep-2018
# Posted on: 20-Sep-2018 18:21:58   

Yes, I am quite sure. All I have done is create new entities based on the list of input I have, initialise the two entity collections I mentioned, add entities to those collections, add them to the unit of work, and then commit.

Is there anything else that should have been done?

Really appreciate any help. Thanks again.

Walaa avatar
Walaa
Support Team
Posts: 14950
Joined: 21-Aug-2005
# Posted on: 20-Sep-2018 20:38:10   

Just profile your code.

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39613
Joined: 17-Aug-2003
# Posted on: 20-Sep-2018 21:01:20   

Also, the DB is a normal SQL Server database, not some cloud database miles away, accessible over a slow network line?

But indeed, please profile the code. The queries themselves aren't the problem, it's the rest, so with a profiler you can get insight into where things are terribly slow.

Frans Bouma | Lead developer LLBLGen Pro
KsV
User
Posts: 5
Joined: 20-Sep-2018
# Posted on: 21-Sep-2018 07:56:18   

Hi,

I really appreciate your time and feedback. After your reply I checked a few things. There is something that I wanted to verify: after I have sent the collection for save, i.e. have hit UnitOfWork.Commit,

it performs its internal actions like OnBeforeEntitySave for the first record, saves it in the DB, then goes on to the second one and so on. So this will happen 800 times, as if saving in a foreach loop.

Is there any alternative to this? Can all 800 inserts happen at once in a single go?

Appreciate your time and help.

Thanks

Otis avatar
Otis
LLBLGen Pro Team
Posts: 39613
Joined: 17-Aug-2003
# Posted on: 21-Sep-2018 09:21:06   

KsV wrote:

Hi,

I really appreciate your time and feedback. After your reply I checked a few things. There is something that I wanted to verify: after I have sent the collection for save, i.e. have hit UnitOfWork.Commit,

it performs its internal actions like OnBeforeEntitySave for the first record, saves it in the DB, then goes on to the second one and so on. So this will happen 800 times, as if saving in a foreach loop.

Is there any alternative to this? Can all 800 inserts happen at once in a single go?

Appreciate your time and help.

Thanks

The OnBeforeEntitySave method is an empty method unless you have overridden it to add logic. The collection is first processed by a graph visitor to see which entities are reachable and changed; these entities are then sorted into the right order (done at the same time using a topological sort, which is very fast), and then each entity is converted into a query and executed within the same transaction over the same connection.
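For illustration only (this is not LLBLGen's internal code), the dependency-ordering step can be sketched as a plain depth-first topological sort over entity references, where an entity is emitted only after everything it references:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: compute a save order where an entity is
// persisted after the entities it references (FK targets first).
class TopoSortSketch
{
    static void Visit(string node, Dictionary<string, List<string>> refs,
                      HashSet<string> seen, List<string> order)
    {
        if(!seen.Add(node)) return;          // already handled
        if(refs.TryGetValue(node, out var targets))
        {
            foreach(var t in targets) Visit(t, refs, seen, order);
        }
        order.Add(node);                     // referenced entities are already in the list
    }

    static void Main()
    {
        // "OrderLine" references "Order", which references "Customer".
        var refs = new Dictionary<string, List<string>>
        {
            ["OrderLine"] = new List<string> { "Order" },
            ["Order"] = new List<string> { "Customer" }
        };
        var order = new List<string>();
        Visit("OrderLine", refs, new HashSet<string>(), order);
        Console.WriteLine(string.Join(" -> ", order)); // Customer -> Order -> OrderLine
    }
}
```

The real framework does this over the entity graph itself, combined with the reachability visit, but the ordering idea is the same.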

You're using 5.1, which is relatively fast; 5.4 is faster, but I wouldn't say it is 1000% faster in this regard. In v5.5, which will reach beta in the next month or so (maybe a little later), we've implemented query batching to batch inserts/updates into one query. While testing this, we first inserted 1000 rows separately and then as a batch. 1000 rows separately is what you see today: every query is executed separately, which takes about 550ms over the network. Batching takes about 90ms. So batching helps.
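Once v5.5 is out, enabling that batching is expected to look something like the following. This is a sketch based on the preview material; the `BatchSize` property name and the `Commit(adapter, true)` overload are assumptions and may change before release:

```csharp
// Sketch, v5.5 preview: enable insert/update batching on the
// adapter before committing the unit of work.
using(var adapter = new DataAccessAdapter())
{
    adapter.BatchSize = 100;   // group up to 100 inserts per round trip
    uow.Commit(adapter, true); // true: auto-commit the transaction
}
```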

But, and that's my point ;), persisting 800 entities like in your case doesn't take 4 minutes. It shouldn't take more than a few seconds, unless your entities contain massive amounts of data (large objects, documents, pictures, etc.) and the connection to the database is really slow (network).

You mentioned that the 800 inserts in SQL take just a few seconds. How did you measure that? With a SQL dump?

Regarding your question about OnBeforeEntitySave: did you profile the code, and did it spend a lot of time in that method?

Frans Bouma | Lead developer LLBLGen Pro