Entity reuse and memory issues

kiran
Posts: 2
Joined: 04-May-2007
# Posted on: 04-May-2007 04:40:03   

Hello, I am using the latest runtime with self-servicing to process some large EDI files, and I am running into memory problems with entities while doing bulk inserts (5000+). I have several related entities with identity keys, so I have to save them in order:

if (entity3.IsDirty)
{
    entity1.Save(false);
    entity2.fk = entity1.pk;
    entity2.Save(false);
    entity3.fk = entity2.pk;
    entity3.Save();
}

...and so on, once per record.

After reading some forum threads, I decided to create only one instance of each entity and reuse it (earlier I was creating a new entity instance for every insert):

ClearFields(entity1);
entity1.IsNew = true;

ClearFields(entity2);
entity2.IsNew = true;

But even then, my memory profile has shown no improvement. With 5000 inserts, the RAM usage climbs to about 1 GB and Windows starts paging like crazy.

What is the right way to reuse entities and reduce my memory footprint? I know bulk inserts / direct SQL would be a better option for me, but because of the amount of logic involved in parsing the data, I have no choice but to do it through code.
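For illustration, here is the shape of the loop without entity reuse. OrderEntity, DetailEntity, and ParseRecord are hypothetical stand-ins for the real (NDA'd) code; the point of the sketch is that creating the entities inside the loop body makes each iteration's instances unreachable, and therefore collectible, once they are saved:

```csharp
// Sketch only: OrderEntity, DetailEntity and ParseRecord are hypothetical
// placeholders for the real entity types and parsing logic.
foreach (string record in records)
{
    // New instances per iteration: once this iteration ends, nothing
    // references them and the GC can reclaim them.
    var entity1 = new OrderEntity();
    var entity2 = new DetailEntity();
    ParseRecord(record, entity1, entity2);

    entity1.Save(false);                 // false: no recursive save
    entity2.OrderId = entity1.OrderId;   // propagate the identity PK to the FK
    entity2.Save();
}
```

One thing worth checking in either variant: if the saves participate in a Transaction object, it may hold a reference to every entity added to it until the transaction is committed or disposed, so committing in batches rather than holding all 5000+ entities in one transaction keeps the reachable set small.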

Thanks, kiran

Walaa
Support Team
Posts: 14995
Joined: 21-Aug-2005
# Posted on: 04-May-2007 08:51:18   
if (entity3.IsDirty)
{
    entity1.Save(false);
    entity2.fk = entity1.pk;
    entity2.Save(false);
    entity3.fk = entity2.pk;
    entity3.Save();
}

Do you mean you repeat the above code 5000 times?

Please provide more information: what is the target database and its version, and which LLBLGen Pro runtime library build are you using? Please check the following thread for guidelines: http://www.llblgen.com/TinyForum/Messages.aspx?ThreadID=7722

Anyway, here is a related discussion: http://www.llblgen.com/TinyForum/Messages.aspx?ThreadID=3415

kiran
Posts: 2
Joined: 04-May-2007
# Posted on: 04-May-2007 19:38:21   

Sorry about that. I am using SQL Server 2005, and the runtime version is v2.0.50727.

Yes, the same code is repeated 5000 times. I figured I can't use a unit of work because of the identity keys for each table. On the flip side, I didn't use related collections, as they would save recursively.

What I am trying to figure out is why the memory usage is so high. I can't post the code because of our NDA. I tried commenting out all the entity saves to check whether some other code was the problem; with the saves disabled, the memory usage remained constant.

Any help would be greatly appreciated.

Thanks

psandler
User
Posts: 540
Joined: 22-Feb-2005
# Posted on: 04-May-2007 22:04:55   

kiran wrote:

Sorry about that. I am using SQL Server 2005, and the runtime version is v2.0.50727.

Yes, the same code is repeated 5000 times. I figured I can't use a unit of work because of the identity keys for each table. On the flip side, I didn't use related collections, as they would save recursively.

What I am trying to figure out is why the memory usage is so high. I can't post the code because of our NDA. I tried commenting out all the entity saves to check whether some other code was the problem; with the saves disabled, the memory usage remained constant.

Any help would be greatly appreciated.

Thanks

Is it possible that the code sample you posted has nothing to do with the memory problem? Are you, for example, fetching those thousands of records into memory (as a dataset or entity collection or whatever) before you do all that processing?
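If the input is being loaded up front, a streaming read keeps the working set flat regardless of file size. A minimal sketch (the file path and the ProcessRecord handler are placeholders):

```csharp
using System.IO;

// Reads the EDI file one line at a time instead of loading the whole
// file (or a full result set) into memory before processing starts.
using (var reader = new StreamReader(@"C:\data\input.edi"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        ProcessRecord(line);   // hypothetical per-record handler
    }
}
```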

Phil

Posts: 254
Joined: 16-Nov-2006
# Posted on: 05-May-2007 21:45:40   

I would recommend you review this article:

http://msdn.microsoft.com/msdnmag/issues/06/11/CLRInsideOut/default.aspx

I'd be very interested to know a few things:

1) The sizes of the GC generation 0, 1, and 2 heaps
2) How fragmented the managed heap is
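For point 1, the generation sizes can also be sampled at runtime from the ".NET CLR Memory" performance counters (Windows/.NET Framework). A sketch, where "MyEdiImporter" is a placeholder for the actual process instance name:

```csharp
using System;
using System.Diagnostics;

// Reads the GC generation sizes for a running process from the
// ".NET CLR Memory" performance counter category.
string instance = "MyEdiImporter";   // placeholder process name
foreach (var name in new[] { "Gen 0 heap size", "Gen 1 heap size",
                             "Gen 2 heap size", "Large Object Heap size" })
{
    using (var pc = new PerformanceCounter(".NET CLR Memory", name, instance, true))
    {
        Console.WriteLine("{0}: {1:N0} bytes", name, pc.NextValue());
    }
}
```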

In relation to (2):

The !eeheap -gc SOS command will show you where each garbage collection segment starts. You can correlate this with the output of !address to determine whether the virtual memory is fragmented by the managed heap.

and use !dumpheap, i.e.:

The !dumpheap -stat command performs a complete dump of the objects on the managed heap. (Thus, when you have a big heap, !dumpheap may take a while to finish.) The list that !dumpheap produces is sorted by the memory used by each type, so you can start analyzing from the last few lines, since those are the types that take up the most space.

In the example in Figure 4 of that article, strings take up most of the space. If strings are the issue, the problem is often easy to solve: the content of the strings may tell you where they come from.

If you're still struggling to find the cause of the memory issue, let us know and we can create debugger scripts to run all the required SOS commands and capture the output to a text file we can analyse.
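For reference, such a script might look like the following under WinDbg (the log path is a placeholder, and `.loadby sos mscorwks` assumes .NET Framework 2.0):

```
.logopen c:\temp\heap-analysis.log
.loadby sos mscorwks    $$ load the SOS extension for .NET 2.0
!eeheap -gc             $$ GC segment locations and generation sizes
!dumpheap -stat         $$ per-type object counts and total sizes
!address                $$ virtual memory layout, to spot fragmentation
.logclose
```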