Out of memory w/35,000 objects

tvoss
User
Posts: 192
Joined: 07-Dec-2003
# Posted on: 01-Jun-2008 04:36:34   

This is the first time I've run into this, and my search here couldn't find what I'm sure has come up before.

A program that works fine for smaller histories dies every time for one customer with 35,000 history records, where each record has about 20 related children. This means lazy loading will eventually pull in 20 times 35,000 records. The query itself works fine, but once we start using the records one at a time, free memory shrinks steadily as lazy loading actually brings the objects into memory (I'm assuming it's the lazy loading, since there's no other memory consumer; I'm just writing each record's data out to a CSV file on disk).

How does one generally handle this situation when it's a business requirement to export this data to a CSV file and then look at it in Excel? In Excel it turns into just 35,000 rows. Yes, I've asked the clients why they think they need to look at 35,000 of anything, but they somehow think they do.

Once the data is written to disk, I don't need that object in memory anymore, but since I'm looping through the collection, can I remove elements from the collection I'm iterating over?

Or can I loop through the IDs instead and remove entities from the collection as I go?

What is the usual solution for this?

TIA,

Terry Voss

Otis
LLBLGen Pro Team
Posts: 39872
Joined: 17-Aug-2003
# Posted on: 01-Jun-2008 11:14:08   

You're using v2.0/2.5?

These 35,000 entities shouldn't take up all the memory. However, if you keep them in memory inside a collection which is referenced elsewhere, and you keep on fetching entities inside that graph, the total number of entities in the graph will never decrease. Keep in mind that a relation works both ways: if you fetch a customer and its orders, and you reference one of these orders elsewhere, that order references the customer, and the customer still references the other orders :)
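
To make the "works both ways" point concrete, here's a tiny sketch (SelfServicing style; CustomerEntity / OrderEntity are just hypothetical generated names, your own entities will differ):

```csharp
// Hypothetical generated SelfServicing entities, used purely for illustration.
CustomerEntity customer = new CustomerEntity("CHOPS"); // fetches the customer by PK
var orders = customer.Orders;                          // lazy loads all of its orders into the graph

OrderEntity kept = orders[0];
// Even if 'customer' and 'orders' go out of scope, 'kept.Customer' still references the
// customer, and the customer's Orders collection still references every other order,
// so the whole graph stays reachable and can't be garbage collected.
```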

Looking at your requirement, you should make sure that you're fetching the 35,000 entities into a separate collection and clearing that collection afterwards. You shouldn't reference an entity in the graph from anywhere else, as that could keep the whole graph in memory.

As you're writing the data to a CSV file, you could page through the set, 100 entities at a time: fetch each page into the same collection and clear the collection at the end of each loop iteration.
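
Something along these lines (a rough sketch, SelfServicing style; HistoryCollection / HistoryEntity / HistoryFields and the WriteToCsv helper are placeholders for your own generated types and export routine, and you should check the exact GetMulti paging overload for your runtime version):

```csharp
int pageSize = 100;
int pageNumber = 1;
HistoryCollection histories = new HistoryCollection();
// Paging needs a deterministic order; sort on the PK (placeholder field name).
ISortExpression sorter = new SortExpression(HistoryFields.Id | SortOperator.Ascending);

while(true)
{
    histories.Clear();   // drop the previous page so it can be garbage collected
    histories.GetMulti(null, 0, sorter, null, pageNumber, pageSize);
    if(histories.Count == 0)
    {
        break;           // past the last page
    }

    foreach(HistoryEntity history in histories)
    {
        // Write the row (and its lazily loaded children) to the CSV, then forget it:
        // don't store 'history' anywhere that outlives this loop.
        WriteToCsv(history);
    }
    pageNumber++;
}
```

Because nothing outside the loop references the fetched entities, each cleared page (including the children lazy loaded while writing it) becomes eligible for garbage collection before the next page is fetched.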

Frans Bouma | Lead developer LLBLGen Pro
tvoss
User
Posts: 192
Joined: 07-Dec-2003
# Posted on: 02-Jun-2008 19:44:28   

Paging through 1,000 records at a time (25 children each) moves IIS memory usage from its normal 3 GB (of 4 GB total) up to about 3.6 GB, and it drops back to 3.0 GB after each collection.Clear().

I guess the customer actually requests that this spreadsheet be emailed or FTP'd once in a while, and some customers are this big.

Thanks,