This is the first time I've run into this, and my search here didn't turn up what I'm sure has come up before.
A program that works fine for smaller histories dies every time for one customer with 35,000 history records, each of which has about 20 related children, so lazy loading will eventually touch 20 times 35,000 records. The query itself works fine, but once we start using the records one at a time, free memory shrinks steadily as lazy loading actually brings the objects into memory. (I'm assuming it's the lazy loading, since there's no other memory consumer: I'm just writing each record out to a CSV file on disk as I go.)
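For reference, the export loop is essentially this shape. I've sketched it in Java/Hibernate style purely to illustrate the pattern (my actual stack may differ), and History, Child, and the query are placeholders for the real classes:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;
import org.hibernate.Session;

class HistoryExport {
    // History/Child stand in for the real entities; customer is whatever
    // the real query filters on.
    static void export(Session session, Object customer) throws IOException {
        List<History> histories = session
                .createQuery("from History h where h.customer = :c", History.class)
                .setParameter("c", customer)
                .list();                                // the 35,000 parents load fine

        try (PrintWriter csv = new PrintWriter(new FileWriter("export.csv"))) {
            for (History h : histories) {
                for (Child c : h.getChildren()) {       // lazy children materialize here
                    csv.println(h.getId() + "," + c.getValue());
                }
                // each History and its ~20 children now stay cached in the session,
                // so memory keeps climbing even though the CSV rows are already on disk
            }
        }
    }
}
```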
How does one generally handle this situation when it is a business requirement to export this data to a CSV file and then look at it in Excel? In Excel it comes out to just 35,000 rows. Yes, I've asked the clients why they think they need to look at 35,000 of anything, but they somehow think they do.
Once a record has been written to disk I don't need that object in memory any more, but I'm looping through the collection, so can I safely remove elements from the very collection I'm iterating over?
Or can I loop through the ids instead and drop objects from the collection as I go?
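Something like the sketch below is what I have in mind (again Hibernate-style calls and placeholder names purely for illustration; whether evicting/clearing like this is safe or sensible is exactly what I'm asking):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;
import org.hibernate.Session;

class HistoryExportById {
    static void export(Session session, Object customer) throws IOException {
        // Only the ids are held up front: 35,000 longs instead of 35,000 object graphs.
        List<Long> ids = session
                .createQuery("select h.id from History h where h.customer = :c", Long.class)
                .setParameter("c", customer)
                .list();

        try (PrintWriter csv = new PrintWriter(new FileWriter("export.csv"))) {
            int written = 0;
            for (Long id : ids) {
                History h = session.get(History.class, id);
                for (Child c : h.getChildren()) {       // ~20 children load lazily per parent
                    csv.println(h.getId() + "," + c.getValue());
                }
                session.evict(h);                       // drop this parent from the session
                                                        // (children too, if evict cascades)
                if (++written % 500 == 0) {
                    session.clear();                    // wipe the session cache in batches
                }
            }
        }
    }
}
```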
What is the usual solution for this?
TIA,
Terry Voss