In fact, maybe I should explain my problem instead; you might have a better idea than mine.
I have a process that can be run thousands of times (20,000 or more) with different inputs. Each run usually takes between 50 and 500 ms. While running, it fetches quite a lot of data, analyzes it, and saves the results.
To avoid fetching the same entities several times, I use an entity cache. The cache is not that big, maybe 80 or 100 entities.
Example:

// Load the main entity for this run
MyEntity myEntity = new MyEntity(3);
myAdapter.FetchEntity(myEntity);
// Reuse the related entity from the cache instead of fetching it again
myEntity.OtherEntity = myCache.GetOtherEntity(myEntity.otherEntityId);
The problem is that memory usage increases indefinitely. For example, after 4,000 runs I reach about 500 MB of RAM used by the thread. After 5,000 runs it becomes unstable and starts to take more time; sometimes a single run takes 45 seconds.
What I think is happening: OtherEntity now holds a reference back to myEntity. When the run finishes (and the loop moves on to the next one), the memory taken by myEntity is not released because there is still a reference to it from OtherEntity, which stays alive in the cache. Do you think that's right?
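To check whether that is really what happens, I suppose I could test it at the end of one run with a WeakReference, something like this (same objects as in the example above; the forced GC calls are only there for the test):

// Hypothetical check, reusing myEntity from the example above.
// If the cached OtherEntity really keeps a reference back to myEntity,
// the weak reference will still report it as alive after a full collection.
WeakReference tracker = new WeakReference(myEntity);
myEntity = null; // the run is over, drop my own reference

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

Console.WriteLine(tracker.IsAlive
    ? "still reachable - something (the cache?) holds a reference"
    : "collected - the cache is not the culprit");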
So I'll try to add a small routine that runs, say, every 500 processes, walks the cache, and breaks all the links to other entities so that those entities can be released. Your idea about injecting the FieldInfo into a new object seems fine.
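Here is roughly what I have in mind for that cleanup step. GetAllEntities and the back-reference field are only guesses, because I don't know yet exactly which fields hold the links back to the per-run entities:

// Rough sketch of the periodic cleanup. Every 500 runs, walk the cache and
// null out the references that point back to per-run objects so the GC can
// reclaim them. The accessor and field names below are placeholders.
private int runCounter = 0;

void AfterEachRun()
{
    runCounter++;
    if (runCounter % 500 != 0)
        return;

    foreach (OtherEntity cached in myCache.GetAllEntities()) // hypothetical accessor
    {
        cached.BackReference = null; // hypothetical field holding the link to a per-run entity
    }
}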
But I don't think the cache is the only memory leak...