Hi There,
I'm after some feedback on the best way to work with a large set of data that has been retrieved from the database.
I'm working on a 3D modelling system that has a Model table with model bounds etc. Each Model contains a number of Blocks, and each Block has x, y and z values for where it sits.
I am using the adapter model and have a prefetch path to load the model and its blocks for a given date into memory. As there are a large number of blocks to be queried, I thought maintaining an in-memory cache of these objects would be the fastest solution. If I want to find the block at a given x,y,z co-ordinate, should I be:
Looping through the block collection to see which block matches the criteria?
OR
Retrieving the data from the database, putting it into a DataTable, and then using a view on that table to see if any rows match the criteria?
OR
Creating and maintaining a Hashtable for each searchable item?
How are other people working with large datasets?
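For what it's worth, here is a minimal sketch of option 3: build a hash map keyed on the x,y,z co-ordinate once after the prefetch, so each lookup is an O(1) hash probe instead of an O(n) scan of the block collection. This is Java rather than .NET, and the `Coord`, `Block`, and `BlockIndex` names are hypothetical placeholders, not from any real schema — just an illustration of the idea.

```java
import java.util.HashMap;
import java.util.Map;

public class BlockIndex {
    // Hypothetical key type: a block's x,y,z position.
    // Records give value-based equals/hashCode for free.
    record Coord(int x, int y, int z) { }

    // Hypothetical stand-in for the fetched Block entity.
    record Block(int x, int y, int z, String data) { }

    private final Map<Coord, Block> byCoord = new HashMap<>();

    // Build the index once, after the prefetch has loaded the blocks.
    public BlockIndex(Iterable<Block> blocks) {
        for (Block b : blocks) {
            byCoord.put(new Coord(b.x(), b.y(), b.z()), b);
        }
    }

    // O(1) average-case lookup; returns null if no block sits there.
    public Block find(int x, int y, int z) {
        return byCoord.get(new Coord(x, y, z));
    }
}
```

The up-front cost is one pass over the collection to build the map; after that, every co-ordinate lookup avoids touching the database or re-scanning the list. If you also need to search by other keys, you would maintain one map per key, which is essentially the "Hashtable for each searchable item" idea.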