I'm new to all this stuff (.NET, SQL, and O/R mappers), so forgive me if this is a dumb question.
I am building a web app using .NET 2.0, SQL Server 2005 and, of course, LLBLGen Pro (the latter of which I am really starting to like a lot!).
I have a design decision to make. I have a table that holds rows representing the nodes of tree data structures. The table will ultimately hold thousands of trees, with each tree having up to about 250 nodes. So you can see that there could easily be over a million records. The table will also be queried (and updated) often by users, so I want access to it to be efficient.
The decision I have to make is how to implement a query that returns the nodes of a particular tree. Some options I have thought of:
1. Use UDFs so that I can have a sort of 'parameterized view' that retrieves the desired result set (see the first sketch after this list). Pro is that filtering out the records I don't want would be done right at the database level (efficient). Con is that, from what I have read, LLBLGen doesn't support passing parameters to UDFs, so I would have to bypass my lovely LLBLGen-generated DAL for this functionality (ugly).
2. Use a view and filter the returned results in the application. Architecturally clean, but I suspect the performance would take a huge hit from pulling back every tree's nodes. Probably a non-starter, right?
3. Call a SPROC that puts the result set into a temp table and then retrieve that using an Entity or TypedView or whatever (see the second sketch below). I'm not even sure how to do that (how to tell the DAL where the result set is, i.e., the name of the just-created temp table). It also sounds slow!
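To show what I mean by option 1, here is a rough sketch of the kind of UDF I have in mind. Every object and column name here (TreeNode, TreeID, NodeID, ParentNodeID, Name) is invented for illustration; my real schema differs:

    -- Sketch of option 1: an inline table-valued UDF as a 'parameterized view'.
    -- All names are invented for illustration.
    CREATE FUNCTION dbo.GetTreeNodes (@TreeID int)
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT NodeID, TreeID, ParentNodeID, Name
        FROM   dbo.TreeNode
        WHERE  TreeID = @TreeID   -- the filtering happens at the database level
    );
    GO

    -- Used like a view that takes a parameter:
    SELECT * FROM dbo.GetTreeNodes(42);

From what I understand, an inline UDF like this gets expanded into the calling query's plan, so it should perform about the same as writing the filtered SELECT directly, which is exactly the 'parameterized view' behavior I'm after.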
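And here is roughly how I picture option 3 (again, all names invented). The part that trips me up is that a local temp table created inside the proc is dropped when the proc returns, so the only way I can see to hand the rows back is to SELECT them out as the proc's result set, at which point the temp table doesn't seem to buy me anything:

    -- Sketch of option 3: a SPROC that stages the rows in a temp table.
    -- All names are invented for illustration.
    CREATE PROCEDURE dbo.GetTreeNodesProc
        @TreeID int
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT NodeID, TreeID, ParentNodeID, Name
        INTO   #TreeNodes
        FROM   dbo.TreeNode
        WHERE  TreeID = @TreeID;

        SELECT * FROM #TreeNodes;   -- #TreeNodes is dropped as the proc returns
    END;
    GO

    -- Usage:
    EXEC dbo.GetTreeNodesProc @TreeID = 42;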
I sure wish I could do 1 (UDFs), especially since it also solves another problem for me: the depth of a retrieved node is not stored in the table but is determined by counting the number of nodes retrieved for a particular tree (it's kept normalized that way). If I retrieve the nodes for all trees and then filter the results to get just the nodes I want, that technique won't work, and so I'll be forced to denormalize (which, in turn, introduces some additional ugliness into my design).
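To make the depth point concrete, here is one way depth could be derived at retrieval time on SQL Server 2005, using a recursive CTE over invented NodeID/ParentNodeID columns (roots having a NULL ParentNodeID). This isn't my counting scheme, just an illustration that depth seems derivable once I can filter down to a single tree:

    -- Hypothetical illustration: deriving depth at retrieval time with a
    -- recursive CTE instead of storing it. All names are invented.
    DECLARE @TreeID int;
    SET @TreeID = 42;   -- the tree to fetch

    WITH NodeDepths AS
    (
        -- anchor: the root node(s) of the requested tree
        SELECT NodeID, ParentNodeID, 0 AS Depth
        FROM   dbo.TreeNode
        WHERE  TreeID = @TreeID AND ParentNodeID IS NULL

        UNION ALL

        -- recursive step: children are one level deeper than their parent
        SELECT n.NodeID, n.ParentNodeID, d.Depth + 1
        FROM   dbo.TreeNode AS n
        JOIN   NodeDepths AS d ON n.ParentNodeID = d.NodeID
        WHERE  n.TreeID = @TreeID
    )
    SELECT NodeID, Depth
    FROM   NodeDepths
    OPTION (MAXRECURSION 300);  -- default limit is 100; a 250-node tree could be deeper

I believe a CTE like this could even live inside the option-1 inline UDF (though, if I understand correctly, the MAXRECURSION hint would have to move out to the query that calls the function), which is part of why option 1 appeals to me.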
Any advice?