Regeneration touching all files, very large project getting slower
LLBL 4.2 - adapter - sql server - Database 1st
My project has gotten very large: 828 entities and 501 typed views (which I don't want). I've noticed that after a catalog refresh and code regeneration (done from a command-line batch file), the Visual Studio instance that's open while I do this bogs down my PC (pegged at 100%) for several minutes, and that's on a very fast overclocked machine with SSDs.
I have emit timestamps switched off, yet I notice the timestamp on ALL the files changes even though the contents themselves don't change and version control doesn't see any difference.
My guess is that VS detects the file changes and reprocesses all the files behind the scenes. I can't tell whether it bogs down because it's busy checking all the files or just because a few key but very large files (like FieldCreationClasses, FieldInfoProvider, etc.) have changed.
1) Is there any option I can turn on to have it be smarter and not keep changing those files' timestamps even though they haven't changed?
2) Any options to get those very large files to be broken up into individual files?
3) I am using the command line to regenerate and want it to pick up everything from my db. If I add a new table, view, etc., it auto-maps it, which is what I want. But I've also ended up with a bunch of typed views I don't want. Is there any way to have it auto-map new tables and views (just as entities), but not also generate typed views (which all end up with a 1 at the end of their name)?
Thanks!
happyfirst wrote:
1) Is there any option I can turn on to have it be smarter and not keep changing those files' timestamps even though they haven't changed?
It simply overwrites the files, so they appear newer (but the contents don't change). If you want the generator to ignore files which are already there, you can do that in the preset you're using. Add <parameter name="failWhenExistent" value="true" /> to the <parameters> element of the task which generates those files, e.g. the task which generates the entity classes.
This will make the code generator skip a file if it already exists. There's a catch: you shouldn't do this for all tasks, as some files do change when you e.g. add a new entity, like the PersistenceInfoProvider class.
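For illustration, the end result inside the preset could look roughly like the fragment below. Only the failWhenExistent parameter line is the actual value to add; the task name and the other content are placeholders, so check the preset you're actually using for the real task definitions:

    <task name="SomeEntityClassEmitter">
      <parameters>
        <!-- the task's existing parameters stay as they are -->
        <parameter name="failWhenExistent" value="true" />
      </parameters>
    </task>

Repeat this for each task whose output you never want overwritten, and leave it off the tasks that do have to be regenerated (like the one emitting the PersistenceInfoProvider class).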
2) Any options to get those very large files to be broken up into individual files?
You mean the info provider classes? No, sorry. 800+ entities is rather large though, so expect some overhead from that. However, if your model has several disconnected submodels (or submodels which e.g. use an entity E only in a read-only fashion, so you can copy it), you can consider cutting it up into groups in the designer and changing the project setting for group usage to 'As separate projects', which will generate a separate, smaller project per group. I doubt your 800+ entity model is one connected graph; that's almost never the case.
3) I am using the command line to regenerate and want it to pick up everything from my db. If I add a new table, view, etc., it auto-maps it, which is what I want. But I've also ended up with a bunch of typed views I don't want. Is there any way to have it auto-map new tables and views (just as entities), but not also generate typed views (which all end up with a 1 at the end of their name)? Thanks!
That option isn't present, sorry. We have an option 'Add new elements after relational model data sync' (sync is called refresh in v4.2) which adds new relational elements as new model elements, like tables as entities and views as typed views, and there's a setting 'Add new views as entities after relational model data sync' (refresh in v4.2), which partly covers what you want.
If you switch off the 'Add new elements after...' setting and enable the 'Add new views as entities...' setting, you partly get what you want, and you can add new tables in the designer as entities by right-clicking the database node in the catalog explorer and selecting 'Reverse engineer tables to entities'. This splits tables which are already mapped from tables which aren't, so you can easily add the new ones without getting duplicates, as they're grouped into two different groups in the dialog that pops up.
I know it's not on the command line, but it's IMHO easier than dealing with 500+ mapped elements you don't want.
Joined: 28-Nov-2008
Thanks. That was what I was worried about.
I have a few ideas I'll have to find time to experiment with. I really like how the command-line batch currently does everything I need, but unfortunately it also gives me all that excess baggage with the typed views.
I had thought about the groups. It is not one massive connected graph, but then I would often still be regenerating multiple projects, and I don't want to have to think about which project or projects to regen.
I will have to experiment with your suggested settings changes. I may also write a simple program to remove the TypedViewDefinitions and their mappings from the llblgen project file before executing the code generator. It would be nice to be able to control this better in a future release.
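Something along these lines could be a starting point (untested sketch; TypedViewDefinitions is just the name mentioned above, so the actual element names, including the ones used for the typed view mappings, would have to be verified against the real .llblgenproj schema first):

    // Hypothetical pre-generation cleanup: strips typed view definitions from the
    // project file before the command-line generator runs. Element names are
    // assumptions and must be checked against an actual .llblgenproj file.
    using System;
    using System.Linq;
    using System.Xml.Linq;

    class StripTypedViews
    {
        static void Main(string[] args)
        {
            var path = args[0]; // path to the project file (or a working copy of it)
            var doc = XDocument.Load(path);

            // Remove every element named TypedViewDefinitions, wherever it sits.
            var typedViewDefs = doc.Descendants()
                                   .Where(e => e.Name.LocalName == "TypedViewDefinitions")
                                   .ToList();
            foreach (var element in typedViewDefs)
            {
                element.Remove();
            }

            // The corresponding typed view mappings would have to be removed the
            // same way once their element name is confirmed.

            doc.Save(path);
            Console.WriteLine($"Removed {typedViewDefs.Count} TypedViewDefinitions element(s).");
        }
    }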
I want to try to figure out what is killing VS: all the timestamps changing, or just those few key but very large files.
My guess would be that all the files appear changed and it tries to reload them all. Other than that, if you want to check it out, you'd need to run a profiler on DevEnv.exe, but that will likely take forever to complete...
Hi Team, LLBLGen Pro version: 5.12. I am facing two major issues while working with LLBLGen after migrating our database from SQL Server to Oracle:
1. Stored procedure retrieval is extremely slow. Each time I perform a refresh to retrieve stored procedures, it takes more than 2 hours to complete.
2. Retrieval procedures are incorrectly marked as action procedures. After retrieval, some procedures that return result sets are detected as action procedures. Because of this, I have to manually change the Resultset value to 1 for each affected procedure after every refresh.
Note: Only the schema was migrated from SQL Server to Oracle. Kindly suggest how to resolve these issues or if there are any recommended settings or steps to improve this behavior.
Thank you.
Dile wrote:
Hi Team, LLBLGen Pro version: 5.12. I am facing two major issues while working with LLBLGen after migrating our database from SQL Server to Oracle.
Please don't post in random old threads, but start a new one next time. You asked this question via email as well, and I answered:
"I think the main issue with the stored procedures is that the designer executes all of them with default values to see if they're returning a resultset. That's on one of the tabs in the wizard you go through. If your stored procedures do a lot of work with the default values but not with others, change the default values for the various parameter types on that page in the wizard. Or if you don't have any retrieval procedures, UNCHECK them on that tab."
This is sadly something that's unavoidable IF you want the designer to determine if a stored procedure returns a resultset and what the shape of that resultset is.