- Home
- LLBLGen Pro
- LLBLGen Pro Runtime Framework
Problems with PK in Datagrids
Joined: 18-May-2006
I find that when there is a PK that the user can edit, I get the following problems with SaveMulti and entries in a DataGrid.
First: suppose there are two entities with PKs 1 and 3. If the user edits the 3 to 2 and adds a new entity with PK 3, then SaveMulti throws a duplicate key error such as:
An exception was caught during the execution of an action query: Violation of PRIMARY KEY constraint 'PK_tbAgeGrps'. Cannot insert duplicate key in object 'tbAgeGrps'.
Second: even though the datagrid is bound to the collection, the user can add duplicate foreign keys with no error being thrown.
LLBLGen's Collections are beginning to seem rather unintelligent. They do not recognise primary key errors until it is too late, and they throw primary key errors when there are none.
I've overcome the fact that Collections cannot delete removed entities from the underlying source.
The only solution to these new issues seems to me to be to do a save/refresh-if-dirty after every position change in the datagrid.
Are there any alternatives?
Bruce wrote:
I find that when there is a PK that the user can edit, I get the following problems with SaveMulti and entries in a DataGrid.
First: suppose there are two entities with PKs 1 and 3. If the user edits the 3 to 2 and adds a new entity with PK 3, then SaveMulti throws a duplicate key error such as:
An exception was caught during the execution of an action query: Violation of PRIMARY KEY constraint 'PK_tbAgeGrps'. Cannot insert duplicate key in object 'tbAgeGrps'.
Correct.
Second Even though the datagrid is bound to the collection, the user can add duplicate foreign keys with no error being thrown.
That's because the collection isn't the database, so it can't know whether the FK field values violate anything. And why are duplicate FK values a bad thing? I can have 10 order entities, all having the same CustomerID FK value!
Don't compare an entity collection with a datatable. An entity collection is a collection of objects, not a set of tabular data. This means there are no constructs like 'unique constraints on a given column', because there are no columns.
LLBLGen's Collections are beginning to seem rather unintelligent. They do not recognise primary key errors until it is too late, and they throw primary key errors when there are none.
They can't be too intelligent, because in most cases they represent the result of a set fetch, so they're not the entire set in the DB. If there's an entity with PK = 3 in the DB (but not in the collection) and you add an entity with PK = 3, it will cause a problem as well. And no, that's not something different.
To avoid users editing PK fields in a grid, make the column read-only in the grid. The collection can't do that, as the grid checks the FIRST entity in the bound set to determine whether a column is read-only or not.
I've overcome the fact that Collections cannot delete removed entities from the underlying source.
That's a choice made to avoid ambiguous behavior. IMHO solid software does what you want it to do, in a predictable, understandable way. If there can be confusion about what to expect from a given action, the software has to wait for further instructions, as it might take the wrong route (wrong as in: make the wrong choice).
The only solution to these new issues seems to me to be to do a save/refresh-if-dirty after every position change in the datagrid.
Are there any alternatives?
You should think about not editing rows in a grid but in a detail form with textboxes. I know this sounds stupid perhaps, but grids work rather oddly. As I said: if field 1 in the first row is read-only, all first fields of all rows are read-only, even if the NEW row at the bottom has that field as a WRITABLE field. This causes problems for grids that accept NEW entities, for which you obviously have to specify the PK value: you can't do that if the PK column is read-only. However, if the column isn't read-only, the user can alter the PK values of existing rows.
Joined: 18-May-2006
Thanks Otis
I meant to say that the user can add duplicate PKs with no error being thrown!
I often do edits of datagrid data using an edit form; but there are a lot of cases when that is irritating for the user. This is particularly true when the grid (as in this case) only has a very few columns or when there are only a few editable columns or where the columns are lists of figures.
Unfortunately, it is no good to tell a client that he can’t edit in a datagrid because it is difficult to programme and doesn't fit very well with using entity collections. What does he care about entity collections? If he wants to edit in the data grid and I can't achieve that then he'll find a programmer that can or go back to using his spreadsheets.
So I need to figure out how to achieve this.
It seems to me that the same requirements arise whether I allow the user to edit the grid directly or by using an edit form except that in the first case I have to make use of grid events and in the second case I can use form events.
I see from your replies to some other similar questions that editable PKs are inconvenient for the Entity-Collection system. Do the Collections use the db PKs to track entities? If so, I cannot understand how they manage to track relationships when new 1:m entities are created and multisaved because the PKs and FKs all appear to be 0 prior to the save.
I am thinking about adding an Identity field as PK. I still need to enforce uniqueness on the combination of AgeGrpSysID and AgeGrpCode because the Code must be editable and must be unique in each system. Will the new Identity-PK make things easier?
I have to figure out how to enforce this uniqueness. In the circumstances where this application will be used and for these particular tables I can be sure that only one person will edit them at a time. The content is quite small. Use over the last 3 years has resulted in 3 rows in the AgeGrpSystems table and 32 rows in the AgeGroups table. The client (a Market Research company) may never need to add a new Age Group System, but I do want them to be able to do so without having to ask me to reprogram so I’ve created the tables.
In these circumstances it is enough to enforce uniqueness in the Collection as pulled from the database because I know that this will always be the entire set of records. To deal with the unlikely situation that two users did edit these entries simultaneously and break the unique constraint in the database, I will need to catch the exception and roll back the transaction with a suggestion to the user to re-load the data and try again. (I think I’d need to roll the deletes and multisave into a single transaction for this to work properly)
In other collections where similar situations occur but multiple user edit is likely, if I want to alert the user to a problem before he leaves the row (or the edit form) then obviously I will have to check back to the database at the end of each row-edit to ensure that uniqueness is maintained and then save the row immediately if there is no problem. Alternatively, if using multisave, then before saving I will have to do all the delete operations then validate the whole of the grid contents against the database and flag any uniqueness errors back to the grid.
The question of whether to insert/update/delete back to the database on a row by row basis or in a multisave seems to me to be about performance. I think that in any situation where it is necessary to impose uniqueness constraints on editable data it will be easier to programme a satisfactory solution using row-by-row updates but fear that this may impact upon performance.
I will be grateful for any examples or advice on how to achieve these aims.
On page 17 of the book, ‘Rapid C# Windows Development’, it says: ‘Remember not to overlook fields that are unique and might serve as a natural primary key, as you want to prevent duplicate data wherever possible with good schema design’. It was on the basis of this statement that I actually removed the Identity field PK from my table and utilised the combination PK which contained the editable AgeGrpCode field which (I think) is the natural PK. Would you agree that the book is misleading here?
On page 17 of the book, ‘Rapid C# Windows Development’, it says: ‘Remember not to overlook fields that are unique and might serve as a natural primary key, as you want to prevent duplicate data wherever possible with good schema design’. It was on the basis of this statement that I actually removed the Identity field PK from my table and utilised the combination PK which contained the editable AgeGrpCode field which (I think) is the natural PK. Would you agree that the book is misleading here?
Totally agree.
The question of whether to insert/update/delete back to the database on a row by row basis or in a multisave seems to me to be about performance
I'd go for the multiSave.
I still need to enforce uniqueness on the combination of AgeGrpSysID and AgeGrpCode
Do it manually: either in a cellChanged event, where you check the other cells in the same column for similar values and pop up a message when a duplicate is found, or just before saving all the rows, with a nice message. For a nicer user experience you could highlight the duplicate cells (change the forecolor to red, for example).
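The pre-save check described above is framework-agnostic, so here is a minimal sketch of the idea in Python (the row structure and field names `AgeGrpSysID`/`AgeGrpCode` come from this thread; the function itself is illustrative, not LLBLGen or XtraGrid API). It walks the in-memory rows once and reports every row index that participates in a duplicate of the composite key, so the UI can highlight those cells:

```python
def find_duplicate_rows(rows, key_fields):
    """Return sorted indices of rows whose key_fields tuple occurs more than once."""
    seen = {}        # key tuple -> index of first row with that key
    duplicates = []
    for i, row in enumerate(rows):
        key = tuple(row[f] for f in key_fields)
        if key in seen:
            duplicates.append(i)
            if seen[key] not in duplicates:
                duplicates.append(seen[key])  # flag the first occurrence too
        else:
            seen[key] = i
    return sorted(duplicates)

rows = [
    {"AgeGrpSysID": 1, "AgeGrpCode": "A"},
    {"AgeGrpSysID": 1, "AgeGrpCode": "B"},
    {"AgeGrpSysID": 1, "AgeGrpCode": "A"},  # duplicate of row 0
    {"AgeGrpSysID": 2, "AgeGrpCode": "A"},  # different system, so OK
]
print(find_duplicate_rows(rows, ("AgeGrpSysID", "AgeGrpCode")))  # [0, 2]
```

Note this only validates the data currently in the grid; as discussed later in the thread, another user may still have saved a conflicting value in the database.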
Bruce wrote:
Thanks Otis
I meant to say that the user can add duplicate PKs with no error being thrown!
I often do edits of datagrid data using an edit form; but there are a lot of cases when that is irritating for the user. This is particularly true when the grid (as in this case) only has a very few columns or when there are only a few editable columns or where the columns are lists of figures.
Unfortunately, it is no good to tell a client that he can’t edit in a datagrid because it is difficult to programme and doesn't fit very well with using entity collections. What does he care about entity collections? If he wants to edit in the data grid and I can't achieve that then he'll find a programmer that can or go back to using his spreadsheets.
So I need to figure out how to achieve this.
It all comes down to this: a user has edited a field, and you go through the complete grid and check every value in that same column to see if the new value already occurs there.
This is where datatables are different: they have a tabular structure focussed on columns; entity objects do not. That can be a bit problematic in this case.
It seems to me that the same requirements arise whether I allow the user to edit the grid directly or by using an edit form except that in the first case I have to make use of grid events and in the second case I can use form events.
I see from your replies to some other similar questions that editable PKs are inconvenient for the Entity-Collection system. Do the Collections use the db PKs to track entities? If so, I cannot understand how they manage to track relationships when new 1:m entities are created and multisaved because the PKs and FKs all appear to be 0 prior to the save.
You have to ask yourself: do I want to have non-artificial PK values? You should ONLY use non-artificial PK values if they have a meaning and are really unique. In most cases this isn't the case, and if you let the user fill in the PK, you WILL run into typo problems: the user discovers that a typo has been made in the PK field and wants to correct it. This is a PK change, which is always destructive for your data.
I am thinking about adding an Identity field as PK. I still need to enforce uniqueness on the combination of AgeGrpSysID and AgeGrpCode because the Code must be editable and must be unique in each system. Will the new Identity-PK make things easier?
You then won't have the problems with modified PK fields. The UC (unique constraint) is enforced when the save occurs. You have to be prepared for that; it's unavoidable: the data can be unique in the grid, but the save can still fail because someone else has already saved the same value in the unique column.
I have to figure out how to enforce this uniqueness. In the circumstances where this application will be used and for these particular tables I can be sure that only one person will edit them at a time. The content is quite small. Use over the last 3 years has resulted in 3 rows in the AgeGrpSystems table and 32 rows in the AgeGroups table. The client (a Market Research company) may never need to add a new Age Group System, but I do want them to be able to do so without having to ask me to reprogram so I’ve created the tables.
In these circumstances it is enough to enforce uniqueness in the Collection as pulled from the database because I know that this will always be the entire set of records. To deal with the unlikely situation that two users did edit these entries simultaneously and break the unique constraint in the database, I will need to catch the exception and roll back the transaction with a suggestion to the user to re-load the data and try again. (I think I’d need to roll the deletes and multisave into a single transaction for this to work properly)
Be prepared for stale data: you fetch the data in one process and another user alters the data. In the first process you won't see that change until the data is saved, and your unique constraint will then fail despite the fact that you've checked up front.
In other collections where similar situations occur but multiple user edit is likely, if I want to alert the user to a problem before he leaves the row (or the edit form) then obviously I will have to check back to the database at the end of each row-edit to ensure that uniqueness is maintained and then save the row immediately if there is no problem. Alternatively, if using multisave, then before saving I will have to do all the delete operations then validate the whole of the grid contents against the database and flag any uniqueness errors back to the grid.
You can't achieve this when the user leaves the row. You might warn the user that a problem exists, but that problem can have gone away by the time you save the row, because another user has removed the duplicate from the db. The reverse can also happen.
On page 17 of the book, ‘Rapid C# Windows Development’, it says: ‘Remember not to overlook fields that are unique and might serve as a natural primary key, as you want to prevent duplicate data wherever possible with good schema design’. It was on the basis of this statement that I actually removed the Identity field PK from my table and utilised the combination PK which contained the editable AgeGrpCode field which (I think) is the natural PK. Would you agree that the book is misleading here?
It's not misleading, it's A point of view, not THE point of view. I didn't write the book. In general I was in favor of natural keys, but a few years back I realized natural keys have problems as well and are hard to find. Automatic keys are then a better choice, as you avoid most of the problems there.
Keep in mind that the debate about natural keys vs. artificial keys is an endless battle, similar to stored procs vs. dynamic SQL.
Joined: 18-May-2006
I've struggled with this all afternoon. I've been through my database and added Identity type PKs wherever they are missing, restructured relationships, and regenerated.
In my AgeGrp table I now have AgeGrpID (PK) which is uneditable, AgeGrpSysID (FK) which is uneditable, and AgeGrpCode which is editable. In the db I have a unique index on AgeGrpSysID and AgeGrpCode. I still find that if I edit AgeGrpCode in certain ways (for example, I have codes 1, 2, 4 and I change 4 to 3 and then add a new entry with code 4), then SaveMulti will fail even though there is actually nothing wrong with the Collection and there never has been. Clearly the changed entities in the collection are not being saved in the order in which they were changed. It seems that the new 4 is being added to the database before the old one has been changed to 3.
It therefore seems to me that it is impossible to use SaveMulti in circumstances where the User can change a field that is part of a unique index.
Checking the collection to see if it contains duplicates is no use because failure/non-failure will depend on the order in which the underlying database fields are updated and with SaveMulti the developer has no control over this.
I think that the only way to ensure that a multiple save would work (assuming that no changes had been made by other users) would be to delete all edited and deleted records from the db and then add all new and edited records back into the db.
Joined: 18-May-2006
If 1,2,4 are already in the db and the user opens the form and changes the 4 to 3 and adds a new 4, when the save multi tries to save the 4 there is already a 4 in the db and the save throws a duplicate index error.
This is only a tiny part of my application. Most of the application performs actions on preloaded lists of contact information so these kind of problems do not arise. It is only in a few look-up tables that this indexing problem arises. These tables basically contain definitions of the codes used to categorise things in Market Research. Most of the coding systems apply to all projects. New coding systems are only ever set up at the beginning of a project. Once the project starts they cannot be changed ever. If you have a system where 6 means African then once you have used that system in a project you can never change 6 in that system to mean Chinese because it will invalidate the previous results.
I've decided that for these lookup tables I will do the updates with stored procs that will just delete and replace the whole of any section that gets edited. So with my Age Grouping tables, if a user edits a single item in one Age Group System, then all the Groups in that system will be deleted and replaced. The last person to edit and save an Age Group System will overwrite anyone else's changes. I can then control the unique index in the front-end application to eliminate errors on saving. I'll also set a flag for each system to indicate whether it is unused and editable or used and non-editable.
I just bought LLBLGen a week ago so am only just beginning to get my head around what it can and can't do.
The forum has helped me to clarify this problem and find a workable solution. Thanks Otis and Walaa.
Bruce wrote:
If 1,2,4 are already in the db and the user opens the form and changes the 4 to 3 and adds a new 4, when the save multi tries to save the 4 there is already a 4 in the db and the save throws a duplicate index error.
Aha, ok, that's indeed a problem. You can overcome this by using 2 UnitOfWork objects: add the delete to one and the insert to the other. Then create a DataAccessAdapter (or Transaction, if you're using SelfServicing) and start a transaction; first commit the delete UnitOfWork, then the insert/save one.
It's manual scheduling, but there's no way the O/R mapper core can figure out that the intention is to first delete and then insert.
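The two-unit-of-work trick above is really a generic scheduling pattern, so here is a language-agnostic sketch of it in Python (the `UniqueTable` class and the queues are illustrative stand-ins for the database and for LLBLGen's UnitOfWork, not real API). The point is that running the destructive work first lets a freed-up unique value be re-used by a new row:

```python
class UniqueTable:
    """A toy table enforcing a unique constraint on insert, like the db's unique index."""
    def __init__(self, values):
        self.values = set(values)

    def insert(self, v):
        if v in self.values:
            raise ValueError(f"duplicate key {v}")
        self.values.add(v)

    def update(self, old, new):
        self.values.discard(old)
        self.insert(new)

table = UniqueTable({1, 2, 4})

# Queue the work in two separate "units of work":
delete_or_update_uow = [lambda t: t.update(4, 3)]  # user changed code 4 -> 3
insert_uow = [lambda t: t.insert(4)]               # user added a brand-new code 4

# Commit the destructive unit first, then the inserts: succeeds.
for action in delete_or_update_uow + insert_uow:
    action(table)

print(sorted(table.values))  # [1, 2, 3, 4]
# Running insert_uow first would raise "duplicate key 4": the same
# collision SaveMulti hits when it happens to pick that order.
```

In the real framework both commits would of course run inside one database transaction, so a failure in either unit rolls back the whole batch.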
This is only a tiny part of my application. Most of the application performs actions on preloaded lists of contact information so these kind of problems do not arise. It is only in a few look-up tables that this indexing problem arises. These tables basically contain definitions of the codes used to categorise things in Market Research. Most of the coding systems apply to all projects. New coding systems are only ever set up at the beginning of a project. Once the project starts they cannot be changed ever. If you have a system where 6 means African then once you have used that system in a project you can never change 6 in that system to mean Chinese because it will invalidate the previous results.
I've decided that for these lookup tables I will do the updates with stored procs that will just delete and replace the whole of any section that gets edited. So with my Age Grouping tables, if a user edits a single item in one Age Group System, then all the Groups in that system will be deleted and replaced. The last person to edit and save an Age Group System will overwrite anyone else's changes. I can then control the unique index in the front-end application to eliminate errors on saving. I'll also set a flag for each system to indicate whether it is unused and editable or used and non-editable.
I just bought LLBLGen a week ago so am only just beginning to get my head around what it can and can't do.
The forum has helped me to clarify this problem and find a workable solution. Thanks Otis and Walaa.
In these situations it's indeed often easier to erase what's there and re-insert the new state. A similar issue occurs when you, for example, manage role rights for a given role: you uncheck some rights and check others, and in the end you either do full maintenance on the existing data, or you remove the existing data and re-insert the new state, which is much less code and gives the same result.
Joined: 18-May-2006
Well, in the end I decided to save/delete datagrid entries as they occur, as this gives me a model that I can use in more situations than the delete-and-replace-it-all solution.
As it has taken me 9 days to achieve this (but next time it'll only take me 15 mins), I thought I'd post the code (use it at your own risk) to help any other newbies. Perhaps we should have a code bank?
'Visual Studio 2005 + DevExpress XtraGrid + LLBLGen Pro
'Windows Forms application.
'1:m tables where there is a composite unique index in the
'm table and part of it is user-editable.
'Save entries in the m table as they are made, throwing
'duplicate-errors back to the datagrid.
Imports SurveyAdmin.Data
Imports SurveyAdmin.Data.EntityClasses
Imports SurveyAdmin.Data.CollectionClasses
Imports SurveyAdmin.Data.HelperClasses
Imports SurveyAdmin.Data.FactoryClasses
Imports SD.LLBLGen.Pro.ORMSupportClasses
Imports DevExpress.XtraGrid.Views.Grid
Imports DevExpress.XtraGrid.Columns
'I just import everything I might need and
'copy the list from form to form.
'A lot of the above imports are not actually necessary
'for the following code.
Public Class fmAgeGroupings
'Set up a UnitOfWork to deal with deletes,
'because entities deleted from a collection
'are not automatically deleted from the database
'when the collection is saved.
Dim uow As UnitOfWork
Private Sub fmAgeGroupings_Load(ByVal sender As Object, _
ByVal e As System.EventArgs) Handles Me.Load
'Instantiate the UnitOfWork for AgeGrpSys deletes
uow = New UnitOfWork
'Fill the AgeGrpSys collection
Me.AgeGrpSysColl.GetMulti(Nothing)
'Set up the binding source for the AgeGrpSysCollection
Me.AgeSysBindSrc.DataSource = Me.AgeGrpSysColl
'Set up the binding source for the AgeGrp collections
'that provide data to the XtraGrid
Me.AgeGrpBindSrc.DataSource = Me.AgeSysBindSrc
'Do the bindings to the binding sources
Me.AgeGrpBindSrc.DataMember = "AgeGrp"
Me.txtAgeGrpSysName.DataBindings.Add( _
New Binding("Text", Me.AgeSysBindSrc, "AgeGrpSys", True))
Me.AgeSysBindNavig.BindingSource = Me.AgeSysBindSrc
Me.AgeGrpDataGrid.DataSource = Me.AgeGrpBindSrc
End Sub
Private Sub fmAgeGroupings_FormClosing( _
ByVal sender As Object, _
ByVal e As System.Windows.Forms.FormClosingEventArgs) _
Handles Me.FormClosing
'Unbind grid so that it is detached from saving operations
If Not Me.AgeGrpDataGrid.DataSource Is Nothing Then
Me.GridView1.CloseEditor()
Me.AgeGrpDataGrid.DataSource = Nothing
Me.AgeSysBindNavig.BindingSource = Nothing
Me.txtAgeGrpSysName.DataBindings.RemoveAt(0)
Me.AgeSysBindSrc.DataMember = Nothing
Me.AgeGrpBindSrc.DataSource = Nothing
Me.AgeSysBindSrc.DataSource = Nothing
End If
'Save the AgeGrpSys deletes to the db
'In the db the relationship between AgeGrpSys and AgeGrp has
'DeleteCascade set, so the db will delete any AgeGrps belonging
'to an AgeGrpSys.
uow.Commit(New Transaction(IsolationLevel.ReadCommitted, "UOW"), True)
'Save changes to AgeGrpSys entities back to db
'Recursion false because changes to AgeGrps
'have already been saved as they were made
Me.AgeGrpSysColl.SaveMulti(False)
End Sub
Private Sub AgeGrpSysColl_BeforeRemove( _
ByVal sender As Object, _
ByVal e As System.EventArgs) _
Handles AgeGrpSysColl.BeforeRemove
'When an AgeGrpSys is deleted add this to the UOW
uow.AddForDelete(sender)
End Sub
Private Sub AgeSysBindSrc_PositionChanged( _
ByVal sender As Object, _
ByVal e As System.EventArgs) _
Handles AgeSysBindSrc.PositionChanged
If Me.AgeSysBindSrc.Position > -1 Then
'This sets the AllowRemove state of each AgeGrp collection as it is loaded
'Necessary because the default state is False
Me.AgeGrpSysColl(Me.AgeSysBindSrc.Position).AgeGrp.AllowRemove = True
End If
End Sub
Private Sub cmnGridDelRow_Click( _
ByVal sender As Object, _
ByVal e As System.EventArgs) _
Handles cmnGridDelRow.Click
'Call proc to delete a row when this item is
'clicked in my context menu
delCurrentRow()
End Sub
Private Sub GridView1_InvalidRowException( _
ByVal sender As Object, _
ByVal e As DevExpress.XtraGrid.Views.Base.InvalidRowExceptionEventArgs) _
Handles GridView1.InvalidRowException
'Suppress displaying the error message box
e.ExceptionMode = DevExpress.XtraEditors.Controls.ExceptionMode.NoAction
End Sub
Private Sub GridView1_ShowGridMenu( _
ByVal sender As Object, _
ByVal e As DevExpress.XtraGrid.Views.Grid.GridMenuEventArgs) _
Handles GridView1.ShowGridMenu
'This is for the context menu.
'It determines whether the user has right-clicked in a row
'and, if so, shows the context menu
'This is the VS2005 context menu that I dragged onto the form designer
'This Sub is specific to DevExpress XtraGrid but similar functionality
'can be found in Visual Studio's own GridView
Dim View As GridView = CType(sender, GridView)
Dim HitInfo As ViewInfo.GridHitInfo
HitInfo = View.CalcHitInfo(e.Point)
If HitInfo.InRow Then
View.FocusedRowHandle = HitInfo.RowHandle
Me.cmnGrid.Show(View.GridControl, e.Point)
End If
End Sub
Private Sub delCurrentRow()
'Give user the opportunity to cancel the delete
If MessageBox.Show("Do you really want to delete the record?", _
"Warning", MessageBoxButtons.YesNo, _
MessageBoxIcon.Warning) = _
Windows.Forms.DialogResult.Yes Then
'Delete the Agegrp from the underlying db
Me.AgeGrpSysColl(Me.AgeSysBindSrc.Position).AgeGrp( _
Me.AgeGrpBindSrc.Position).Delete()
'Delete the Agegrp from the Grid and Collection
Me.GridView1.DeleteRow(Me.GridView1.FocusedRowHandle)
End If
End Sub
Private Sub GridView1_ValidateRow( _
ByVal sender As Object, _
ByVal e As DevExpress.XtraGrid.Views.Base.ValidateRowEventArgs) _
Handles GridView1.ValidateRow
Dim view As GridView = CType(sender, GridView)
Dim errMsg As String
'Try to save an updated row in the grid to the db
Try
'Use SaveMulti on the top-level Collection (AgeGrpSys)
'because this will pull the ID for new entries
'and propagate them to the sub-entities (AgeGrp) as soon as the
'first new AgeGrp is added to the new AgeGrpSys.
'I believe SaveMulti will efficiently save
'only dirty entities.
Me.AgeGrpSysColl.SaveMulti(True)
Catch ex As Exception
'This will put a red blob in the AgeGrpCode column of the
'datagrid if there is an error in trying to save the AgeGrp.
'Hovering the mouse over the blob will show the defined msg.
'Other implicit errors, such as wrong data for the field type,
'will still show automatically.
'Again this is XtraGrid-specific, but similar functionality
'can be harnessed for Visual Studio's own gridview.
e.Valid = False
If ex.Message.Contains("duplicate") Then
errMsg = "Duplicate Code"
Else
errMsg = "Unable to save this Row"
End If
view.SetColumnError(colAgeGrpCode, errMsg)
End Try
End Sub
End Class