Detaching or refreshing an entity in a DataScope
Joined: 11-Feb-2008
With a detached client using a DataScope (Adapter/5.8), if the client detects it needs to reload an entity and wants the updated entity added to the DataScope, this looks a bit hard because the underlying context checks during Attach whether that entity is already present, and if so the Attach becomes a no-op. Can we have a Detach or a Refresh method?
Right now I'm trying to get the private context (DataScopeContext) out of the DataScope via reflection so I can call Context.Remove.
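For reference, this is roughly what I mean; the private member name below is a guess on my part, not a documented API:
// sketch only (requires System.Reflection); "_containedContext" is a guess at DataScope's
// private member name, so this may well not match the actual implementation.
void RemoveFromScope(DataScope scope, IEntity2 entityToRemove)
{
    var contextField = typeof(DataScope).GetField("_containedContext", BindingFlags.Instance | BindingFlags.NonPublic);
    if(contextField == null)
    {
        return; // guessed member name was wrong
    }
    // assuming the DataScopeContext instance derives from Context, so Remove is available
    var context = contextField.GetValue(scope) as Context;
    context?.Remove(entityToRemove);
}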
Thanks,
Joined: 11-Feb-2008
Small update: using Context.Remove is not that easy, as it relies on ObjectID whereas Attach uses the private Context.Find, which does a fields comparison to detect equality. So I've ended up with this chunk of reflection and boxes. Any suggestions on how to do this better are much appreciated.
void RemoveExistingEntities(IEntityCollectionCore collection)
{
    var map = new ReferencedEntityMap(collection);
    var typeDict = (Dictionary<Type, Dictionary<int, Dictionary<Guid, IEntityCore>>>)entityTypeHashtablesPropertyInfo.GetValue(context);
    foreach (var e in map.GetSeenEntities())
    {
        if (e.IsNew)
            continue; // this shouldn't happen anyway as we come from a fetch
        if (typeDict.TryGetValue(e.GetType(), out var entityDict))
        {
            foreach (var contextEntityDict in entityDict.Values.OfType<Dictionary<Guid, IEntityCore>>())
            {
                var contextEntity = contextEntityDict.Values.FirstOrDefault() as EntityBase2;
                if (contextEntity == null)
                    continue;
                if (e.Fields.PrimaryKeyFields.Count != 1)
                    throw new Exception("primary key count not 1 for " + e.GetType());
                var pkField = e.Fields.PrimaryKeyFields[0];
                if (contextEntity.Fields[pkField.FieldIndex].CurrentValue.Equals(e.Fields[pkField.FieldIndex].CurrentValue))
                    context.Remove(contextEntity);
            }
        }
    }
}
Joined: 17-Aug-2003
The entity is the same object as the one that's in the DataScope, correct? So fetching that entity again will use the ActiveContext of the entity (which is the DataScope's context btw), so the fetch of the entity object will refresh the entity's values automatically. You don't need to do anything other than fetch the same entity. So instead of fetching a new entity, fetch the existing one (adapter.FetchEntity(myEntityThatsAlreadyInTheDataScope)).
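In code, that's simply something like this (a minimal sketch, assuming a normal connected adapter; the variable name is illustrative):
using(var adapter = new DataAccessAdapter())
{
    // refetching the instance that's already in the DataScope refreshes that same object's
    // field values in place, as its ActiveContext is the scope's context.
    adapter.FetchEntity(myEntityThatsAlreadyInTheDataScope);
}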
If this is what you're doing and it doesn't work, please show some example code so we know what you're doing when refetching the entity that's already in the datascope.
Joined: 11-Feb-2008
This is a detached client, so there is no adapter; that's on the server. FetchDataAsyncImpl looks like this:
protected override Task<bool> FetchDataAsyncImpl(CancellationToken cancellationToken, params object[] fetchMethodParameters)
{
    var handler = new HttpClientHandler();
    handler.CookieContainer = cookieContainer;
#if DEBUG
    var timeout = TimeSpan.FromMinutes(10);
#else
    var timeout = TimeSpan.FromMinutes(2);
#endif
    HttpClient client = new HttpClient(handler) { BaseAddress = new Uri(Config.Instance.BaseUri), Timeout = timeout }; // create in LoginContext
    var tcs = new TaskCompletionSource<bool>();
    var query = (ILoadOp)fetchMethodParameters[0];
    var collection = fetchMethodParameters[1];
    var tResponse = client.GetAsync(query.QueryWithParameters());
    IsLoading = true;
    tResponse.ContinueWith((task, o) =>
    {
        query.IsComplete = true;
        if (tResponse.IsFaulted)
        {
            IsLoading = false;
            if (TimeoutHelper.FailedAuthentication(tResponse.Exception)) // TODO: does this work?
            {
                query.AuthenticationTimeoutHandled = true;
                tcs.SetResult(false);
                return;
            }
            tcs.SetException(tResponse.Exception);
            IsLoading = false;
            return;
        }
        if (task.IsCanceled)
        {
            tcs.SetException(new Exception("get operation was canceled")); // query has IsCanceled but its not set....
            return;
        }
        var t = task.Result.Content.ReadAsByteArrayAsync();
        t.ContinueWith((ta, o2) =>
        {
            IsLoading = false;
            if (ta.IsFaulted)
            {
                tcs.SetException(ta.Exception);
                return;
            }
            var ds = new FastDeserializer();
            try
            {
                ds.Deserialize(ta.Result, collection);
            }
            catch (Exception e)
            {
                IsLoading = false;
                tcs.SetException(new Exception("Could not deserialize query result " + query.Query + " " + UtfDecode(ta.Result), e));
                return;
            }
            RemoveExistingEntities((IEntityCollectionCore)collection);
            Attach((IEntityCollectionCore)collection);
            tcs.SetResult(true);
        }, null);
    }, null);
    return tcs.Task;
}
In the simple case there is one entity, but in reality of course the server often includes a prefetch for a graph of entities.
Joined: 17-Aug-2003
Hmm. Ok, so the entity you have on the client has an ActiveContext property, which is the datascope's context. So you can use that instead of reflection.
What the fetch-with-a-context functionality does (which updates an existing entity object with the values obtained from the database, which is what you want too) is simply this:
// update the fields on a previous entity object fetched with the same PK, if it's in the current context, with the fields just fetched
activeContext.Get(entityToFetch);
So I think what you can do is:
entityOnClient.ActiveContext.Get(entityFromServer)
and it'll update entityOnClient's fields with the values in entityFromServer. It should do an Equals-based find on PK values as the ObjectID is different, of course.
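Roughly, applied to a freshly deserialized collection, that would look like this (a sketch only; the Get call is the one shown above, the rest of the names and overloads are illustrative):
// sketch: push server values into the instances the scope's context already tracks.
// 'scopeContext' is assumed to be the ActiveContext of an entity already in the scope.
void MergeIntoScope(IEnumerable<IEntity2> fetchedFromServer, Context scopeContext)
{
    foreach(var entityFromServer in fetchedFromServer)
    {
        // finds the tracked instance with the same PK (Equals-based, not ObjectID compare)
        // and updates its fields with the values just fetched
        scopeContext.Get(entityFromServer);
    }
}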
Would that solve your problem?
Joined: 11-Feb-2008
I'm not sure. I don't think so entirely.
Let's say the client has in its context:
Entity1
Entity2
Entity3
All of the same type as previously fetched, and these are in an EntityCollection<T>.
Entity1 is updated and the context saved. A fetch then occurs to get everything up to date (other users may have changed the database, or server work or triggers may have created new entities).
The new fetch contains:
Entity1 (with updates by this user)
Entity2
Entity3
Entity4 (added by some other user)
I can update the entities in the context with your approach and add Entity4, so the context/DataScope will be fine. But maybe I still need an EntityCollection<T>; my old one is no good as it does not contain Entity4. Same for any other collections in the graph. The one fetched from the database contains all the entities, but its entities are not the ones in the DataScope.
Joined: 11-Feb-2008
Here's the simplest test I can think of, to give us something to talk around. Zip attached.
public static string SavedAccountNumber { get; set; }
[Test]
public void SaveCollection()
{
var scope = new UpdateDataScope();
Assert.IsTrue(scope.FetchData());
var customers = scope.Customers; // get the first 2 entities
// make some change to the first entity
customers[0].AccountNumber = "Foo";
// Save the collection
// UnitOfWork2 uow = null;
// use dummy commit function
Func<IUnitOfWorkCore, bool> commitFunc = a =>
{
SavedAccountNumber = null;
var uow = (UnitOfWork2)a;
uow.ConstructSaveProcessQueues();
// just look at updates for this test
foreach (var e in uow.GetUpdateQueue())
{
SavedAccountNumber = ((SalesOrderHeaderEntity)e.Entity).AccountNumber;
}
return true;
};
scope.CommitChanges(commitFunc);
// we made changes so there should be work to do :
Assert.IsTrue(SavedAccountNumber == "Foo");
// and refresh the collection
scope.FetchData();
// get the latest customers
customers = scope.Customers;
customers[0].AccountNumber = "Bar";
scope.CommitChanges(commitFunc);
// Should be Bar
// but isn't as the entities in the collection aren't the same as the ones in the scope
Assert.IsTrue(SavedAccountNumber == "Bar");
}
// scope:
public class UpdateDataScope : DataScope
{
#region Class Member Declarations
public EntityCollection<SalesOrderHeaderEntity> Customers { get; set; }
#endregion
protected override bool FetchDataImpl(params object[] fetchMethodParameters)
{
// simulate a call to the backend, where we would normally do something like
//
// var collection = fetchMethodParameters[1];
// var ds = new FastDeserializer();
// ds.Deserialize(someByteArray, collection);
//
// But we'll just have a single collection which we'll expose on the DataScope
var customers = new EntityCollection<SalesOrderHeaderEntity>();
customers.Add(new SalesOrderHeaderEntity
{
IsNew = false,
SalesOrderId = 1,
});
customers.Add(new SalesOrderHeaderEntity
{
IsNew = false,
SalesOrderId = 2,
});
customers[0].IsDirty = false;
customers[1].IsDirty = false;
this.Attach(customers);
// normally we'd return the collection through fetchMethodParameters
// but for this test we'll return through a property
Customers = customers;
return customers.Count > 0;
}
}
Filename | File size | Added on | Approval |
---|---|---|---|
DataScopeUow.zip | 98,367 | 09-Dec-2021 21:21.56 | Approved |
Joined: 17-Aug-2003
Assert.IsTrue(SavedAccountNumber == "Bar");
I don't follow this code. Why does this have to be true? It's a static variable you don't set to "Bar". Also, what is it exactly that you want me to look at?
Joined: 11-Feb-2008
That static represents the database, and it's set here:
customers[0].AccountNumber = "Bar";
And then "stored in the database" in the commitFunc
The problem is how does the client get a collection after the second fetch that can be updated and committed to the database.
Joined: 17-Aug-2003
Your code didn't compile, I had to fix up the references. Your code also seems to contain 2 pieces of code: one in the test and one in Program. In Program a context is defined which has a bad Attach method. I reckon I shouldn't use that one?
I'll ignore the code in 'Program' and will check the code in the test. I hope the entities match my AdventureWorks db.
Joined: 17-Aug-2003
I think the problem is that an entity class instance gets a new ObjectID, and when attaching, the context doesn't find that ObjectID (so it assumes a new instance) and then tries to find an entity with the same PK. If it finds one, it'll skip it, as that's designed behavior (you're effectively trying to add a duplicate).
So Attaching the set of entities you fetched again, which contains duplicates, will only attach the ones which aren't already in the scope (which is designed behavior). Normally this is fine, as entity instances aren't coming from a remote source. In your case however, they do, and fetching with the same context isn't possible, therefore you'll always get new instances and you can't update them in the context as Attach always calls 'Add' on the context in the datascope.
So I get the feeling the entity class instances you have in the second fetch should be removed if the data inside them is already in an entity class instance in the datascope and the entity class instances in the scope should be updated with the data in the new entity class instances.
Can't you just reset the scope and attach the newly fetched entity instances?
In any case, to sync an existing entity class instance with newer data from a copy you try to add, the context does existingEntity.Fields = copyInstance.Fields;
So to overcome this, what could be done is:
// scope:
public class UpdateDataScope : DataScope
{
#region Class Member Declarations
public EntityCollection<SalesOrderHeaderEntity> Customers { get; set; }
#endregion
protected override bool FetchDataImpl(params object[] fetchMethodParameters)
{
// simulate a call to the backend, where we would normally do something like
//
// var collection = fetchMethodParameters[1];
// var ds = new FastDeserializer();
// ds.Deserialize(someByteArray, collection);
//
// But we'll just have a single collection which we'll expose on the DataScope
var customers = new EntityCollection<SalesOrderHeaderEntity>();
customers.Add(new SalesOrderHeaderEntity
{
IsNew = false,
SalesOrderId = 1,
});
customers.Add(new SalesOrderHeaderEntity
{
IsNew = false,
SalesOrderId = 2,
});
customers[0].IsDirty = false;
customers[1].IsDirty = false;
// first attach
this.Attach(customers);
// then make sure duplicates which are ignored by attach have their data inserted in the copies already in the context.
UpdateCustomers(customers);
return customers.Count > 0;
}
private void UpdateCustomers(EntityCollection<SalesOrderHeaderEntity> newInstances)
{
if(this.Customers == null)
{
this.Customers = newInstances;
return;
}
var customerPerPk = this.Customers.ToDictionary(c => c.SalesOrderId);
var toAdd = new List<SalesOrderHeaderEntity>();
foreach(var c in newInstances)
{
if(customerPerPk.TryGetValue(c.SalesOrderId, out SalesOrderHeaderEntity soh))
{
soh.Fields = c.Fields;
}
else
{
// new, add afterwards
toAdd.Add(c);
}
}
this.Customers.AddRange(toAdd);
}
}
It's a workaround for the situation where you have new entity class instances which potentially contain the same entity instance (i.e. the same data), obtained from an external source, so you can't reuse the context for fetching and the ObjectIDs are therefore different.
The main issue, I think, is that the datascope isn't a local cache that you can sync in both directions; it's a system to make it easier to determine what to persist in one direction, from GUI to database. Hence there's no logic to update the scope with a new 'state of affairs' of the entities currently contained in the datascope, as it assumes the current state is what's inside the datascope. I.e. when you delete an entity on the server, there's no logic to remove that entity from the datascope through some 'sync' action in the direction database -> datascope, as it's not designed for that use case.