Devildog74 wrote:
Actually, that is one of the challenges that the people at Microsoft haven't quite overcome yet, and it is a bit cumbersome.
Basically, when the runtime deserializes a workflow, its assembly is loaded using the standard assembly location techniques (probing, the GAC, etc.). When making changes to workflows you have to use due care when deploying. At this point the recommended approach is to deploy to the GAC or to modify the probing paths, but I haven't been able to get the probing-path functionality worked out.
For example, say I have workflows that were created under v1.0.1 and they are still waiting for completion. I can deploy v1.0.2 and run new workflows with the code in v1.0.2, but the v1.0.1 assembly still needs to live on the machine somewhere.
What I have found is that because my assemblies are named the same and I use probing paths, the runtime finds some assembly using the assembly name and then stops looking, even though it has found the wrong version.
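In theory, a per-version codeBase hint in the host's .config file should stop the loader from settling on the first same-named assembly it probes. A minimal sketch, assuming the workflow assembly is strong-named; the name, token and folder paths are placeholders:

    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <dependentAssembly>
            <!-- placeholder identity: substitute your workflow assembly's name/token -->
            <assemblyIdentity name="MyWorkflows" publicKeyToken="0123456789abcdef" culture="neutral" />
            <!-- pin each version to its own folder so probing can't pick the wrong file -->
            <codeBase version="1.0.1.0" href="v1.0.1/MyWorkflows.dll" />
            <codeBase version="1.0.2.0" href="v1.0.2/MyWorkflows.dll" />
          </dependentAssembly>
        </assemblyBinding>
      </runtime>
    </configuration>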
This is seriously bad. That they don't check for assembly version AND file version is going to hurt this technique. We all know the .NET 1.1 SP1 change in the SortedList class, which made data saved by a program running on SP1 not deserializable on .NET 1.1 without SP1. So serializable classes have to be able to deserialize themselves from data serialized by older versions; that's what this says, IMHO. That will be a challenge.
So, Frans, yes, you are correct: developers must be very careful when versioning and deploying workflows. When making changes to a given workflow, if you change the structure of the workflow, you need to increment the version number; otherwise, existing workflows may be broken. So you almost need to treat published workflows like a published interface, whereby once it's published you don't take things away.
Even more restrictive, I think: the interface might not even change, but the implementation did: crash. So all the deserializer code needs stuff like I use in the designer, a set of methods which simply swallow an exception if a value isn't there, like:
object value = GeneralUtils.InfoGetValue(info, "_foo", typeof(int));
if(value == null)
{
    // not found: the data was serialized by an older version
}
etc. InfoGetValue simply tries to read _foo from info, which always throws an exception if the value isn't there (a poor design; why isn't there some kind of Contains or ContainsKey method?), and in that case returns a default value, e.g. null.
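For illustration, a minimal version of such a helper could look like this (a sketch of the idea, not the actual GeneralUtils code):

    using System;
    using System.Runtime.Serialization;

    public static class GeneralUtils
    {
        // Reads 'name' from the SerializationInfo. GetValue() throws a
        // SerializationException when the name wasn't serialized (e.g. the data
        // was written by an older version), so swallow that and return null,
        // which the caller uses as the "old version" signal.
        public static object InfoGetValue(SerializationInfo info, string name, Type type)
        {
            try
            {
                return info.GetValue(name, type);
            }
            catch(SerializationException)
            {
                // value not present: older version
                return null;
            }
        }
    }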
One thing that is really cool is that you can make workflows extensible by using activities that allow you to dynamically add and invoke other workflows into the execution chain of the executing workflow.
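For example, the dynamic update mechanism (WorkflowChanges) combined with InvokeWorkflowActivity lets the host splice another workflow into a running instance. A rough sketch; the helper name and the position where the activity is added are made up:

    using System;
    using System.Workflow.Activities;
    using System.Workflow.ComponentModel;
    using System.Workflow.Runtime;

    public static class WorkflowExtender
    {
        // Appends an InvokeWorkflowActivity to a running instance, so the given
        // workflow type gets invoked as part of the instance's execution chain.
        public static void AppendInvoke(WorkflowInstance instance, Type workflowToInvoke)
        {
            // clone the running definition so it can be modified
            WorkflowChanges changes = new WorkflowChanges(instance.GetWorkflowDefinition());

            InvokeWorkflowActivity invoke = new InvokeWorkflowActivity();
            invoke.Name = "invokeExtraWorkflow";
            invoke.TargetWorkflow = workflowToInvoke;

            // appended at the end of the root here; a real host would pick the
            // right spot in the activity tree
            changes.TransientWorkflow.Activities.Add(invoke);

            // validates the delta and applies it to the running instance
            instance.ApplyWorkflowChanges(changes);
        }
    }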
I haven't tried using entities with workflows. Actually, I am not sure that I would want to serialize an entity with fetched data into a workflow. My approach when using business objects in workflows is to make the business objects stateless, and only serialize the data needed to fetch / recreate / re-hydrate the business object / entity when the workflow starts up again. So I may only serialize the key values of the entities into properties of the workflow, and then, in other methods of the workflow, re-fetch the entities using the previously serialized key values, as sketched below. One thing that I can see doing is using entities for CRUD operations that need to be invoked during the execution of an activity within a workflow.
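A skeleton of that pattern (CustomerWorkflow is made up, and CustomerEntity / DataAccessAdapter stand in for whatever your generated code is called):

    using System;
    using System.Workflow.Activities;

    [Serializable]
    public class CustomerWorkflow : SequentialWorkflowActivity
    {
        // only the key value survives workflow persistence...
        private string _customerId;

        // ...the entity itself is never serialized; it's rebuilt on demand
        [NonSerialized]
        private CustomerEntity _customer;

        public string CustomerId
        {
            get { return _customerId; }
            set { _customerId = value; }
        }

        // re-hydrates the entity from the stored key when it's needed again
        private CustomerEntity GetCustomer()
        {
            if(_customer == null)
            {
                _customer = new CustomerEntity(_customerId);
                using(DataAccessAdapter adapter = new DataAccessAdapter())
                {
                    adapter.FetchEntity(_customer);
                }
            }
            return _customer;
        }
    }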
I think that's a good solution: you then avoid the serialization problem altogether.