dioptre wrote:
Otis wrote:
May I first ask why you're using WCF to back up a lot of data? XML serialization isn't the fastest. As it's a backup and both sides are .NET and know the LLBLGen Pro entities, I'd use remoting with fast serialization. You can then add a zip component to the chain so the data is zipped on the fly as well, for even smaller data blocks.
Well, as far as I know, WCF services work with Binary/MTOM/Text encodings and can be compressed...
Sure, but the data to compress or send as binary is ALWAYS first serialized to XML: entities are serialized to an XML infoset first, and that infoset is then binary encoded.
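To see the two steps separately, here's a minimal sketch (OrderDto and EntityCollectionStub-free, EncodingDemo etc. are made-up names for illustration) that pushes the same DataContractSerializer output through the plain text writer and through the binary XML writer that WCF's binary encoding is built on. Both streams contain the same infoset; only the encoding differs.

    using System;
    using System.IO;
    using System.Runtime.Serialization;
    using System.Xml;

    [DataContract]
    public class OrderDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Customer { get; set; }
    }

    public static class EncodingDemo
    {
        public static void Main()
        {
            DataContractSerializer serializer = new DataContractSerializer(typeof(OrderDto));
            OrderDto dto = new OrderDto { Id = 1, Customer = "Foo Inc." };

            // Step 1+2a: serialize the infoset, written as plain angle-bracket XML.
            using (MemoryStream text = new MemoryStream())
            {
                serializer.WriteObject(text, dto);
                Console.WriteLine("text-encoded:   {0} bytes", text.Length);
            }

            // Step 1+2b: the SAME infoset, written by the binary encoder instead.
            using (MemoryStream binary = new MemoryStream())
            using (XmlDictionaryWriter writer = XmlDictionaryWriter.CreateBinaryWriter(binary))
            {
                serializer.WriteObject(writer, dto);
                writer.Flush();
                Console.WriteLine("binary-encoded: {0} bytes", binary.Length);
            }
        }
    }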
There is a chunking channel and a compression example in the Windows SDK that look like they would do as good a job as remoting...
Though our remoting serialization is a tad faster and way more compact than the standard one.
The net.tcp binding looks like it supports binary serialization out of the box; I was thinking this was the way forward (long term)?
No, it supports binary encoding. The serializer still produces the XML infoset; net.tcp only encodes that infoset in a binary format on the wire, it doesn't swap in a binary serializer.
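You can see the difference in how the binding is composed. A minimal sketch (BindingDemo is a made-up name; the WCF types are the real ones) that builds the core of net.tcp by hand and lists the stock binding's elements:

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Channels;

    public static class BindingDemo
    {
        public static void Main()
        {
            // The core of net.tcp, composed by hand: a binary message ENCODER
            // (it encodes the XML infoset) on top of a TCP transport. Nothing
            // in this stack replaces the serializer itself.
            Binding handRolled = new CustomBinding(
                new BinaryMessageEncodingBindingElement(),
                new TcpTransportBindingElement());
            Console.WriteLine(handRolled.Scheme); // "net.tcp"

            // Inspect the stock NetTcpBinding: among its elements you'll find
            // BinaryMessageEncodingBindingElement, i.e. binary ENCODING.
            foreach (BindingElement element in new NetTcpBinding().CreateBindingElements())
            {
                Console.WriteLine(element.GetType().Name);
            }
        }
    }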
I thought the best way would be to serialize the LLBLGen entity collection (in the fastest way possible), then zip it, and then pass it as an object to WCF. In WCF, I can then choose to use binary/MTOM with chunks, given the links above?!
How would you recommend doing it specifically/differently?
Use fast serialization (plain remoting with our fast serialization enabled) and add a zip component to the serializer helper (see manual) to compress it even further.
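Roughly, that pipeline looks like the sketch below, assuming the adapter runtime (IEntityCollection2). GZipStream here is just a stand-in for the zip component, and the Optimization flag is shown as the manual describes enabling fast serialization; check the manual for the exact hook and flag names for your runtime version.

    using System.IO;
    using System.IO.Compression;
    using System.Runtime.Serialization.Formatters.Binary;
    using SD.LLBLGen.Pro.ORMSupportClasses;   // LLBLGen Pro runtime library

    public static class BackupSerializer
    {
        public static byte[] SerializeCompressed(IEntityCollection2 entities)
        {
            // Switch the entities' ISerializable code to the compact
            // fast-serialization format (flag name: see manual for your version).
            SerializationHelper.Optimization = SerializationOptimization.Fast;

            using (MemoryStream buffer = new MemoryStream())
            {
                // GZipStream stands in for "the zip component"; any stream-based
                // compressor can be chained in the same way.
                using (GZipStream zip = new GZipStream(buffer, CompressionMode.Compress, true /* leaveOpen */))
                {
                    new BinaryFormatter().Serialize(zip, entities);
                }
                return buffer.ToArray();
            }
        }

        public static IEntityCollection2 DeserializeCompressed(byte[] payload)
        {
            using (MemoryStream buffer = new MemoryStream(payload))
            using (GZipStream zip = new GZipStream(buffer, CompressionMode.Decompress))
            {
                return (IEntityCollection2)new BinaryFormatter().Deserialize(zip);
            }
        }
    }

The resulting byte[] is opaque, so you can move it over whatever channel you want, or write it straight to the backup store.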
You're looking at a lot of data, a lot of entities, and with WCF these are all going to be serialized to XML and then the XML is post-processed into binary data, etc. But is that really what you NEED for this particular problem? I don't think it's the best solution.
Just try it: serialize a lot of entities through WCF with binary encoding, then serialize the same set through remoting with fast serialization enabled (see manual). Ten to one the fast serialization is faster and more compact.
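A rough harness for that test, as a sketch: the two delegates are placeholders for your actual code paths, e.g. DataContractSerializer with a binary XmlDictionaryWriter on one side and BinaryFormatter with fast serialization switched on on the other.

    using System;
    using System.Diagnostics;

    public static class SerializationBenchmark
    {
        // Serializes the same collection both ways and compares time and size.
        public static void Compare(Func<byte[]> wcfBinaryEncoded, Func<byte[]> remotingFast)
        {
            Stopwatch watch = Stopwatch.StartNew();
            byte[] wcfPayload = wcfBinaryEncoded();
            watch.Stop();
            Console.WriteLine("WCF binary encoding: {0,10:N0} bytes in {1,6} ms",
                wcfPayload.Length, watch.ElapsedMilliseconds);

            watch = Stopwatch.StartNew();
            byte[] fastPayload = remotingFast();
            watch.Stop();
            Console.WriteLine("fast serialization : {0,10:N0} bytes in {1,6} ms",
                fastPayload.Length, watch.ElapsedMilliseconds);
        }
    }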