Best way to Read/write blob fields as C# IO.Streams

TomDog
User
Posts: 620
Joined: 25-Oct-2005
# Posted on: 09-Mar-2022 04:39:40   

A continuation, of sorts, of Does LLBLGen support accessing SQL Server blob fields as C# IO.Streams

I have successfully experimented with code to read and write blob fields with streams, but it seems a bit harder than it needs to be, so I'm wondering if there is a better way? The code is here: https://github.com/JeremyThomas/LLBLStreaming

Firstly, writing:

    /// <summary>
    /// Inserts an Entity in the database asynchronously, with the specified field streamed.
    /// </summary>
    /// <param name="dataAccessAdapter">The data access adapter.</param>
    /// <param name="entity">The entity to be inserted.</param>
    /// <param name="indexOfFieldtoBeStreamed">Index of the field in the entity to be streamed.</param>
    /// <param name="stream">The stream.</param>
    /// <param name="cancellationToken">The cancellation token.</param>
    /// <returns>The number of rows affected, or the integer primary key of the inserted row (if applicable)</returns>
    static async Task<long> InsertEntityWithOneFieldStreamedAsync(DataAccessAdapter dataAccessAdapter, EntityBase2 entity, int indexOfFieldtoBeStreamed, Stream stream,
      CancellationToken cancellationToken)
    {
      Logger.DebugMethod(dataAccessAdapter.ConnectionString, entity, indexOfFieldtoBeStreamed, stream.GetType(), cancellationToken);
      entity.Fields[indexOfFieldtoBeStreamed].CurrentValue = stream;
      var actionQuery = dataAccessAdapter.CreateInsertDQ(entity);
      var numRowsAffected = await dataAccessAdapter.ExecuteActionQueryAsync(actionQuery, cancellationToken);
      var id = actionQuery.ParameterFieldRelations.FirstOrDefault()?.Field.CurrentValue;
      return id switch
      {
        int intID => intID,
        long longID => longID,
        _ => numRowsAffected
      };
    }

    // __LLBLGENPRO_USER_CODE_REGION_START CustomDataAccessAdapterCode

    /// <summary>
    /// Creates a new Insert Query object which is ready to use.
    /// </summary>
    /// <returns>IActionQuery Instance which is ready to be used.</returns>
    /// <remarks>Generic version.</remarks>
    /// <exception cref="T:System.ArgumentNullException">When fields is null or fieldsPersistenceInfo is null</exception>
    /// <exception cref="T:System.ArgumentException">When fields contains no EntityFieldCore instances or fieldsPersistenceInfo is empty.</exception>
    /// <exception cref="T:SD.LLBLGen.Pro.ORMSupportClasses.ORMQueryConstructionException">When there are no fields to insert in the fields list. This exception is to prevent
    /// INSERT INTO table () VALUES () style queries.</exception>
    public IActionQuery CreateInsertDQ(IEntity2 entity)
    {
      return CreateDynamicQueryEngine().CreateInsertDQ(entity.Fields.GetAsEntityFieldCoreArray(), GetAllFieldPersistenceInfos(entity), GetActiveConnection());
    }

    /// <summary>
    ///   Gets all field persistence infos.
    /// </summary>
    /// <param name="entity">The entity.</param>
    /// <returns></returns>
    IFieldPersistenceInfo[] GetAllFieldPersistenceInfos(IEntity2 entity)
    {
      return PersistenceInfoProviderSingleton.GetInstance().GetAllFieldPersistenceInfos(entity);
    }

    // __LLBLGENPRO_USER_CODE_REGION_END

Issues:

1. I had to add code to the DataAccessAdapter to allow this.
2. I'm not sure how to make an update version of it (see the update sketch further down the thread).
3. I'm uncomfortable working at such a low level.

As for reading:

    public static async Task<long> StreamLargePhotoToFileAsync(IDataAccessAdapter portalAdapterToUse, long productPhotoID, string filePath, CancellationToken cancellationToken,
      IProgress<long> progress)
    {
      using var dataReader = await GetProductLargePhotoReader(portalAdapterToUse, productPhotoID, cancellationToken);
      if (dataReader is DbDataReader reader && await reader.ReadAsync(cancellationToken).ConfigureAwait(GeneralHelper.ContinueOnCapturedContext))
      {
        var stream = GetStream(reader, 1);
        if (stream.CanRead)
        {
          using var file = new FileStream(filePath, FileMode.Create, FileAccess.Write);
          using var streamProgressWrapper = new StreamProgressWrapper(stream, progress);
          await streamProgressWrapper.CopyToAsync(file).ConfigureAwait(GeneralHelper.ContinueOnCapturedContext);
          return file.Length;
        }
      }

      return 0;
    }

    public static Task<IDataReader> GetProductLargePhotoReader(IDataAccessAdapter portalAdapterToUse, long productPhotoID, CancellationToken cancellationToken)
    {
      Logger.DebugMethod(portalAdapterToUse.ConnectionString, productPhotoID);
      var qf = new QueryFactory();
      var q = qf.Create()
        .Select(ProductPhotoFields.ProductPhotoID, ProductPhotoFields.LargePhoto, ProductPhotoFields.LargePhotoFileName)
        .Where(ProductPhotoFields.ProductPhotoID == productPhotoID);
      return portalAdapterToUse.FetchAsDataReaderAsync(q, CommandBehavior.SequentialAccess, cancellationToken);
    }

    /// <summary>
    ///   Gets the stream. Should be System.Data.SqlClient.SqlSequentialStream
    /// </summary>
    /// <param name="reader">The reader.</param>
    /// <param name="ordinal"></param>
    /// <returns>The stream</returns>
    public static Stream GetStream(DbDataReader reader, int ordinal)
    {
      var readerType = reader.GetType();
      Logger.DebugMethod(readerType);
      if (reader.IsDBNull(ordinal))
        return null;
      var stream = reader.GetStream(ordinal);
      Logger.DebugFormat("{0} returned {1} which has CanRead: {2}", readerType, stream.GetType(), stream.CanRead);
      return stream;
    }

It seems clunky, and I had to use QuerySpec as I can't see how to do it with LINQ?

Jeremy Thomas
Otis
LLBLGen Pro Team
Posts: 39760
Joined: 17-Aug-2003
# Posted on: 09-Mar-2022 11:15:52   

I don't really see how the code you added to the dataaccessadapter makes a stream insert work properly? Linq is a black box; we don't expose a datareader system for Linq, so there's no way to utilize it at that level.

As for fetching: as I described in the other thread you refer to, fetching a blob as a stream isn't doable in our system out of the box: it fetches all fields in one go and then closes the reader (for entity materialization). So the only other route is to obtain a datareader, specifying you want sequential access.

I'm not sure about the dataaccessadapter code though, is that to circumvent the lack of file type support in the driver?

Frans Bouma | Lead developer LLBLGen Pro
TomDog
User
Posts: 620
Joined: 25-Oct-2005
# Posted on: 09-Mar-2022 13:21:42   

Otis wrote:

I don't really see how your code that you added to the dataaccessadapter makes a stream insert properly?

I'm not sure about the dataaccessadapter code tho, is that to circumvent the lack of file type support in the driver?

Doh! Yeah, you're right! No need for that malarkey at all. (In my defence, I ended up there by starting with the plain SQL API and then realised I didn't need to.) Just setting the CurrentValue to a stream is all that is needed:

    static async Task<long> StreamProductPhotoToDataBase(IDataAccessAdapter dataAccessAdapter, CancellationToken cancellationToken,
      IProgress<long> progress, Stream stream,
      string fileName, bool incrementalProgress = false)
    {
      Logger.DebugMethod(dataAccessAdapter.ConnectionString, cancellationToken, null, stream.GetType(), fileName, incrementalProgress);
      var productPhotoEntity = new ProductPhotoEntity
      {
        LargePhotoFileName = fileName, ModifiedDate = DateTime.Now
      };

      using var streamProgressWrapper = new StreamProgressWrapper(stream, progress, incrementalProgress);
      productPhotoEntity.Fields[(int)ProductPhotoFieldIndex.LargePhoto].CurrentValue = streamProgressWrapper;
      await dataAccessAdapter.SaveEntityAsync(productPhotoEntity, cancellationToken);
      return productPhotoEntity.ProductPhotoID;
    }
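
Presumably an update is the same trick with a non-new entity. A minimal, untested sketch; the explicit dirty-flagging is my assumption, in case setting CurrentValue directly doesn't already mark the field as changed:

    static async Task<bool> UpdateProductPhotoStreamedAsync(IDataAccessAdapter dataAccessAdapter,
      ProductPhotoEntity productPhotoEntity, Stream stream, CancellationToken cancellationToken)
    {
      // productPhotoEntity is assumed to be fetched (IsNew == false),
      // so SaveEntityAsync generates an UPDATE rather than an INSERT
      productPhotoEntity.Fields[(int)ProductPhotoFieldIndex.LargePhoto].CurrentValue = stream;
      productPhotoEntity.Fields[(int)ProductPhotoFieldIndex.LargePhoto].IsChanged = true; // ensure the field ends up in the SET clause
      productPhotoEntity.IsDirty = true;
      return await dataAccessAdapter.SaveEntityAsync(productPhotoEntity, cancellationToken);
    }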

Otis wrote:

Linq is a black box, we don't expose a datareader system for linq, so there's no way to utilize it at that level. As for fetching: as I described in the other thread you refer to: fetching a blob as a stream isn't doable in our system out of the box: it fetches all fields in one go and then closes the reader. (for entity materialization). So the only other route is to obtain a datareader specifying you want sequential access.

For reading, what I think I want is to fetch the entity with the field excluded, then have some sort of function similar to DataAccessAdapter.FetchExcludedFields, say FetchExcludedFieldsAsStreams, which sets the CurrentValue of all the excluded fields to streams.
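
Sketching the shape I have in mind; the name and signature are hypothetical, not an existing LLBLGen API:

    /// <summary>
    /// Hypothetical API sketch: like FetchExcludedFields, but instead of materialising the
    /// excluded blob fields into byte arrays it hands each one to the caller as a
    /// sequential-access Stream, one row at a time.
    /// </summary>
    Task FetchExcludedFieldsAsStreamsAsync(IEntity2 entity, ExcludeIncludeFieldsList excludedFields,
      Func<IEntityField2, Stream, Task> processFieldStreamAsync, CancellationToken cancellationToken);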

Jeremy Thomas
TomDog
User
Posts: 620
Joined: 25-Oct-2005
# Posted on: 10-Mar-2022 10:55:55   

I've hacked together something that works; see FetchExcludedFieldsAsStreamsAsync in DataAccessAdapter.cs

Jeremy Thomas
Otis
LLBLGen Pro Team
Posts: 39760
Joined: 17-Aug-2003
# Posted on: 10-Mar-2022 10:58:38   

FetchExcludedFields fetches all values of the excluded fields in one set and then merges them in memory. For streams that's not possible, as you have to fetch a stream per row, then move on to the next row. So it's a different approach, and also why we didn't implement this in our system (yet): the materializer stalls after every row. You have to read the row, fetch the streams, then move on to the next row. For fetching the streams afterwards, it's the same thing, right? Say you have to fetch the stream fields for 4 rows: you fetch the first, wait for completion, then move to the next, wait for completion, till all 4 are fetched. So it's more iteration than set merging.
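
In consumption terms the pattern is roughly this (an illustrative plain ADO.NET sketch, not LLBLGen internals):

    // With CommandBehavior.SequentialAccess, columns must be read in ordinal order and each
    // row's stream must be drained before the reader can advance to the next row.
    while (await reader.ReadAsync(cancellationToken))
    {
      var id = reader.GetInt64(0);              // leading columns first
      using var stream = reader.GetStream(1);   // then the blob column, as a stream
      await stream.CopyToAsync(destination);    // drain it completely before the next ReadAsync
    }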

A method like you propose would therefore hide the iteration part and wrap a datareader.

(edit) I see you wrote something similar to that already. I don't see any way to make that more efficient, to be honest...

Frans Bouma | Lead developer LLBLGen Pro
TomDog
User
Posts: 620
Joined: 25-Oct-2005
# Posted on: 10-Mar-2022 12:37:12   

Yeah, I've been struggling to think what a universal read-stream API would look like.

It's not in the sample solution, but DownloadAttachmentAsync at https://github.com/JeremyThomas/LLBLStreaming/blob/598161a7942c28eb640e845de9a550d334547feb/LLBLStreaming.Sample.Web/Controllers/HomeController.cs#L184 is what our real-world scenario looks like: lifecycle management of the DbDataReader is front and centre, so that probably needs to be exposed. Whatever you did would probably be single entity only; as you say, you gain nothing by operating on multiple entities.

Perhaps do something similar to FetchExcludedFieldsAsStreamsAsync, but for a single entity only, and have it return the DbDataReader so its closure can be controlled. Maybe break it down into two parts for looping through multiple entities.

If nothing else, a method that takes an entity with its PKs set and fetches a DataReader with CommandBehavior.SequentialAccess would be a great help. Or, taking a step back, just something that turns an entity with its PKs set into a QueryParameters (I guess CreatePrimaryKeyFilter does part of that).
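
For instance, something along these lines, generalising GetProductLargePhotoReader above (an untested sketch; the single-field-PK handling and the predicate construction are my assumptions):

    static Task<IDataReader> FetchFieldsByPKAsSequentialReaderAsync(IDataAccessAdapter adapter,
      IEntity2 entity, EntityField2[] fieldsToFetch, CancellationToken cancellationToken)
    {
      // assumes a single-field primary key whose CurrentValue has been set on the entity
      var pkField = entity.PrimaryKeyFields[0];
      var q = new QueryFactory().Create()
        .Select(fieldsToFetch.Cast<object>().ToArray())
        .Where(new FieldCompareValuePredicate(pkField, null, ComparisonOperator.Equal, pkField.CurrentValue));
      return adapter.FetchAsDataReaderAsync(q, CommandBehavior.SequentialAccess, cancellationToken);
    }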

Jeremy Thomas
Otis
LLBLGen Pro Team
Posts: 39760
Joined: 17-Aug-2003
# Posted on: 11-Mar-2022 10:15:49   

At the moment I don't have a simpler suggestion on offer... we could add some helpers for this in the future, but it won't be today, so it won't help you now, I'm afraid.

Frans Bouma | Lead developer LLBLGen Pro
TomDog
User
Posts: 620
Joined: 25-Oct-2005
# Posted on: 11-Mar-2022 15:22:37   

Otis wrote:

At the moment I don't have a simpler suggestion on offer... we could add some helpers for this in the future, but it won't be today, so it won't help you now, I'm afraid.

Something in the future would be awesome; could that enable support for FILESTREAM as well?

Anyway, I've added a second version of FetchExcludedFieldsAsStreamsAsync, this time based on FetchEntity (and therefore simpler) and returning the data reader.

Its usage can be seen in StreamLargePhotoToFileWithExcludedFieldsAsync2.
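
Roughly, the usage looks like this (an abridged sketch: the FetchExcludedFieldsAsStreamsAsync signature and the surrounding variables are approximations of the repo code):

    // abridged sketch of StreamLargePhotoToFileWithExcludedFieldsAsync2; the real code also
    // excludes LargePhoto during the entity fetch itself via an ExcludeIncludeFieldsList overload
    var photo = new ProductPhotoEntity(productPhotoID);
    var excludedFields = new ExcludeIncludeFieldsList();
    excludedFields.Add(ProductPhotoFields.LargePhoto);
    await adapter.FetchEntityAsync(photo, cancellationToken);
    using var reader = await adapter.FetchExcludedFieldsAsStreamsAsync(photo, excludedFields, cancellationToken);
    if (reader is DbDataReader dbReader && await dbReader.ReadAsync(cancellationToken))
    {
      using var stream = dbReader.GetStream(0); // ordinal of the excluded blob field in the reader
      using var file = new FileStream(filePath, FileMode.Create, FileAccess.Write);
      await stream.CopyToAsync(file);
    }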

Jeremy Thomas
Otis
LLBLGen Pro Team
Posts: 39760
Joined: 17-Aug-2003
# Posted on: 12-Mar-2022 09:51:30   

Yes, any stream type: the ones we currently skip due to the problems you have encountered.

Frans Bouma | Lead developer LLBLGen Pro
AlexP
User
Posts: 17
Joined: 24-Jan-2020
# Posted on: 10-Jul-2024 11:40:48   

Otis wrote:

Yes, any stream type: the ones we currently skip due to the problems you have encountered.

Is there any development when it comes to streaming blob fields to/from a database?

We're looking for this as well. We're trying to stay away from using provider-specific classes like OracleBlob (Oracle, in our case). The thing is, we need to write streams of several megabytes into blob fields (and read them back, of course). These are now consuming a lot of memory on the Large Object Heap.

Otis
LLBLGen Pro Team
Posts: 39760
Joined: 17-Aug-2003
# Posted on: 11-Jul-2024 08:22:53   

For an entity system to work, streaming data in per entity is problematic, hence we didn't (and aren't planning to) support streaming for large objects.

Frans Bouma | Lead developer LLBLGen Pro