Unable to upload a file bigger than 350 MB.

We are creating a C# application capable of uploading (potentially very) large files to an Azure Blob store. At the moment the largest file we have been able to upload is about 350 MB. It's possible that our clients might have video files that are larger than that. I have included the definition of the 'Uploader' control and a snippet of the server-side code.

Have we missed a setting, or is something defined incorrectly? Is this possible?

Client-side JavaScript

blobUploadObject = new ej.inputs.Uploader({
        autoUpload: false,
        asyncSettings: {
            saveUrl: hostUrl + 'api/AzureBlobStorage/Save',
            removeUrl: hostUrl + 'api/AzureBlobStorage/Remove',
        },
        chunkSize: 327680,
        maxFileSize: 107374182400
    });


Server-side code

[HttpPost]
[Route("Save")]
[RequestFormLimits(ValueLengthLimit = 2147483647, MultipartBodyLengthLimit = 2147483647)]
public async Task<KeyValuePair<string, object>> SaveAsync(IList<IFormFile> chunkFile, IList<IFormFile> UploadFiles)
{
    try
    {
        // For chunk upload: iterate over the posted files.
        // (Request.Form.Files is an IFormFileCollection; the previous cast
        // to List<IFormFile> was unnecessary and could throw.)
        foreach (var file in Request.Form.Files)
        {
            try
            {
                Guid guid = Guid.NewGuid();

                CloudBlockBlob blob = container.GetBlockBlobReference($"{guid.ToString()}{file.FileName}");
                blob.Metadata.Add("Guid", guid.ToString());

                await blob.UploadFromStreamAsync(file.OpenReadStream());
                BlockBlobClient blockClient = new BlockBlobClient(await _keyVaultManager.GetSecret("BlobConnectionString"),
                   Configuration.GetValue<string>("BlobStorageData:ContainerName"),
                          $"{guid.ToString()}{file.FileName}");

                Response.Headers.Add("GUID", guid.ToString());

                return new KeyValuePair<string, object>("GUID", guid);
            }
            catch (Exception e)
            {
                _logger.LogError(e, "Error uploading file '{File}' to Azure blob store.", file.FileName);
            }
        }

        return new KeyValuePair<string, object>("GUID", null);
    }
    catch (Exception e)
    {
        Response.Clear();
        // A failed upload should not return 204 (No Content); use a 5xx status
        // so the client treats it as an error. Only set the reason phrase once.
        Response.StatusCode = StatusCodes.Status500InternalServerError;
        Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = "File failed to upload: " + e.Message;

        return new KeyValuePair<string, object>("GUID", null);
    }
}
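One thing worth noting about the handler above: it uploads every received chunk with UploadFromStreamAsync, so each chunk is written as the entire blob rather than appended to it. For chunked uploads, blocks are usually staged individually and committed once the last chunk arrives. The following is a sketch using the newer Azure.Storage.Blobs SDK; it assumes the chunk index and total chunk count can be read from the request form (the parameter names here are illustrative, not the exact field names the Uploader sends):

```csharp
using System;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Azure.Storage.Blobs.Specialized;

public static class ChunkedBlobUploader
{
    // Stages one uploader chunk as a block; commits the block list on the last chunk.
    public static async Task SaveChunkAsync(BlockBlobClient blockClient,
                                            IFormFile chunkFile,
                                            int chunkIndex,
                                            int totalChunks)
    {
        // Block IDs must be Base64-encoded and of equal length within a blob.
        string blockId = Convert.ToBase64String(
            Encoding.UTF8.GetBytes(chunkIndex.ToString("d6")));

        using (var stream = chunkFile.OpenReadStream())
        {
            await blockClient.StageBlockAsync(blockId, stream);
        }

        if (chunkIndex == totalChunks - 1)
        {
            // Commit all staged blocks, in order, to assemble the final blob.
            var blockIds = Enumerable.Range(0, totalChunks)
                .Select(i => Convert.ToBase64String(
                    Encoding.UTF8.GetBytes(i.ToString("d6"))));
            await blockClient.CommitBlockListAsync(blockIds);
        }
    }
}
```

Staging blocks this way also avoids holding the whole file in a single request, so the per-request size limits only need to cover one chunk (320 KB here), not the full file.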

3 Replies

Ponmani Murugaiyan Syncfusion Team March 3, 2022 12:01 PM UTC

Hi James, 

This issue is caused by your service-side (backend) configuration. You can resolve it using the maxAllowedContentLength attribute in your web.config file. This optional uint attribute specifies the maximum length of content in a request, in bytes. The default value is 30000000, which is approximately 28.6 MB.
 
[web.config]   

<security>   
      <requestFiltering>   
        <!-- This will handle requests up to 50MB -->   
        <requestLimits maxAllowedContentLength="52428800" />   
      </requestFiltering>   
</security>   
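If the application runs on ASP.NET Core, the equivalent limits can also be raised in code; web.config only governs the IIS request-filtering layer, while Kestrel and the multipart form reader enforce their own caps. A sketch, assuming a standard ConfigureServices method (the specific limit values are illustrative):

```csharp
using Microsoft.AspNetCore.Http.Features;
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    // Raise the multipart form body limit (default is ~128 MB).
    services.Configure<FormOptions>(options =>
    {
        options.MultipartBodyLengthLimit = long.MaxValue;
    });

    // Raise Kestrel's overall request body limit (default is ~28.6 MB);
    // null means unlimited.
    services.Configure<KestrelServerOptions>(options =>
    {
        options.Limits.MaxRequestBodySize = null;
    });
}
```

With chunked uploading enabled on the client, each request only carries one chunk, so these limits rarely need to be anywhere near the full file size.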


Regards, 
Ponmani M      



James Cullis March 3, 2022 07:07 PM UTC

We already have the following in the web.config on the Azure server:

<security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="2147483647" />
      </requestFiltering>
</security>

When uploading large files (>350 MB), the Uploader control appears to perform the upload (i.e. the progress bar updates until it reaches 100%), but the request never reaches the API endpoint. The following is the result:

[Screenshot of the upload result was attached here.]

Ponmani Murugaiyan Syncfusion Team March 7, 2022 02:16 AM UTC

Hi James, 

We checked your reported query, but we were unable to replicate the issue on our end. We have prepared a sample and attached it below for your reference.


Regards, 
Ponmani M 

