We are creating a C# application capable of uploading (potentially very) large files to an Azure Blob store. At the moment the largest file we have been able to upload is about 350 MB, but it's possible that our clients will have video files larger than that. I have included the definition of the 'Uploader' control and a snippet of the server-side code.
Have we missed a setting, or is something defined incorrectly? Is this even possible?
Client-side JavaScript
blobUploadObject = new ej.inputs.Uploader({
    autoUpload: false,
    asyncSettings: {
        saveUrl: hostUrl + 'api/AzureBlobStorage/Save',
        removeUrl: hostUrl + 'api/AzureBlobStorage/Remove'
    },
    chunkSize: 327680,          // 320 KB per chunk
    maxFileSize: 107374182400   // 100 GB client-side cap
});
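For reference, the configured `chunkSize` of 327,680 bytes is 320 KB, so a 350 MB file is split into 1,120 separate POSTs, each of which is individually small. A quick sanity check of those numbers (plain arithmetic, independent of the control itself):

```csharp
using System;

class ChunkMath
{
    static void Main()
    {
        const long chunkSize = 327_680;            // 320 KB per chunk, as configured
        const long maxFileSize = 107_374_182_400;  // 100 GB client-side cap, as configured
        long fileSize = 350L * 1024 * 1024;        // a 350 MB upload

        long chunks = (fileSize + chunkSize - 1) / chunkSize; // ceiling division
        Console.WriteLine($"{chunks} chunks of {chunkSize / 1024} KB each");
        Console.WriteLine($"Max file size: {maxFileSize / (1024L * 1024 * 1024)} GB");
    }
}
```

Each individual request is therefore well under any of the limits shown below, which is worth keeping in mind when diagnosing where a >350 MB upload actually fails.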
Server-side C#
[HttpPost]
[Route("Save")]
[RequestFormLimits(ValueLengthLimit = 2147483647, MultipartBodyLengthLimit = 2147483647)]
public async Task<KeyValuePair<string, object>> SaveAsync(IList<IFormFile> chunkFile, IList<IFormFile> UploadFiles)
{
    try
    {
        // For chunk upload
        foreach (IFormFile file in Request.Form.Files)
        {
            try
            {
                Guid guid = Guid.NewGuid();
                CloudBlockBlob blob = container.GetBlockBlobReference($"{guid}{file.FileName}");
                blob.Metadata.Add("Guid", guid.ToString());
                await blob.UploadFromStreamAsync(file.OpenReadStream());
                BlockBlobClient blockClient = new BlockBlobClient(
                    await _keyVaultManager.GetSecret("BlobConnectionString"),
                    Configuration.GetValue<string>("BlobStorageData:ContainerName"),
                    $"{guid}{file.FileName}");
                Response.Headers.Add("GUID", guid.ToString());
                return new KeyValuePair<string, object>("GUID", guid);
            }
            catch (Exception e)
            {
                _logger.LogError(e, MethodBase.GetCurrentMethod(), "Error uploading file '{File}' to Azure blob store.", file.FileName);
            }
        }
        return new KeyValuePair<string, object>("GUID", null);
    }
    catch (Exception e)
    {
        Response.Clear();
        Response.StatusCode = 204; // note: 204 is a success status; a 4xx/5xx may be more appropriate for failures
        Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = $"File failed to upload: {e.Message}";
        return new KeyValuePair<string, object>("GUID", null);
    }
}
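One thing worth noting about the handler above: it calls `UploadFromStreamAsync` on every form file, so with chunking enabled each 320 KB chunk is written as a complete blob, overwriting the previous one, rather than being assembled into the full file. A chunk-aware handler would instead stage each chunk as a block and commit the block list once the last chunk arrives. The following is only a sketch against the Azure.Storage.Blobs v12 SDK: the `chunk-index` and `total-chunk` form field names are assumptions about what the Uploader posts (verify the exact names in the browser's network tab), and `connectionString`, `containerName`, and `blobName` are placeholders.

```csharp
// Sketch: stage each incoming chunk as a block, commit on the final chunk.
// Assumes Azure.Storage.Blobs v12 and hypothetical "chunk-index"/"total-chunk"
// form fields -- check what your Uploader version actually sends.
using System;
using System.IO;
using System.Linq;
using Azure.Storage.Blobs.Specialized;

IFormFile chunk = Request.Form.Files["chunkFile"];
int index = int.Parse(Request.Form["chunk-index"]);
int total = int.Parse(Request.Form["total-chunk"]);

var blob = new BlockBlobClient(connectionString, containerName, blobName);

// Block IDs must be Base64 strings of equal length within one blob.
string blockId = Convert.ToBase64String(BitConverter.GetBytes(index));

await using (Stream s = chunk.OpenReadStream())
    await blob.StageBlockAsync(blockId, s);

if (index == total - 1)
{
    // Commit the staged blocks in order to materialize the full blob.
    var ids = Enumerable.Range(0, total)
        .Select(i => Convert.ToBase64String(BitConverter.GetBytes(i)));
    await blob.CommitBlockListAsync(ids);
}
```

With this shape, no single request ever carries more than one chunk, so per-request size limits stop being a factor for the total file size.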
We already have the following in the web.config on the Azure server:
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="2147483647" />
</requestFiltering>
</security>
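Note that `maxAllowedContentLength` in `requestFiltering` only governs the IIS layer, and it applies per request (each chunked POST here is only ~320 KB anyway). If the app runs on ASP.NET Core with Kestrel, Kestrel enforces its own per-request body limit as well. A hedged sketch of raising it, using the standard ASP.NET Core options (the 2 GB value is illustrative, not a recommendation):

```csharp
// Program.cs: raise Kestrel's per-request body limit in addition to the
// web.config requestFiltering setting, which only covers IIS.
// The 2 GB value below is an example, not a recommendation.
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.ConfigureKestrel(options =>
            {
                options.Limits.MaxRequestBodySize = 2_147_483_647; // ~2 GB per request
            });
            webBuilder.UseStartup<Startup>();
        });
```

A `[RequestSizeLimit]` attribute on the action is the per-endpoint equivalent of the Kestrel-wide setting.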
When uploading large files (>350 MB) the Uploader control appears to complete the upload (i.e. the progress bar updates until it reaches 100%), but the request never reaches the API endpoint. The following is the result: