How to upload a large file through an Azure function?

I have found another way of doing things. Here is the solution that works for me.

Infrastructure

When a client needs to upload a file, it calls the Azure Function to be authenticated (using the Identity provided by the framework) and authorized.

The Azure Function then generates a Shared Access Signature (SAS) for a specific blob. The SAS grants the client access to the Blob storage with write-only privileges for a limited time (watch out for clock skew on Azure).
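Here is a minimal sketch of the Function-side SAS generation, assuming the newer Azure.Storage.Blobs (v12) SDK; the account credentials, the `uploads` container, and the `GetUploadSas` helper name are placeholders of my own:

```csharp
using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class SasIssuer
{
    // Sketch: issue a write-only SAS for a single blob, valid for 10 minutes.
    public static Uri GetUploadSas(string accountName, string accountKey, string blobName)
    {
        var credential = new StorageSharedKeyCredential(accountName, accountKey);
        var blobUri = new Uri($"https://{accountName}.blob.core.windows.net/uploads/{blobName}");
        var blobClient = new BlobClient(blobUri, credential);

        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = "uploads",
            BlobName = blobName,
            Resource = "b", // "b" = blob-level SAS
            // Start slightly in the past to tolerate clock skew between machines.
            StartsOn = DateTimeOffset.UtcNow.AddMinutes(-5),
            ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(10)
        };
        // Write-only: the client can create/write the blob but not read or list.
        sasBuilder.SetPermissions(BlobSasPermissions.Create | BlobSasPermissions.Write);

        return blobClient.GenerateSasUri(sasBuilder);
    }
}
```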

The client then uses the returned SAS to upload the file directly to the Blob storage. This avoids the long-term communication with the client that Afzaal Ahmad Zeeshan mentions, and it reduces the overall cost even further, since the Azure Function no longer depends on the client's connection speed.
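And the matching client-side sketch, again assuming the v12 SDK; the helper name and file handling are illustrative:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class Uploader
{
    // Sketch: the client uploads straight to Blob storage using the SAS URI
    // returned by the Function; no file bytes pass through the Function itself.
    public static async Task UploadAsync(Uri sasUri, string filePath)
    {
        var blobClient = new BlobClient(sasUri); // the SAS is embedded in the URI
        using FileStream file = File.OpenRead(filePath);
        await blobClient.UploadAsync(file, overwrite: true);
    }
}
```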


You are following a bad practice here, Kzrystof. Azure Functions are not meant for long-term communication with client devices, and I am not sure why anyone would guide you toward writing a program that forces an Azure Function to do what it is not intended to do.

Large, long-running functions can cause unexpected timeout issues.

Now imagine: you might have a good Internet connection, but your users may not. There are several other problems you must take note of before anything else. The line above is an excerpt from the official documentation, https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices.

If I had to design this application, I would use App Service → Azure Storage → Azure Functions. This would be the workflow of my application's architecture.

In this design, the applications take turns processing the information: the App Service takes care of the image upload, and that is where I can specify whether the user may upload or not. ASP.NET Core, or any other language or framework, can be used to develop that side of the web application, and as you know, it can easily be raised to support file uploads of up to 20 MB, as sketched below.
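For illustration, a minimal ASP.NET Core sketch of such an upload endpoint; the route, controller name, and the exact 20 MB figure are assumptions of mine:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class UploadController : ControllerBase
{
    [HttpPost("upload")]
    [RequestSizeLimit(20 * 1024 * 1024)] // raise the request body cap to ~20 MB
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file == null || file.Length == 0)
            return BadRequest("No file received.");

        // Authorization checks go here; then hand the content to Blob storage.
        await file.CopyToAsync(Stream.Null); // placeholder for the real Blob upload
        return Ok();
    }
}
```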

Why did I ask you to invert the design? You had Function → Blob, and I am suggesting Blob → Function, because:

Functions should be stateless and idempotent if possible. Associate any required state information with your data. For example, an order being processed would likely have an associated state member. A function could process an order based on that state while the function itself remains stateless.

The functions themselves are supposed to be stateless, which means they must not hold any information on their own; solving that would require another middleware (or frontware) to communicate with the identity servers. That is why I am suggesting the App Service here, as it can hold the information necessary to authenticate the users, then Blob, and finally Function, if needed.
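To make the quoted guidance concrete, here is a hypothetical queue-triggered function where all of the state travels with the message, so the function itself stays stateless; the `Order` shape and the queue name are made up for the example:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class Order
{
    public string Id { get; set; }
    public string Status { get; set; } // e.g. "Received", "Paid", "Shipped"
}

public static class ProcessOrder
{
    [FunctionName("ProcessOrder")]
    public static void Run(
        [QueueTrigger("orders")] Order order, // the state arrives with the data
        ILogger log)
    {
        // Decide what to do purely from the message content; the function
        // itself remembers nothing between invocations.
        if (order.Status == "Paid")
        {
            log.LogInformation($"Shipping order {order.Id}.");
            // ... enqueue the next step, update the status in storage, etc.
        }
    }
}
```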

Then, once the file lands in Azure Storage, I can have WebHooks or a direct Blob storage trigger take over from there and process the image in an Azure Function, if a Function is still needed at all. Take a look at how a Blob storage trigger can be used to start a Function for various purposes: https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function.
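A minimal sketch of such a Blob-triggered Function, in the spirit of the linked tutorial; the container name and function name are placeholders:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessImage
{
    // Fires whenever a new blob lands in the "images" container.
    [FunctionName("ProcessImage")]
    public static void Run(
        [BlobTrigger("images/{name}")] Stream imageBlob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing blob: {name}, size: {imageBlob.Length} bytes");
        // ... resize, scan, or otherwise process the image here.
    }
}
```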


As soon as you set the Content-Length header, you are no longer streaming; you need to use the PushStreamContent class and write to the stream in chunks.
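A rough sketch of what that looks like with the Web API client library's PushStreamContent; the endpoint URL and the chunk size are placeholders:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class ChunkedUpload
{
    // Sketch: PushStreamContent defers writing until the transport is ready,
    // so the request goes out with chunked transfer encoding instead of a
    // fixed Content-Length header.
    public static async Task SendAsync(HttpClient client, Stream source)
    {
        Func<Stream, HttpContent, TransportContext, Task> onStreamAvailable =
            async (outputStream, httpContent, context) =>
            {
                var buffer = new byte[64 * 1024]; // 64 KB chunks; the size is arbitrary
                int read;
                while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
                {
                    await outputStream.WriteAsync(buffer, 0, read);
                }
                outputStream.Close(); // closing the stream completes the request body
            };

        // The URL stands in for whatever endpoint receives the upload.
        await client.PostAsync("https://example.com/api/upload",
            new PushStreamContent(onStreamAvailable));
    }
}
```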

Whether you will still be able to access that stream on the server side as chunks, I don't know. It's possible that something in the Azure Functions pipeline buffers the stream before handing it to the function.