Upload and Read Files from Azure Blob Storage using .NET

In this article, we're going to learn how to upload and read files from Azure Blob Storage in a .NET API, so you can store images, documents, and any other files in the cloud instead of on your server.

Most developers start by saving uploaded files directly to the server's file system. Works fine locally. Then you deploy to Azure App Service and realise App Service doesn't have persistent storage — files you save today are gone after the next deployment or app restart. That's when Azure Blob Storage becomes the right answer.

Blob Storage is Azure's object storage service. You store files as blobs inside containers, and each file gets a URL you can use to access it directly. It's cheap, it scales automatically, and it's built exactly for this use case — storing files that your application needs to read and write.

This tutorial shows how to:

  • Create a Storage Account and container in Azure portal
  • Install the Azure Blob Storage SDK in a .NET project
  • Upload a file to Blob Storage from a .NET API
  • Read and download a file from Blob Storage
  • Delete a blob
  • Generate a SAS URL for temporary private file access
  • Store the connection string securely using Azure App Settings

Why Blob Storage Over File System

When you save files to disk on App Service, a few things can go wrong. App Service can have multiple instances running behind a load balancer — one instance saves a file, but the next request hits a different instance that doesn't have that file. Deployments wipe the file system. App restarts wipe it too.

Blob Storage sits completely outside your app. Every instance of your API talks to the same storage account. Files survive deployments, restarts, scaling events — everything. And you get built-in redundancy, so your files are replicated across multiple data centres automatically.

For any production app that handles file uploads — profile pictures, invoices, documents, exports — Blob Storage is the right way to do it.


Step 1: Create a Storage Account in Azure Portal

Go to portal.azure.com and click Create a resource. Search for Storage account and click Create.

Fill in the details:

  • Subscription — your subscription
  • Resource Group — use the same one as your App Service to keep things together
  • Storage account name — something like myappfilestorage. Must be globally unique, 3-24 characters, lowercase letters and numbers only.
  • Region — same region as your App Service
  • Performance — Standard is fine for most apps
  • Redundancy — LRS (Locally Redundant Storage) is cheapest and fine for dev/test. Use GRS for production if you need geo-redundancy.

Click Review + Create then Create. Takes about 30 seconds.

Once created, go to the storage account. In the left menu click Containers under Data storage. Click + Container.

Name it something like uploads. Set Public access level to Private for now — we'll use SAS URLs for access control.

Now go to Access keys in the left menu. Copy the Connection string for key1. You'll need this in the next step.
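If you prefer the command line, the same setup can be done with the Azure CLI. The names below (my-rg, myappfilestorage, uploads, westeurope) are placeholders — swap in your own:

```shell
# Create the storage account (Standard performance, LRS redundancy)
az storage account create \
  --name myappfilestorage \
  --resource-group my-rg \
  --location westeurope \
  --sku Standard_LRS

# Create a private container in it
az storage container create \
  --name uploads \
  --account-name myappfilestorage

# Print the connection string you'll need in the next step
az storage account show-connection-string \
  --name myappfilestorage \
  --resource-group my-rg \
  --output tsv
```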


Step 2: Install the Azure Blob Storage SDK

Open your .NET API project. Install the Azure.Storage.Blobs package:

dotnet add package Azure.Storage.Blobs

That's the official Microsoft SDK. No third-party packages needed.


Step 3: Create a Blob Storage Service

Create a service class to wrap all the Blob Storage operations. This keeps the Azure SDK code in one place — your controllers and other services don't need to know anything about Blob Storage directly.

Create BlobStorageService.cs:

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Sas;

public interface IBlobStorageService
{
    Task<string> UploadFileAsync(Stream fileStream, string fileName, string contentType);
    Task<Stream> DownloadFileAsync(string fileName);
    Task DeleteFileAsync(string fileName);
    string GenerateSasUrl(string fileName, int expiryMinutes = 60);
}

public class BlobStorageService : IBlobStorageService
{
    private readonly BlobContainerClient _containerClient;

    public BlobStorageService(IConfiguration configuration)
    {
        var connectionString = configuration["AzureStorage:ConnectionString"];
        var containerName = configuration["AzureStorage:ContainerName"];

        var blobServiceClient = new BlobServiceClient(connectionString);
        _containerClient = blobServiceClient.GetContainerClient(containerName);
        _containerClient.CreateIfNotExists();
    }

    public async Task<string> UploadFileAsync(Stream fileStream, string fileName, string contentType)
    {
        var blobClient = _containerClient.GetBlobClient(fileName);

        var uploadOptions = new BlobUploadOptions
        {
            HttpHeaders = new BlobHttpHeaders
            {
                ContentType = contentType
            }
        };

        await blobClient.UploadAsync(fileStream, uploadOptions);

        return blobClient.Uri.ToString();
    }

    public async Task<Stream> DownloadFileAsync(string fileName)
    {
        var blobClient = _containerClient.GetBlobClient(fileName);
        var response = await blobClient.DownloadStreamingAsync();
        return response.Value.Content;
    }

    public async Task DeleteFileAsync(string fileName)
    {
        var blobClient = _containerClient.GetBlobClient(fileName);
        await blobClient.DeleteIfExistsAsync();
    }

    public string GenerateSasUrl(string fileName, int expiryMinutes = 60)
    {
        var blobClient = _containerClient.GetBlobClient(fileName);

        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = _containerClient.Name,
            BlobName = fileName,
            Resource = "b",
            ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(expiryMinutes)
        };

        sasBuilder.SetPermissions(BlobSasPermissions.Read);

        return blobClient.GenerateSasUri(sasBuilder).ToString();
    }
}

The constructor reads config values and creates the container if it doesn't exist yet. That CreateIfNotExists() call is handy — your app won't crash on first run just because the container wasn't created manually. Keep in mind it's a network call, though: since the Azure SDK clients are thread-safe and designed to be reused, you could also register this service as a singleton to avoid repeating that call on every request.


Step 4: Register the Service and Add Config

In Program.cs, register the service:

builder.Services.AddScoped<IBlobStorageService, BlobStorageService>();

In appsettings.json, add the config section:

{
  "AzureStorage": {
    "ConnectionString": "DefaultEndpointsProtocol=https;AccountName=...your connection string...",
    "ContainerName": "uploads"
  }
}

Never commit the actual connection string to your git repo. For local development, use appsettings.Development.json and add that file to .gitignore. For production, set these values in Azure App Service Configuration settings — they override appsettings.json at runtime.
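For local development, the user-secrets tool is another option that keeps the connection string out of the project folder entirely (the value below is a placeholder):

```shell
# Enable user secrets for the project (adds a UserSecretsId to the .csproj)
dotnet user-secrets init

# Store the connection string outside the repo;
# IConfiguration picks it up automatically in the Development environment
dotnet user-secrets set "AzureStorage:ConnectionString" "DefaultEndpointsProtocol=https;AccountName=..."
```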


Step 5: Upload a File from an API Endpoint

Now wire up the service in a controller. Here's a FilesController with upload, download, delete, and SAS endpoints:

using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class FilesController : ControllerBase
{
    private readonly IBlobStorageService _blobService;

    public FilesController(IBlobStorageService blobService)
    {
        _blobService = blobService;
    }

    [HttpPost("upload")]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file == null || file.Length == 0)
            return BadRequest("No file provided.");

        // Give the file a unique name to avoid overwriting existing files
        var fileName = $"{Guid.NewGuid()}_{file.FileName}";

        using var stream = file.OpenReadStream();
        var fileUrl = await _blobService.UploadFileAsync(stream, fileName, file.ContentType);

        return Ok(new { url = fileUrl, fileName = fileName });
    }

    [HttpGet("download/{fileName}")]
    public async Task<IActionResult> Download(string fileName)
    {
        var stream = await _blobService.DownloadFileAsync(fileName);
        return File(stream, "application/octet-stream", fileName);
    }

    [HttpDelete("{fileName}")]
    public async Task<IActionResult> Delete(string fileName)
    {
        await _blobService.DeleteFileAsync(fileName);
        return Ok(new { message = "File deleted." });
    }

    [HttpGet("sas/{fileName}")]
    public IActionResult GetSasUrl(string fileName, [FromQuery] int expiryMinutes = 60)
    {
        var sasUrl = _blobService.GenerateSasUrl(fileName, expiryMinutes);
        return Ok(new { url = sasUrl });
    }
}

The upload endpoint prefixes each file name with a Guid. This matters: if two users uploaded files with the same name, the second would otherwise overwrite the first. With the Guid prefix, every blob gets a unique name in storage.

The response returns both the blob URL and the file name. Store the file name in your database so you can reference it later for downloads or deletes.
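One small refinement: the download endpoint above serves everything as application/octet-stream. If you'd rather return the content type the blob was uploaded with, DownloadStreamingAsync already exposes it via the response details. A sketch of that variant (a hypothetical method you could add to BlobStorageService):

```csharp
// Variant of DownloadFileAsync that also returns the stored content type.
// BlobDownloadStreamingResult.Details carries the blob's HTTP properties,
// including the ContentType set during upload.
public async Task<(Stream Content, string ContentType)> DownloadFileWithTypeAsync(string fileName)
{
    var blobClient = _containerClient.GetBlobClient(fileName);
    var response = await blobClient.DownloadStreamingAsync();
    return (response.Value.Content, response.Value.Details.ContentType);
}
```

In the controller you'd then call File(content, contentType, fileName) so the browser sees the original MIME type instead of a generic binary download.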


Step 6: Generate SAS URLs for Private Files

If your container is private, the direct blob URL won't work — anonymous requests are refused with an error. That's correct behaviour for private files. But sometimes you need to give a user temporary access to a specific file — like a download link that expires in 1 hour.

That's what SAS (Shared Access Signature) URLs are for. The GenerateSasUrl method in the service creates a URL with a signature embedded in the query string. The URL works for the specified duration then expires automatically.

Example SAS URL:

https://myappfilestorage.blob.core.windows.net/uploads/abc123_invoice.pdf?sv=2021-06-08&ss=b&srt=co&sp=r&se=2024-01-01T12:00:00Z&st=2024-01-01T11:00:00Z&spr=https&sig=xxxxx

The se parameter is the expiry time. After that the URL returns 403. Good for things like invoice downloads, report exports, anything you want to be temporarily accessible without making files permanently public.

Call the /api/files/sas/{fileName} endpoint to get a fresh SAS URL whenever a user needs to access a file. Don't store SAS URLs in the database — they expire. Store the file name, generate the SAS URL on demand.
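From a client's point of view, the flow looks like this (the host and file name here are made up for illustration):

```shell
# Ask the API for a short-lived SAS URL (15 minutes)
curl "https://localhost:5001/api/files/sas/abc123_invoice.pdf?expiryMinutes=15"
# returns JSON of the form { "url": "https://....blob.core.windows.net/uploads/abc123_invoice.pdf?sv=...&sig=..." }

# Then download the blob directly from storage using that URL,
# bypassing your API entirely
curl -o invoice.pdf "https://....blob.core.windows.net/uploads/abc123_invoice.pdf?sv=...&sig=..."
```

A nice side effect of this pattern is that the actual file bytes flow straight from Blob Storage to the client, so large downloads don't tie up your API.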


Common Issues

UploadAsync throws RequestFailedException with 403

Your connection string is wrong or the storage account firewall is blocking access. Double check the connection string is copied correctly from Azure portal → Access keys. Also check Storage account → Networking — if it's set to "Enabled from selected networks", your App Service IP needs to be in the allowed list.

File uploads work locally but fail on App Service

Connection string isn't set in App Service Configuration. Go to Azure portal → your App Service → Configuration → New application setting. Add AzureStorage__ConnectionString (double underscore for nested config in Azure) with the connection string value.
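You can set the same values from the CLI instead of the portal (the app and resource group names are placeholders):

```shell
# Set both config values on the App Service. The double underscore
# maps to the nested AzureStorage:ConnectionString key at runtime.
az webapp config appsettings set \
  --name my-api-app \
  --resource-group my-rg \
  --settings \
    AzureStorage__ConnectionString="DefaultEndpointsProtocol=https;AccountName=..." \
    AzureStorage__ContainerName="uploads"
```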

GenerateSasUri throws error about account key

SAS generation requires the storage account key to be available in the connection string. If you're using a connection string without the AccountKey (like a SAS-only connection string), use BlobServiceClient initialized with StorageSharedKeyCredential instead.
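A sketch of that alternative — constructing the client from an explicit account name and key so that GenerateSasUri can sign tokens (both values are placeholders you'd read from config; AzureStorage:AccountKey is a hypothetical key name):

```csharp
using Azure.Storage;
using Azure.Storage.Blobs;

// Build the client from an explicit shared key credential instead of
// a connection string, so the client can sign SAS tokens.
var accountName = "myappfilestorage";
var accountKey = configuration["AzureStorage:AccountKey"];
var credential = new StorageSharedKeyCredential(accountName, accountKey);

var serviceClient = new BlobServiceClient(
    new Uri($"https://{accountName}.blob.core.windows.net"),
    credential);
```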

Downloaded file is corrupted

Make sure you're not reading the stream after it's already been disposed. The using statement in the upload closes the stream — for download, return the stream directly to the response without wrapping it in a using block. The MVC File() result handles stream disposal.


Summary

You learned how to upload and read files from Azure Blob Storage in a .NET API. We covered:

  • Creating a Storage Account and private container in Azure portal
  • Installing the Azure.Storage.Blobs NuGet package
  • Building a BlobStorageService with upload, download, delete, and SAS URL generation
  • Registering the service and storing config in appsettings.json and Azure App Settings
  • Building a FilesController with endpoints for upload, download, delete, and SAS URL
  • Using Guid to generate unique file names and avoid overwrites
  • Generating expiring SAS URLs for temporary private file access
  • Fixing common issues like 403 errors and corrupted downloads

Blob Storage is one of those things that seems complicated at first, but the SDK makes it pretty straightforward. Once BlobStorageService is set up, adding file upload to any feature is just injecting the service and calling UploadFileAsync. That's it.

I hope you like this article...

Happy coding! 🚀
