Using Async Enumerable To Save Memory

I ran into a problem recently. We have a feature in our software that lets users download large amounts of exported data, and we found that when some users selected enough data, the memory usage of our .NET Core service grew uncontrollably. After researching the problem, I found that the async enumerable feature introduced in C# 8 could solve the issue for me.

Here is how I reproduced the problem in a controller:

using System;
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("[controller]")]
public class StreamingController : ControllerBase
{
    private readonly ILogger<StreamingController> _logger;

    public StreamingController(ILogger<StreamingController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    [Route("sync")]
    public IEnumerable<string> GetSync()
    {
        Response.Clear();
        Response.Headers.Add("Content-Disposition", "attachment;filename=somenamesync.csv");
        Response.Headers.Add("Content-Transfer-Encoding", "binary");
        return GetSyncContent();
    }

    private IEnumerable<string> GetSyncContent()
    {
        // Builds the entire payload (1,000 strings of one million characters each)
        // in memory before a single byte is written to the response.
        var strings = new List<string>();
        for (int i = 0; i < 1000; i++)
        {
            strings.Add(Guid.NewGuid().ToString().PadRight(1000000) + Environment.NewLine);
        }

        return strings;
    }
}

If you run this code, you will see the process climb to about 4 GB of RAM, which is not ideal for an export file of roughly 1 GB. The solution is to use async enumerable and literally stream the data to the browser. Here is the final version of the same controller:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("[controller]")]
public class StreamingController : ControllerBase
{
    private readonly ILogger<StreamingController> _logger;

    public StreamingController(ILogger<StreamingController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    [Route("async")]
    public IAsyncEnumerable<string> Get()
    {
        Response.Clear();
        Response.Headers.Add("Content-Disposition", "attachment;filename=somenameasync.csv");
        Response.Headers.Add("Content-Transfer-Encoding", "binary");
        return GetAsyncContent();
    }

    [HttpGet]
    [Route("sync")]
    public IEnumerable<string> GetSync()
    {
        Response.Clear();
        Response.Headers.Add("Content-Disposition", "attachment;filename=somenameasync.csv");
        Response.Headers.Add("Content-Transfer-Encoding", "binary");
        return GetSyncContent();
    }

    private async IAsyncEnumerable<string> GetAsyncContent()
    {
        for (int i = 0; i < 1000; i++)
        {
            // Task.Delay stands in for real asynchronous work (e.g. a database read).
            // Each chunk is yielded as soon as it is produced instead of being
            // collected into a list first.
            await Task.Delay(1);
            yield return Guid.NewGuid().ToString().PadRight(1000000) + Environment.NewLine;
        }
    }

    private IEnumerable<string> GetSyncContent()
    {
        // Kept for comparison: the whole payload is still built up front in a list.
        var strings = new List<string>();
        for (int i = 0; i < 1000; i++)
        {
            strings.Add(Guid.NewGuid().ToString().PadRight(1000000) + Environment.NewLine);
        }

        return strings;
    }
}

The key part here is the signature of the method that generates the data: async IAsyncEnumerable<string>. I can now yield return inside the loop and hand back one chunk of the export at a time, streaming it to the browser as it is produced. Memory now stays at around 100 MB for the same 1 GB file! Of course, the controller action has to use the same return type to enable the magic.
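In a real export the rows usually come from a database rather than padded GUIDs. As a rough sketch of what the generator could look like with Entity Framework Core 3.0 or later (the _dbContext field and the Orders entity are made-up names for illustration), AsAsyncEnumerable() lets you stream rows straight out of the query instead of loading the whole result set first:

    private async IAsyncEnumerable<string> GetAsyncContent()
    {
        // AsAsyncEnumerable() pulls rows from the database as they arrive,
        // so only one row needs to be held in memory at a time.
        await foreach (var order in _dbContext.Orders.AsAsyncEnumerable())
        {
            yield return $"{order.Id},{order.CustomerName},{order.Total}{Environment.NewLine}";
        }
    }

The same idea applies if you consume the endpoint from code instead of a browser. A minimal client sketch, assuming a console app with C# 9 top-level statements and the default local development URL, that writes the download to disk without buffering the whole body:

using System.Net.Http;

using var client = new HttpClient();
using var response = await client.GetAsync(
    "https://localhost:5001/streaming/async",
    HttpCompletionOption.ResponseHeadersRead); // return as soon as the headers arrive
await using var body = await response.Content.ReadAsStreamAsync();
await using var file = System.IO.File.Create("somenameasync.csv");
await body.CopyToAsync(file); // copies the response chunk by chunk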

Enjoy!