Memory is complicated

21/09/2022
Memory, C#, .NET

This is a small story about how memory works in your .NET application. Well, not only in .NET, but about how memory does or does not get allocated in general.

We will see how an array that is 1 gigabyte in size can, to some extent, occupy only a few megabytes. Furthermore, I will discuss the difference between the working set and committed memory.

For starters let's have a look at the following code:

var byteArrays = new List<byte[]>();

// Allocate 1024 * 1024 * 1024 bytes, aka 1 Gigabyte
for (var i = 0; i < 1024; i++)
    byteArrays.Add(new byte[1024 * 1024]);

var workingSetInBytes = System.Diagnostics.Process.GetCurrentProcess().WorkingSet64;
Console.WriteLine($"Memory: {workingSetInBytes / 1024 / 1024} MB");

We create a List<byte[]> with 1024 byte[] entries. Each of those byte[] arrays is 1024 * 1024 bytes long. All in all we allocate 1024 * 1024 * 1024 = 1024³ bytes, in other words 1 gigabyte of memory. Or do we?

We can check the memory usage via:

System.Diagnostics.Process.GetCurrentProcess().WorkingSet64;

The documentation for WorkingSet64 says:

Gets the amount of physical memory, in bytes, allocated for the associated process.

Okay, let's run the whole thing:

Memory: 62 MB

What the duck? That is far from 1 gigabyte! How is this possible? Well, let's look at the documentation again. I emphasized physical on purpose. There is a big difference between physical and virtual memory. Virtually we really did allocate 1 gigabyte, but physically we did not.

Only when you access an element of our byte arrays (read or write it, in other words touch it) will the OS (Windows, Linux and macOS at least) back that allocation with physical memory. Even then, depending on the operating system, only a part gets physically allocated at a time (typically one 4 KB page). This is all done for performance reasons. Think of your big array as a book: when you access something, you only load the single page of the book you actually need.
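To see that page granularity in action, here is a small sketch (my own illustration, not from the original measurement, and it assumes the common 4 KB page size). Writing a single byte per page is enough to fault the whole page into physical memory, so the working set ends up close to 1 gigabyte even though we only write about 256 KB of data:

const int pageSize = 4096; // assumed OS page size
var byteArrays = new List<byte[]>();

for (var i = 0; i < 1024; i++)
{
    var array = new byte[1024 * 1024];

    // One write per page is enough to bring the whole 4 KB page into RAM
    for (var j = 0; j < array.Length; j += pageSize)
        array[j] = 1;

    byteArrays.Add(array);
}

var workingSetInBytes = System.Diagnostics.Process.GetCurrentProcess().WorkingSet64;
Console.WriteLine($"Memory: {workingSetInBytes / 1024 / 1024} MB");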

The key takeaway here is: The working set (private) memory is only 62 Megabytes in my example but the commit size is roughly 1 Gigabyte.
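You also don't have to rely on the Task Manager for the commit side. The following sketch is my own addition, not part of the original post: Process.PrivateMemorySize64 roughly corresponds to the private bytes / commit size of the process on Windows, and GC.GetGCMemoryInfo().TotalCommittedBytes (available since .NET 5) reports how much memory is committed for the managed heap:

var byteArrays = new List<byte[]>();

// Same allocation as before: 1024 arrays of 1 MB each, none of them touched
for (var i = 0; i < 1024; i++)
    byteArrays.Add(new byte[1024 * 1024]);

var process = System.Diagnostics.Process.GetCurrentProcess();
Console.WriteLine($"Working set:   {process.WorkingSet64 / 1024 / 1024} MB");
Console.WriteLine($"Private bytes: {process.PrivateMemorySize64 / 1024 / 1024} MB");
Console.WriteLine($"GC committed:  {GC.GetGCMemoryInfo().TotalCommittedBytes / 1024 / 1024} MB");

The working set number stays small, while the two committed numbers should reflect the full gigabyte we reserved.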

Now I write to every single element of those arrays to force the allocation into physical memory:

var byteArrays = new List<byte[]>();

// Allocate 1024 * 1024 * 1024 bytes, aka 1 Gigabyte
for (var i = 0; i < 1024; i++)
{
    byteArrays.Add(new byte[1024 * 1024]);
    // Touch every byte so the OS backs each page with physical memory
    for (var j = 0; j < 1024 * 1024; j++)
        byteArrays[i][j] = 2;
}
var workingSetInBytes = System.Diagnostics.Process.GetCurrentProcess().WorkingSet64;
Console.WriteLine($"Memory: {workingSetInBytes / 1024 / 1024} MB");

And ta-da, the output is the following:

Memory: 1069 MB

We forced the allocation this time because we wrote to every single element of our list of arrays.

Conclusion

Naming things is hard. Memory management is hard, too. The Task Manager can give you a false impression of things. I hope you now know the difference between virtual and physical memory, as well as between committed memory and the working set.
