I learned today that there is a memory leak in the .NET 4.0 ConcurrentQueue class. The leak could explain some trouble I was having in December with a program running out of memory. I didn’t know about this bug at the time, but reducing the sizes of several of my queues seems to have made the problem go away.
I know some of you are thinking, “A memory leak in .NET? I thought the garbage collector was supposed to prevent that!” The garbage collector collects garbage–memory that’s no longer referenced. It can’t collect something that’s still being used. Let me explain.
The ConcurrentQueue class uses an array as its backing store. When an item is added to the queue, a reference to the item is placed in the array. As long as an item is in the queue, there is an active reference to it–some part of the program still needs that item. In garbage collector parlance, the item is “rooted,” meaning that the garbage collector can’t collect it.
When an item is removed from the queue, that item’s slot in the array is supposed to be cleared–set to null. Since the item was removed from the queue, it makes no sense to keep a reference to the item around. And as long as the queue keeps the reference, the garbage collector can’t collect the object.
Problem is, the code that removes items from the queue doesn’t clear the item’s slot in the array. The result is that stale object references remain in the queue’s backing array until one of the following happens:
- An item added to the queue overwrites that slot.
- There are no more references to the queue (i.e. it goes out of scope).
- The array is resized. This happens automatically as the number of items in the queue grows and shrinks.
There might be other ways, but those are all that I can think of at the moment. The point is that items “live” in the queue longer than they absolutely have to, which is a potentially serious memory leak.
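The effect is easy to reproduce with a toy queue. Here’s a minimal sketch of the idea–in Java rather than C#, since a tracing garbage collector treats reachability the same way on both runtimes, and this is my own illustration, not the real ConcurrentQueue source. The dequeue returns the item but never clears the slot, so the backing array keeps the object reachable:

```java
// Toy array-backed circular queue that reproduces the bug:
// dequeue() returns the item but never nulls out the slot,
// so the backing array keeps a "stale" reference alive.
final class LeakyQueue<T> {
    private final Object[] slots;
    private int head, tail, count;

    LeakyQueue(int capacity) { slots = new Object[capacity]; }

    void enqueue(T item) {
        slots[tail] = item;
        tail = (tail + 1) % slots.length;
        count++;
    }

    @SuppressWarnings("unchecked")
    T dequeue() {
        T item = (T) slots[head];
        // BUG: should do `slots[head] = null;` here so the GC
        // can collect the item once the caller is done with it.
        head = (head + 1) % slots.length;
        count--;
        return item;
    }

    boolean isEmpty() { return count == 0; }

    // For demonstration only: does the backing array still
    // reference this object?
    boolean stillHolds(Object o) {
        for (Object s : slots) if (s == o) return true;
        return false;
    }
}

class LeakyQueueDemo {
    public static void main(String[] args) {
        LeakyQueue<byte[]> q = new LeakyQueue<>(4);
        byte[] big = new byte[1 << 20]; // 1 MB payload
        q.enqueue(big);
        q.dequeue();
        System.out.println(q.isEmpty());       // prints "true": logically empty
        System.out.println(q.stillHolds(big)); // prints "true": array still pins `big`
    }
}
```

Even though the queue is logically empty, `stillHolds` reports that the array still references the payload; adding the one commented-out line in `dequeue` fixes it.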
I say “potentially serious” because it’s not like the queue grows without bound. At least, it shouldn’t. If your program allows for unbounded queue growth, then you have a more serious problem than this memory leak. That said, with the existence of this bug you have to assume that your queue will always have worst-case memory usage. That is, for the purpose of computing memory consumption, you have to assume that the queue is always full, even when you know that all items have been removed.
This can be a real problem if you use a queue to hold input items and a separate collection to hold items that have been processed from the queue. For example, imagine a loop that does this:
QueuedThing queued = GetNextItemFromQueue();
ProcessedThing processed = ProcessThing(queued);
AddProcessedItemToList(processed);
Your expectation is that, within the limits of the garbage collection interval, the amount of memory used is:
(sizeof(QueuedThing) * queue.Count) + (sizeof(ProcessedThing) * list.Count)
But with this bug, you have to design for the worst case memory usage, which is:
(sizeof(QueuedThing) * queue.Capacity) + (sizeof(ProcessedThing) * list.Count)
If QueuedThing and ProcessedThing are the same size, then your worst case memory usage is double what it should be.
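To put rough numbers on those two formulas, here is a quick back-of-the-envelope calculation (the item size and capacity are made up for illustration):

```java
// Made-up numbers: 1 KB items, a backing store that grew to
// 100,000 slots, and all 100,000 items drained into the list.
class WorstCase {
    // What you'd expect: live queue items plus list items.
    static long expectedBytes(long itemSize, long queueCount, long listCount) {
        return itemSize * queueCount + itemSize * listCount;
    }

    // What the bug forces you to budget for: the queue's full
    // capacity counts, even when it is logically empty.
    static long worstCaseBytes(long itemSize, long queueCapacity, long listCount) {
        return itemSize * queueCapacity + itemSize * listCount;
    }

    public static void main(String[] args) {
        long expected = expectedBytes(1024, 0, 100_000);        // 102,400,000 bytes
        long worst = worstCaseBytes(1024, 100_000, 100_000);    // 204,800,000 bytes
        System.out.println(expected + " vs " + worst);          // worst case is double
    }
}
```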
Microsoft says that the fix for this will be in the next version of .NET. Considering that .NET releases are typically every two years or so, that means at least another year before this is fixed. Until then, the workaround is to be more careful with the way you use ConcurrentQueue (and BlockingCollection, which uses a ConcurrentQueue as its default backing store). In particular, you should be careful to limit the size of your queue and limit the scope in which the queue is active.
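Beyond limiting size and scope, one way to blunt the effect is to add a level of indirection: enqueue a small holder object instead of the payload, and null out the holder’s field after dequeuing. Even if a stale reference to the holder loiters in the backing store, the holder no longer pins the (presumably much larger) payload. This is my own sketch, again in Java for illustration, not a documented Microsoft workaround:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Small mutable wrapper; only the holder, never the payload,
// goes into the queue's backing store.
final class Holder<T> {
    T value;
    Holder(T value) { this.value = value; }
}

class HolderDemo {
    // Dequeue a holder and detach its payload, so a stale
    // reference to the holder can't keep the payload alive.
    static <T> T take(Queue<Holder<T>> queue) {
        Holder<T> h = queue.poll();
        if (h == null) return null;
        T payload = h.value;
        h.value = null; // the holder may loiter; the payload won't
        return payload;
    }

    public static void main(String[] args) {
        Queue<Holder<String>> queue = new ArrayDeque<>();
        Holder<String> h = new Holder<>("large payload");
        queue.add(h);
        String payload = take(queue);
        System.out.println(payload);          // prints "large payload"
        System.out.println(h.value == null);  // prints "true"
    }
}
```

The cost is one small extra allocation per item, which is usually a fair trade when the payloads are large.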