C# and .NET: What’s next?

About 10 days ago, MSDN's Channel 9 site released an hour-long video entitled Meet the Design Team that talks in very vague terms about upcoming features in C# 4.0. You'll learn that the language will include more dynamic constructs and built-in support for multiple cores. Honestly, that's about all you'll learn from watching the video. Granted, either one of those broad features implies many changes to the language and to the underlying runtime.

Improvements to the language are all well and good, but given the choice I’d rather have them address some fundamental runtime issues: the two-gigabyte limit, and garbage collection. Both of these issues have caused me no end of grief over the past year.

All things considered, the .NET garbage collector is a definite win. It handles the majority of memory management tasks much better than most programmers do. It's not impossible to create a memory leak in a .NET program, but you really have to try. Unfortunately, garbage collection is not free. You'll find that out pretty quickly if you write a long-running program that does a lot of string manipulation. For example, take a look at this clip, which shows bandwidth usage from a Web crawler written in .NET:

The periods of zero bandwidth usage coincide with the garbage collector pausing all the threads to clean things up. We lose somewhere around 10% of our potential bandwidth to garbage collection. This particular graph is from a dual-core machine; the graph looks the same on a quad-core processor.
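To see why heavy string manipulation feeds the collector so much work, remember that .NET strings are immutable: every concatenation allocates a brand-new string and orphans the old one. Here's a small sketch that compares naive concatenation against StringBuilder by counting gen-0 collections (the class and method names are mine, invented for illustration):

```csharp
using System;
using System.Text;

static class StringChurn
{
    // Count gen-0 garbage collections triggered by naive concatenation.
    // Each += allocates a new string, so every intermediate copy
    // becomes garbage immediately.
    public static int ConcatCollections(int n)
    {
        int before = GC.CollectionCount(0);
        string s = "";
        for (int i = 0; i < n; i++)
            s += "x";
        GC.KeepAlive(s);
        return GC.CollectionCount(0) - before;
    }

    // The same work done with StringBuilder, which grows one internal
    // buffer instead of copying the whole string on every append.
    public static int BuilderCollections(int n)
    {
        int before = GC.CollectionCount(0);
        var sb = new StringBuilder();
        for (int i = 0; i < n; i++)
            sb.Append("x");
        string s = sb.ToString();
        GC.KeepAlive(s);
        return GC.CollectionCount(0) - before;
    }

    static void Main()
    {
        Console.WriteLine("concat:  " + ConcatCollections(50000) + " gen-0 collections");
        Console.WriteLine("builder: " + BuilderCollections(50000) + " gen-0 collections");
    }
}
```

On my understanding of the collector, the concatenation loop churns through on the order of a gigabyte of temporary strings, so it triggers many more gen-0 collections than the StringBuilder version. A crawler can tame some of this with buffers and builders, but it can't avoid the collector entirely.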

Obviously, they’ll have to do something about the garbage collector if they’re going to support multiple cores. No amount of multi-core support in the language or in the runtime will do me a bit of good if every core stops whenever the garbage collector kicks in.
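For what it's worth, the current runtime does expose a knob for this in app.config: the server flavor of the collector, which keeps a heap per core and collects on dedicated threads. It helps throughput on multi-processor boxes, but it doesn't make the pauses go away:

```xml
<!-- app.config: opt into the server GC. On workstation builds the
     concurrent collector is already the default; neither mode
     eliminates the stop-the-world pauses described above. -->
<configuration>
  <runtime>
    <gcServer enabled="true"/>
  </runtime>
</configuration>
```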

I’ve mentioned the .NET two-gigabyte limit before. The 64-bit runtime has access to as much memory as you can put in a machine, but no single object can be larger than two gigabytes. When you’re working with data sets that contain hundreds of millions of items, that’s just not acceptable. When $2,000 will buy you a machine with 16 gigabytes of memory, it’s time the .NET runtime gave me the ability to allocate an object that makes use of that capacity.
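Until the runtime lifts that cap, the usual workaround is to split one huge logical array across several physical arrays, each comfortably under the limit, and index into the right chunk. A minimal sketch (the BigArray name and chunk size are my own; a production version would add bounds checking, enumeration, and so on):

```csharp
using System;

// Sketch of a chunked array that sidesteps the single-object
// two-gigabyte cap by spreading storage across many smaller arrays.
class BigArray<T>
{
    private const int ChunkSize = 1 << 20;   // ~1M elements per chunk
    private readonly T[][] chunks;

    public long Length { get; private set; }

    public BigArray(long length)
    {
        Length = length;
        int chunkCount = (int)((length + ChunkSize - 1) / ChunkSize);
        chunks = new T[chunkCount][];
        for (int i = 0; i < chunkCount; i++)
        {
            // The last chunk may be partial.
            long remaining = length - (long)i * ChunkSize;
            chunks[i] = new T[(int)Math.Min((long)ChunkSize, remaining)];
        }
    }

    // 64-bit indexer: pick the chunk, then the offset within it.
    public T this[long index]
    {
        get { return chunks[index / ChunkSize][index % ChunkSize]; }
        set { chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}
```

It works, but every element access pays for an extra indirection and two divisions, and none of the framework's array-based APIs know anything about it. That's exactly the kind of bookkeeping the runtime should be doing for me.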

I’m happy to see the team continue improving the C# language, and I’ll undoubtedly find many of their improvements useful. But no amount of language improvement will increase my productivity if I remain hamstrung by the absurd limit on individual object size and the garbage collector continues to eat my processor cycles.

Unfortunately, we’ll have to wait a bit longer before we know what will be included in the next versions of C# and .NET. Microsoft is keeping pretty quiet, apparently in an attempt to make a big splash at the Professional Developers Conference in October.

Anybody care to pay my way to the conference?