Too much memory

More memory is a good thing, right? Not necessarily. I upgraded my machine last week from four gigabytes of RAM to six (I’m waiting for the replacement two gigabytes to make it eight). Now TextPad won’t load a one-gigabyte file. It used to, back when I only had four gigs of memory. But not now. The error it gives is kind of funny: “Disk full while trying to read .”

I don’t know for sure, but I strongly suspect that the error has to do with the way TextPad checks for available memory. It’s probably calling the Windows API function that reports the amount of memory as a 64-bit quantity. Since TextPad is a 32-bit application, my guess is that it keeps only the lower 32 bits of the result. Six gigabytes is 0x180000000; throw away everything above bit 31 and you’re left with 0x80000000, which is exactly two gigabytes. So TextPad assumes that I only have two gigabytes of memory. Oops. I’ll bet that it’ll load the file for me no problem once I have the full eight gigabytes.
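
Just to make the arithmetic concrete, here’s the kind of bug I’m imagining. This is pure speculation: I have no idea what TextPad actually calls. GlobalMemoryStatusEx is just the obvious candidate, and stuffing its 64-bit ullTotalPhys field into a 32-bit variable produces exactly the behavior I’m seeing:

    /* Pure speculation -- a sketch of the truncation bug I'm imagining,
       not TextPad's actual code. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        MEMORYSTATUSEX status;
        status.dwLength = sizeof(status);
        if (!GlobalMemoryStatusEx(&status))   /* ullTotalPhys is a 64-bit value */
            return 1;

        /* Keep only the low 32 bits: 6 GB is 0x180000000, so the
           truncated value is 0x80000000 -- exactly 2 GB. */
        DWORD truncated = (DWORD)status.ullTotalPhys;

        printf("Physical memory:          %llu bytes\n",
               (unsigned long long)status.ullTotalPhys);
        printf("What a 32-bit field sees: %lu bytes\n",
               (unsigned long)truncated);
        return 0;
    }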

But I’m looking for a new editor. And I’d really like a text file viewer. I find it hard to believe that there aren’t file viewers and editors that intelligently handle large files. They all seem to insist on loading the entire file into memory before allowing me to do anything. This is asinine. WordStar could operate on larger-than-memory files 25 years ago. There’s no reason why today’s software shouldn’t be able to do the same.
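
For what it’s worth, here’s what I mean by handling a large file intelligently. This is just a sketch, not anybody’s shipping code: seek to the part of the file the user is looking at and read only that window. The file name, offset, and window size are made-up numbers for illustration:

    /* Sketch only: read a 64 KB window from the middle of a huge file
       instead of loading the whole thing into memory. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HANDLE file = CreateFileA("huge.log", GENERIC_READ, FILE_SHARE_READ,
                                  NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (file == INVALID_HANDLE_VALUE)
            return 1;

        LARGE_INTEGER offset;
        offset.QuadPart = 2LL * 1024 * 1024 * 1024;   /* jump 2 GB into the file */
        SetFilePointerEx(file, offset, NULL, FILE_BEGIN);

        static char window[64 * 1024];                /* roughly a screenful of text */
        DWORD bytesRead = 0;
        ReadFile(file, window, sizeof(window), &bytesRead, NULL);

        printf("Read %lu bytes starting at offset %lld\n",
               (unsigned long)bytesRead, (long long)offset.QuadPart);
        CloseHandle(file);
        return 0;
    }

A memory-mapped view of just that region would work too. The point is that nothing forces an editor to pull the whole file in before it shows you the first screen.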

If you know of a good text editor or text file viewer that can work with very large (multi-gigabyte) files that are larger than available memory, please let me know. And please don’t tell me “move to Linux.” That’s not currently an option, and if the GNU tools for Windows that I’ve downloaded are any indication, they suffer from the same problems.