You forget about caching. MongoDB successfully addresses hundreds of GB of data without having that much RAM available, by letting the OS page unneeded data out to disk.
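For what it's worth, here's a rough sketch of that idea in C++ (POSIX mmap, Linux/macOS). The file name, the HistEvent record, and the sizes are all made up for illustration; this isn't how MongoDB or DF actually lay out their data, it just shows the mechanism of letting the OS page a big file in and out of RAM:

```cpp
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

struct HistEvent {      // hypothetical fixed-size world-gen record
    uint32_t year;
    uint32_t actor_id;
    uint32_t kind;
    uint32_t payload;
};

int main() {
    const size_t kEvents = 64ull * 1024 * 1024;      // 64M records, ~1 GiB
    const size_t kBytes  = kEvents * sizeof(HistEvent);

    int fd = open("worldgen_events.bin", O_RDWR | O_CREAT, 0644);
    if (fd < 0 || ftruncate(fd, kBytes) != 0) { perror("open/ftruncate"); return 1; }

    // The mapping reserves address space, not physical RAM; pages are pulled
    // in on first touch and written back / evicted under memory pressure.
    void* mem = mmap(nullptr, kBytes, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    HistEvent* events = static_cast<HistEvent*>(mem);
    events[0]           = HistEvent{1, 42, 7, 0};    // touching a page faults it in
    events[kEvents - 1] = HistEvent{1050, 99, 3, 0}; // far-away page, faulted separately

    // Flush everything we touched to the file...
    msync(mem, kBytes, MS_SYNC);
    // ...then hint that the "old years" at the front won't be needed soon,
    // so the kernel can reclaim those pages first.
    madvise(mem, kBytes / 2, MADV_DONTNEED);

    munmap(mem, kBytes);
    close(fd);
    return 0;
}
```

The point being: the program gets to pretend it has far more memory than the machine does, and the OS decides which pieces actually live in RAM at any given moment.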
To be honest, this was something I was also thinking about. Throwing old years of world gen onto an HDD somewhere would slow the process down even more, since an HDD is so much slower than RAM, but it basically eliminates any ceiling on memory size, since hard disk space keeps growing.
@yetanotherstupiddorf: Yes actually, I do deny that systems with that much RAM will be more common in the NEAR future. How much RAM is in the typical new computer right now? 4GB. It took 18+ months for that to become the norm over 2GB, and even now it's just as easy to get a system with 2 as with 4 - in other words, 4GB is STILL not the standard. How much benefit do you think the average consumer gets out of those extra 2GB? That determines how likely a mainstream supplier is to provide more RAM. Only a custom build or a boutique vendor is likely to include more RAM, because those machines are targeted at the users who need it. It also so happens that boutique machines are so expensive precisely because fewer people demand that capability, i.e. not the majority.
Besides, you conveniently ignore my other point: what do you do for the people who DO have only 2 or 4GB (i.e. the majority)? Tell them "too bad for you, guess you just can't generate worlds anymore"? The solution, AGAIN, is not finding more RAM. The problem here is "world gen tries to take up well over 4GB of RAM when most people don't have that much". The solution should not come from the "most people don't have that much" clause. The problem lies in the "world gen tries to take up well over 4GB of RAM" clause, and that's where the solution has to originate. You don't fix the problem by telling people to get more RAM, because that's not the problem; you fix it by streamlining the worldgen system so that it doesn't attempt to consume that much RAM.
You cannot currently buy more than 16 GB at a sane price, so making the game capable of going beyond the 2GB limit is useless.
"capable" and "requiring" are two very different things, and to outline the difference I'll capitalize all the relevant verbs. Right now, DF REQUIRES well over 3GB of RAM to generate a long-history world. What if that REQUIREment was 32GB? What would you say now? The game would basically only run on a corporate server. Not so easy to overlook, is it? "capable" implies that the game CAN, but does not HAVE TO, make use of that volume of resources. DF CAN make use of over 2GB of RAM, because right now it HAS TO (again all in regards to the generation of long-lived worlds). Obviously we can answer "guess you'll have to generate a smaller world then" and to some extent that is the answer, but it's quite easy to expand a game beyond its means, especially when dealing with huge volumes of procedurally generated content like DF does, and telling people that they just have to keep shrinking their constraints to compensate might well squeeze us to 3-year old worlds. AGAIN, the solution is to streamline the way that worlds are generated, not to throw more RAM at the problem.
And as for multithreading, I never claimed to understand how Toady programmed temperature. I was providing an example that clarified my point - that in some systems, the advantages of multithreading will be slim, and DF could well be one of those systems. It's all dependent on the form of the code. As I said:
really, the effectiveness of multithreading for a program like DF will depend a lot on the architecture of the code and how forgiving it is to parallelization.
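To make "forgiving to parallelization" concrete, here's a toy sketch using plain C++ threads (nothing to do with Toady's actual temperature code; the grid and the update rules are invented for illustration). In the first update each new value only reads OLD neighbour values, so rows split cleanly across threads; in the second, each value depends on the one just computed, so the loop is a serial chain and extra cores buy you nothing:

```cpp
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Case 1: double-buffered update; each new cell reads only old neighbour
// values, so rows can be handed to threads with no coordination at all.
void diffuse_step(const std::vector<float>& cur, std::vector<float>& next,
                  std::size_t w, std::size_t h, unsigned nthreads) {
    auto worker = [&](std::size_t y0, std::size_t y1) {
        for (std::size_t y = y0; y < y1; ++y)
            for (std::size_t x = 0; x < w; ++x) {
                std::size_t i = y * w + x;
                float left  = (x > 0)     ? cur[i - 1] : cur[i];
                float right = (x + 1 < w) ? cur[i + 1] : cur[i];
                float up    = (y > 0)     ? cur[i - w] : cur[i];
                float down  = (y + 1 < h) ? cur[i + w] : cur[i];
                next[i] = 0.25f * (left + right + up + down);
            }
    };
    std::vector<std::thread> pool;
    std::size_t rows = (h + nthreads - 1) / nthreads;   // rows per thread
    for (unsigned t = 0; t < nthreads; ++t) {
        std::size_t y0 = t * rows, y1 = std::min(h, y0 + rows);
        if (y0 < y1) pool.emplace_back(worker, y0, y1);
    }
    for (auto& th : pool) th.join();
}

// Case 2: each new value depends on the value just computed for the previous
// cell, so the loop is a serial chain; extra cores don't help here at all.
void serial_chain(std::vector<float>& v) {
    for (std::size_t i = 1; i < v.size(); ++i)
        v[i] = 0.5f * (v[i] + v[i - 1]);   // reads the value written last iteration
}

int main() {
    const std::size_t w = 256, h = 256;
    std::vector<float> cur(w * h, 20.0f), next(w * h, 0.0f);
    unsigned nthreads = std::max(1u, std::thread::hardware_concurrency());
    diffuse_step(cur, next, w, h, nthreads);
    serial_chain(next);
    return 0;
}
```

If DF's inner loops look more like the second case than the first, threading them is a lot of work for very little gain, which is all I was saying.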
Honestly Telamon, I think the issue really boils down to the fact that the game cannot function within its own parameters right now. The solution we have at the moment is simply not to generate worlds of such large size and history. While I have no problem doing that, the fact remains that the game seems to be pushing against a barrier that it could theoretically overcome.
Also, games do often outgrow their customers, even games like WoW that have improved their graphics over many years. Despite the increase in graphical quality, people were still able to play the game on 10-year-old computers by lowering their expectations. In the same vein I have to draw a similar conclusion: just because the game might be programmed to use 64-bit and/or multithreading doesn't mean that people without those options can't play anymore. It just means they are now limited in world gen by their hardware, instead of by programming limitations.
And as for computer hardware in general: when I was shopping for a computer for my wife, it was nearly impossible to find one that didn't have at least 4 cores. Those with a single core were often more expensive because they were better processors. RAM is dirt cheap, even for tri-channel setups. Hell, I'm running 24 gigs of tri-channel right now that cost me about 120 bucks, and that's really the absolute high end and totally unnecessary for non-professionals. I think a dual-channel 4 gig kit is 25 bucks if you get the Crucial brand (perfectly fine, since I highly doubt anyone here needs to worry about data loss to the point of needing something like ECC RAM).
Beyond 64-bit, there's realistically no RAM cap that we could hope to hit in the next 10 years, at least not in DF. Not making that move, however, is going to mean more problem patches and content limits, for one reason or another. Either we settle for increasingly smaller worlds and histories, or we wade through more optimization patches that try to ferret out performance hogs in the code. And even those two solutions have an expiration date; eventually the content in the game is going to be too much for quick fixes.
I might not be great with programming, but I know logic, and the numbers say we're closing in on that end-date faster than people are willing to admit.