Bay 12 Games Forum

Pages: 1 ... 3 4 [5] 6 7 ... 9

Author Topic: Has anyone successfully generated a very long history?  (Read 58711 times)

dirty foot

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #60 on: February 19, 2012, 03:26:19 am »

Do we know how temperature is handled as it is? In my mind (I'm no programmer), the individual threads would simply handle entirely different aspects of the game. I never thought of multi-threading down to the tile for temperature; I was thinking of setting aside one core for temp, then another for movement and orders, another for combat, and so on, converging only when an action flags an interaction between two or more "zones". For example, magma is static in its temp for 200 years, so there's no need for a temp check. Now something falls in; it is then flagged for an interaction and data convergence takes place. If I remember correctly, it was this programming of item "stasis" that helped optimize water issues in the past.

Am I looking at multi-threading in the wrong way? I'm starting to think it doesn't work on such sweeping sections. If it can work this way, do our problems with converging and separating data still remain?
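[Editor's sketch] The "stasis" idea described above can be illustrated with a toy dirty-flag scheme. This is purely hypothetical (the `Tile`/`TemperatureGrid` names and the decay formula are made up, and this is not how DF is actually implemented): tiles cache their temperature and only do work when an interaction flags them.

```cpp
#include <cstddef>
#include <vector>

// Toy sketch of temperature "stasis": each tile caches its temperature and is
// only recomputed when some interaction flags it dirty. Static magma that
// nothing touches costs nothing per tick.
struct Tile {
    double temperature = 0.0;
    bool dirty = false;  // set when an interaction invalidates the cached value
};

struct TemperatureGrid {
    std::vector<Tile> tiles;
    explicit TemperatureGrid(std::size_t n) : tiles(n) {}

    // Something fell in: record the change and flag the tile for a recheck.
    void disturb(std::size_t i, double newTemp) {
        tiles[i].temperature = newTemp;
        tiles[i].dirty = true;
    }

    // One update pass; returns how many tiles actually did any work.
    std::size_t update() {
        std::size_t recomputed = 0;
        for (Tile& t : tiles) {
            if (!t.dirty) continue;   // stasis: untouched tiles are skipped
            t.temperature *= 0.99;    // stand-in for a real heat calculation
            t.dirty = false;
            ++recomputed;
        }
        return recomputed;
    }
};
```

With this shape, a 200-year-static magma sea contributes zero work per pass until something disturbs it.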
Logged

dirty foot

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #61 on: February 19, 2012, 03:29:55 am »

Fair enough, but that still ignores the ripple effect of releasing a new compile. Bugs, offset shifts, and plain old "sh!t happens" are bound to occur somewhere, so even if a 64-bit release comes out in a month, the player base will probably take 2-3 months or more to start using such a release. I'd rather the sh!t happened in the process of improving a play mechanic in the game. Besides, it still doesn't solve the other problem I mentioned: switching from one address size to another doesn't instantiate memory. If there was memory waiting to be used, it might unlock it, but the physical limitation of the computer is still there. The benefits of moving to a new address system will be slim when the memory limitation of the user still exists.
Quote
You forget about cache. MongoDB successfully uses hundreds of GB of RAM w/out having that much RAM available, letting OS put off unneeded data on the disk itself.
I doubt that game actually uses millions of events it already generated in further calculations, so it wouldn't hurt performance if OS would unload 'em on the disk.
Difference between 32bit and 64bit is in fact that 32bit version _crashes_ upon hitting 2GB limit, while 64bit _slows down but keeps working_ after hitting "all RAM machine has" limit.
Are we sure that past events aren't tagged and kept around for a reason?
Logged

YetAnotherStupidDorf

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #62 on: February 19, 2012, 07:27:31 am »

Quote
most people don't have that much ram.
If they do not have it now, they will. How much RAM ships in brand new computers? How much will it be in a year or two? Do you deny that systems with, say, 8 or 16 GB will be more common in the future?

Quote
Beyond that (16 GB) you're looking at server motherboards if you want to handle more;
Your logic of "you cannot currently buy more than 16 GB for sane price, so making game capable of taking beyond 2GB limit is useless" is bizarre.

Quote
(sorry to those of you who really are overflowing with RAM.... but honestly nothing these days can consume that much RAM anyway, so was there really a point in buying that much? =P)
Of course, 640 KB should be enough for everyone! Dwarf Fortress itself is a good example of something that can and will "consume that much RAM".

Quote
You're talking about rewriting an entire game,
I do not deny it could be problematic and cause strange bugs unique to the 64-bit version. But the claim of "rewriting an entire game" is pure BS.

About multithreading: there is no point in making up scary scenarios when we do not know how exactly Toady calculates temperature. It COULD be something as simple as calculating just one tile. And you did not touch at all the one area that would benefit most - pathfinding. Paths do not depend on each other, so you can, say, calculate half of the paths on one core and the other half on another. The main thread managing one game tick would wait for all pathfinding threads to finish and then go on to the other needed areas.
But I agree multithreading would be a very large project and it will probably never happen, at least not with this particular game. Maybe in chapter III - it is easier if a game is programmed with multithreading implemented from the beginning.
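[Editor's sketch] The fork-join pattern described above, where path requests are independent and the main tick waits for the workers, could look roughly like this. Everything here is hypothetical (the struct names are invented, and the "pathfinder" is a plain Manhattan-distance stub standing in for a real A* search):

```cpp
#include <cstddef>
#include <cstdlib>
#include <functional>
#include <thread>
#include <vector>

// Independent path requests split across two cores: a worker thread takes the
// second half, the main thread takes the first half, and the game tick waits
// for both before moving on.
struct PathRequest { int fromX, fromY, toX, toY; int cost = -1; };

int findPathCost(const PathRequest& r) {
    // Stand-in for a real search: Manhattan distance on an open grid.
    return std::abs(r.toX - r.fromX) + std::abs(r.toY - r.fromY);
}

void solveRange(std::vector<PathRequest>& reqs, std::size_t lo, std::size_t hi) {
    for (std::size_t i = lo; i < hi; ++i)
        reqs[i].cost = findPathCost(reqs[i]);
}

void solveAllInParallel(std::vector<PathRequest>& reqs) {
    std::size_t mid = reqs.size() / 2;
    std::thread worker(solveRange, std::ref(reqs), mid, reqs.size());
    solveRange(reqs, 0, mid);  // main thread handles the first half
    worker.join();             // the tick waits here before other areas run
}
```

Because no request writes to another request's slot, no locking is needed; the join is the only synchronization point, which is exactly why pathfinding is the area people point at for parallelization.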
Logged
Dwarf Fortress - where the primary reason to prevent the death of your citizens is that it makes them more annoying than they were in life.

telamon

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #63 on: February 19, 2012, 04:50:12 pm »

Quote
You forget about cache. MongoDB successfully uses hundreds of GB of RAM w/out having that much RAM available, letting OS put off unneeded data on the disk itself.
to be honest this was something I was also thinking about. Throwing old years of world gen onto the HDD somewhere would slow down the process even more, since the HDD is so much slower than RAM, but it basically eliminates any ceiling on memory size, since hard disk space is always growing.
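[Editor's sketch] The spill-to-disk idea being discussed could look like the toy class below. This is illustrative only - DF does not actually do this, and the `EventLog` name, file path, and retention count are all made up: only the newest events stay in RAM, and anything older is appended to a file.

```cpp
#include <cstddef>
#include <deque>
#include <fstream>
#include <string>

// Keep just the newest events in RAM; append anything older to a file,
// trading the RAM ceiling for (much slower) disk accesses.
class EventLog {
    std::deque<std::string> recent;  // in-memory tail of the history
    std::ofstream archive;           // older events spill to disk here
    std::size_t maxInMemory;
public:
    EventLog(const std::string& path, std::size_t keep)
        : archive(path), maxInMemory(keep) {}

    void record(const std::string& event) {
        recent.push_back(event);
        while (recent.size() > maxInMemory) {
            archive << recent.front() << '\n';  // spill the oldest event
            recent.pop_front();
        }
    }

    std::size_t inMemoryCount() const { return recent.size(); }
};
```

The memory footprint is then bounded by `maxInMemory` no matter how many years of history get generated, which is the trade-off being described: unbounded history, at disk speed.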

@yetanotherstupiddorf: Yes actually, I do deny that systems with that much RAM will be more common in the NEAR future. How much RAM is on the typical new computer right now? 4GB. It took 18+ months for that to become the norm over 2GB, and even now it's just as easy to obtain a system with 2 as it is with 4 - in other words 4GB is STILL not the standard. How much benefit do you think the average consumer is getting out of those extra 2GB? That determines how likely a mainstream supplier is to provide more RAM. Only a custom build or boutique vendor will be likely to include more RAM in their machines because they're targeted for the users who need it. It also so happens that the reason boutique machines are so expensive is because there are fewer people demanding that functionality, ie not the majority.
Besides, you conveniently ignore my other point: what do you do for the people who DO have only 2 or 4GB (ie the majority)? You tell them "too bad for you, guess you just can't generate worlds anymore"? The solution, AGAIN, is not in finding more RAM. The problem here is "world gen tries to take up well over 4GB of RAM when most people don't have that much". The solution should not be in the "most people don't have that much" clause. The problem is in the "world gen tries to take up well over 4GB of RAM" clause and that's where the solution has to originate. You don't fix the problem by telling people to get more RAM, because that's not the problem; you fix the problem by streamlining the worldgen system so that it doesn't attempt to consume that much RAM.

Quote
you cannot currently buy more than 16 GB for sane price, so making game capable of taking beyond 2GB limit is useless
"capable" and "requiring" are two very different things, and to outline the difference I'll capitalize all the relevant verbs. Right now, DF REQUIRES well over 3GB of RAM to generate a long-history world. What if that REQUIREment was 32GB? What would you say now? The game would basically only run on a corporate server. Not so easy to overlook, is it? "capable" implies that the game CAN, but does not HAVE TO, make use of that volume of resources. DF CAN make use of over 2GB of RAM, because right now it HAS TO (again all in regards to the generation of long-lived worlds). Obviously we can answer "guess you'll have to generate a smaller world then" and to some extent that is the answer, but it's quite easy to expand a game beyond its means, especially when dealing with huge volumes of procedurally generated content like DF does, and telling people that they just have to keep shrinking their constraints to compensate might well squeeze us to 3-year old worlds. AGAIN, the solution is to streamline the way that worlds are generated, not to throw more RAM at the problem.

And as for multithreading, I never claimed to understand how Toady programmed temperature. I was providing an example that clarified my point - that in some systems, the advantages of multithreading will be slim, and DF could well be one of those systems. It's all dependent on the form of the code. As I said:
Quote
really, the effectiveness of multithreading for a program like DF will depend a lot on the architecture of the code and how forgiving it is to parallelization.
« Last Edit: February 19, 2012, 05:03:48 pm by telamon »
Logged
Playing DF on Windows 98 since.... ?
At 55 frames per minute.

Feb

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #64 on: February 19, 2012, 05:55:45 pm »

Don't we already have an example of how long it took computers (by global sales volume) to go from the 512MB standard to the current 2GB standard?   :o  Here's a hint: "Gamer" machines and workstations did not and do not make up the bulk of the machines sold.  Even the current retail migration from 2GB to 4GB is slow going.

Yes, the game is planned to be done by 2030, so yes, we'd probably need a rewrite of the code, but that should not and would not be done in the near future, so the point is moot (check the devlogs for near-future plans; multithreading is about as important as 3D graphics for DF atm, iirc).

Optimization is indeed the way forward for the time being, as it's the least labour-intensive option that won't heavily interfere with much-needed feature development (army!).  Since we have such a big community here, I'm pretty sure someone with an awesome comp would spin a few large worlds for us!  Actually, I should go and ask if someone would spin one for me :3
Logged

dirty foot

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #65 on: February 19, 2012, 05:58:23 pm »

Quote from: telamon
really, the effectiveness of multithreading for a program like DF will depend a lot on the architecture of the code and how forgiving it is to parallelization.
Honestly, Telamon, I think the issue really boils down to the fact that the game cannot function within its own parameters right now. The solution we have at the moment is to just not generate worlds of such a large size and history. While I have no problem doing that, the fact remains that the game seems to be pushing against a barrier that it theoretically could overcome.

Also, games do often outgrow their customers, even games like WoW that have improved their graphics over many years. Despite the increase in graphics quality, people were still able to play the game on 10-year-old computers by sacrificing their expectations. In this same vein I have to draw a similar conclusion: just because the game would potentially be programmed to use 64-bit and/or multithreading doesn't mean that people without those options can't play anymore. It just means they are now limited in world gen by their computer's quality, instead of by programming limitations.

And considering computer hardware in general, when I was shopping for a computer for my wife it was nearly impossible to find a computer that didn't at the very least have 4 cores. Those with a single core were often more expensive because they were better processors. RAM is dirt cheap, even for tri-channel setups. Hell, I'm running a 24 gig tri-channel right now that cost me about 120 bucks, and that's really the absolute high end and totally unnecessary for non-professionals. I think a dual-channel 4 gig set is 25 bucks if you get the Crucial brand (perfectly fine because I highly doubt anyone here needs to worry about data loss to the point that they need something like ECC RAM).

Beyond 64-bit, there's realistically no RAM cap we could hope to reach in the next 10 years, at least not in DF. Not making that move, however, is going to mean more problem patches and content limits for one reason or another. Either we're going to have to settle for increasingly smaller worlds and histories, or we're going to have to wade through more optimization patches that attempt to ferret out performance hogs in the programming. And even those two solutions have an expiration date; eventually the content in the game is going to be too much for quick-fix solutions.

I might not be great with programming, but I know logic, and the numbers say we're closing in on that end-date faster than people are willing to admit.
Logged

Feb

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #66 on: February 19, 2012, 06:10:54 pm »

Where are you located, dirty foot?  I'm asking out of curiosity, since most inventories are still dual- and tri-core atm afaik, so when you say quad-core, I'm a little jealous :P  (I'm still on some old duo-cores)
Logged

YetAnotherStupidDorf

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #67 on: February 19, 2012, 06:26:58 pm »

Quote
@yetanotherstupiddorf: Yes actually, I do deny that systems with that much RAM will be more common in the NEAR future.
Define near future. 18 months? In that case I agree - 16 GB becoming common (meaning a reasonable price) will take 2-3 years, IMO.

Quote
Besides, you conveniently ignore my other point: what do you do for the people who DO have only 2 or 4GB (ie the majority)? You tell them "too bad for you, guess you just can't generate worlds anymore"?
And again with the making things up. It is starting to annoy me. The current solution is "a system with less RAM can generate smaller worlds".

Quote
What if that REQUIREment was 32GB?
Do you really think that making up spurious scenarios, inventing facts and constructing your own reality helps your case? Sorry, I cannot treat your arguments seriously.

Quote
AGAIN, the solution is to streamline the way that worlds are generated, not to throw more RAM at the problem.
There will come a time when "throwing more RAM" will be the only answer and solution. You can compress/optimize data only so much. I think Toady will be forced to seriously consider 64-bit soon (in the next two or three years). Incidentally, that will be the time when computers with 16 GB will be increasingly popular and common. Also, 32-bit will be slowly phased out (it has already begun). And before you start inventing truth again: no, Toady will for years put out both 32 and 64-bit versions for download.
Logged
Dwarf Fortress - where the primary reason to prevent the death of your citizens is that it makes them more annoying than they were in life.

dirty foot

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #68 on: February 19, 2012, 11:05:01 pm »

Quote
Where are you located, dirty foot?  I'm asking out of curiosity, since most inventories are still dual- and tri-core atm afaik, so when you say quad-core, I'm a little jealous :P  (I'm still on some old duo-cores)
I'm down in Florida, but you can honestly pick up the machine I have at Best Buy; it's just pricey.

I actually run an 8-core, but if there's a game out there that uses that many, I haven't seen it yet. I'm lucky if I see a third core get used. Honestly, I got snowed on my computer purchase; every game company on the planet had a huge hard-on for threaded processors, then for some reason dropped ALL interest within months of my purchase. For nearly every game on the planet, it's all about the graphics card, and I seriously flounder in that area with only a GeForce 560 Ti, and I'm only running one card where I should be running two.

Ironically, my wife's computer is technically better than mine, because while she only has 4 cores, they're each a GHz more powerful. Her computer cost 600 bucks less than mine too. I do like my i7 though; I just wish I'd customized differently.
« Last Edit: February 19, 2012, 11:06:54 pm by dirty foot »
Logged

KillzEmAllGod

  • Bay Watcher
  • Searching for the other sock.
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #69 on: February 19, 2012, 11:40:06 pm »

Quote
I do like my i7 though

I love my AMD, though everyone's always bringing out something better every few months.
The game runs fine at the moment, though, besides world gen going at a snail's pace... multithreading is always nice, but we can't always get what we want, sadly. The game would be pretty different with multithreading, thinking about it...
Logged

Feb

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #70 on: February 20, 2012, 01:14:04 am »

Yeah, pricey = lower sales volume, makes sense then :P

Welp, I think I solved it: if you want a long history, you just need a LOT fewer historical figures and a lot fewer events.  I managed to hit 900-odd years with minimal civs (6), 10 of each titan/megabeast/semi with the pop cap at 1000, and other parameters that help with event/hist. fig. reduction.  The total was around 2.7 million events before it became unresponsive.
Logged

dirty foot

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #71 on: February 20, 2012, 01:29:55 am »

Quote
Yeah, pricey = lower sales volume, makes sense then :P

Welp, I think I solved it: if you want a long history, you just need a LOT fewer historical figures and a lot fewer events.  I managed to hit 900-odd years with minimal civs (6), 10 of each titan/megabeast/semi with the pop cap at 1000, and other parameters that help with event/hist. fig. reduction.  The total was around 2.7 million events before it became unresponsive.
Were you using the "large address aware" exe? If not, you may just get to 1050, depending on your computer.
Logged

knutor

  • Bay Watcher
  • ..to hear the lamentation of the elves!
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #72 on: February 20, 2012, 04:03:40 am »

The number of processes might also be the cause.  I'm not implying anything about what anyone here said.  It's all good, but I see there is a priority toggle in DF's init.txt for those who insist on blazing game speeds.  Yet I haven't read here of anybody who mentioned trying Real-time.  Maybe putting Dwarf Fortress first in that list of processes at Real-time priority might help.  At least to get a world created, and then crank it back to Normal to play.

Some PC operators run everything they have installed, all at once.  I worked with a guy that had twelve system monitoring tools running, all monitoring his system.  Made no sense to me.  He was running a half dozen firewalls too.  That's a lot of wasted cycles and processes.  I'm sorry - half a dozen instant messengers, not firewalls.  Yeah, he flipped out when they told him he wasn't allowed to leech free movies off the company's linkup.  That was back in the late 90s, though.  *shrug*

I dunno, it's an option; I'd never do it, however.  Here's what I do: I use a RAID array.  They don't endorse this kind of configuration anymore - it has no failsafe.  Instead they just want us to buy more and more RAM, instead of faster controllers and faster storage.  I kinda prefer less RAM.  It takes too long to fetch data from these busy, overloaded 64-bit banks that in most cases are too busy running ripoff code that replaces physical parts and circuitry.
« Last Edit: February 20, 2012, 04:18:26 am by knutor »
Logged
"I don't often drink Mead, but when I do... I prefer Dee Eef's.  -The most interesting Dwarf in the World.  Stay thirsty, my friend.
Shark Dentistry, looking in the Raws.

Feb

  • Bay Watcher
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #73 on: February 20, 2012, 08:02:41 am »

Nope (on the large-exe thing); I was just using Phoebus' v34.02v02 and advanced parameters.  I have a hunch that EACH EVENT gets processed each year of gen along with hist. figs., hence the problem with processing.  I have indeed done it in real-time (via the manager, mind you, not init); it helps somewhat, but there was really no difference (since I mostly just let the comp sit and gen while I do other stuff).  My comp is several years old, so it's on the subpar end by today's standards (the only exception is the GFX, which doesn't help with DF :P)

« Last Edit: February 20, 2012, 12:10:07 pm by Feb »
Logged

Biopass

  • Bay Watcher
  • Human
    • View Profile
Re: Has anyone successfully generated a very long history?
« Reply #74 on: February 20, 2012, 10:42:11 am »

Just genned a medium world to 1050 years in 20 minutes. Currently genning a large world, but I'm a full hour in and it's still at year 242 out of the desired 550.

Quote
Were you using the "large address aware" exe? If not, you may just get to 1050, depending on your computer.

There's a large address aware exe? Where is it, and who do I have to kill to get it?
Logged
500 fps vanilla 4x4 embark. intel master race.