Bay 12 Games Forum


Author Topic: Use GPU to speed up dwarf fort  (Read 2879 times)

OldMiner

  • Bay Watcher
    • View Profile
Use GPU to speed up dwarf fort
« on: December 18, 2006, 06:22:00 pm »

Maybe this would involve too much effort to work in, but if Dwarf Fort continues to increase in processing requirements, it might be worthwhile to consider taking advantage of a programmable GPU, if available.  I know I've seen BrookGPU, which attempts to simplify the process of taking an algorithm and getting it onto a video card without too much effort.  Since video cards work awesomely well on vector and array operations, and that seems to be exactly what Dwarf Fort is spending most of its time on, it might be a cool thing to do.
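To make the "vector and array operations" point concrete, here's a minimal sketch (in plain C++, with illustrative names and an illustrative damping constant — none of this is actual DF code) of the kind of per-cell grid update that maps well to a GPU stream program:

```cpp
#include <cstddef>
#include <vector>

// Illustrative only: a per-cell update where every output element depends
// solely on the *previous* state of its neighbours. Each iteration of the
// inner loop is independent of the others, which is exactly the "stream
// kernel" shape that BrookGPU (and GPUs generally) run well in parallel.
std::vector<float> diffuse(const std::vector<float>& temp,
                           std::size_t w, std::size_t h, float k) {
    std::vector<float> out(temp.size());
    for (std::size_t y = 0; y < h; ++y) {
        for (std::size_t x = 0; x < w; ++x) {
            std::size_t i = y * w + x;
            // Clamp at the borders so the sketch stays self-contained.
            float left  = temp[x > 0     ? i - 1 : i];
            float right = temp[x + 1 < w ? i + 1 : i];
            float up    = temp[y > 0     ? i - w : i];
            float down  = temp[y + 1 < h ? i + w : i];
            out[i] = temp[i] + k * (left + right + up + down - 4.0f * temp[i]);
        }
    }
    return out;
}
```

On a GPU the loop body would become the kernel and the loops would disappear; the hardware runs one instance per cell.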

Granted, it might be too involved to be worth Toady's time, especially when a lot of people might not be able to take advantage.

Also, from the papers I've read that moved an existing algorithm from CPU to GPU, the issue of not being able to reuse a result in the same pass in which it's calculated can creep up, so the temperature and flow updates may need to be rearranged into several discrete passes to run efficiently.  Then again, taking any advantage of the idle cycles of the GPU might be enough to significantly improve performance.
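The standard workaround for "can't reuse a result in the same pass" is ping-pong buffering: each pass reads from one array and writes to the other, then the two are swapped. A rough C++ sketch (the half-life update inside the loop is a placeholder, not a real simulation step):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative sketch of the "several discrete passes" idea: a GPU pass
// can't read a value written during that same pass, so we read from one
// buffer, write to the other, and swap before the next pass (ping-pong).
void run_passes(std::vector<float>& state,
                std::vector<float>& scratch, int passes) {
    for (int p = 0; p < passes; ++p) {
        for (std::size_t i = 0; i < state.size(); ++i) {
            // Placeholder update; a real pass would be a neighbour stencil.
            scratch[i] = state[i] * 0.5f;
        }
        std::swap(state, scratch);  // next pass reads what this pass wrote
    }
}
```

On a GPU, each pass would be one kernel launch (or one render-to-texture pass in 2006-era GPGPU), with the swap done by exchanging which texture is bound as input and which as output.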

Just an idle thought.  Cheers.

Logged

Eagleon

  • Bay Watcher
    • View Profile
    • Soundcloud
Re: Use GPU to speed up dwarf fort
« Reply #1 on: December 18, 2006, 07:45:00 pm »

I don't know much about what might be done to improve DF towards this end, but I think there are more people playing it who could take advantage of this than you might think. It really needs a somewhat new computer to run anyway - I could barely play it on my old 400 Celeron - and graphics cards can be updated more easily than the CPU, which, with the socket shuffle going on now, usually requires a new mobo to go with it.
Agora: open-source, next-gen online discussions with formal outcomes!
Music, Ballpoint
Support 100% Emigration, Everyone Walking Around Confused Forever 2044

Toady One

  • The Great
    • View Profile
    • http://www.bay12games.com
Re: Use GPU to speed up dwarf fort
« Reply #2 on: December 18, 2006, 08:15:00 pm »

I'm just not sure where to begin with this sort of thing, or with things like multi-threading, given that I have a giant mass of code that currently runs in one sequence.  It seems like it would take a lot of effort to figure this out and the best ways to use it, when I still have plenty of regular optimizations to take advantage of.
The Toad, a Natural Resource:  Preserve yours today!