Bay 12 Games Forum


Author Topic: Anouncing The PathFinder Project  (Read 98115 times)

Footkerchief

  • Bay Watcher
  • The Juffo-Wup is strong in this place.
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #150 on: October 20, 2009, 12:43:28 am »

There are good reasons to be instinctively averse to generalized systems, but I really don't think the concern is warranted here.  The problem being addressed is inherently abstract -- DF's needs aren't exactly unique.  Besides, it's not as if anyone working on the project ever has to choose between a) make it work for DF and b) make it work for any client program.  Any sane design approach will have both some very general low-level pathfinding code and some DF-specific code, with clear divisions between them.

Also, my two cents on using user-defined rooms as graph regions: experiment with it later.  It doesn't matter yet.  I personally don't expect it to pan out, but experimentation is half the purpose of this project, right?
« Last Edit: October 20, 2009, 12:48:19 am by Footkerchief »
Logged

JanusTwoface

  • Bay Watcher
  • murbleblarg
    • View Profile
    • jverkamp.com
Re: Anouncing The PathFinder Project
« Reply #151 on: October 20, 2009, 12:46:09 am »

Any sane design approach will have both some very general low-level pathfinding code and some DF-specific code, with clear divisions between them.
Wait... We play DF. I don't think any one of us is completely sane by any definition.
Logged
You may think I'm crazy / And I think you may be right
But life is ever so much more fun / If you are the crazy one

My blog: Photography, Programming, Writing
Novels: A Sea of Stars, Confession

Silverionmox

  • Bay Watcher
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #152 on: October 20, 2009, 05:05:42 am »

Well, then design the library for other games... but then I wonder what it's doing here? Trying to serve many masters will end up satisfying none. Both approaches are complementary rather than mutually exclusive anyway.
Just use any present designations as a starting point. Let the algorithm determine the rest of the terrain sections automatically... which would have to happen anyway without using the designations. The result will be a little different from full automation, but the sections will resemble the player's vision more closely.

It was written clear as day on the very first post...

Quote
Create a flexible BSD licensed C++ path-finding library specialized for grid based systems like that of Dwarf Fortress and able to efficiently handle rapidly changing map geometry and other challenges such as but not limited to multiple travel domains, multi-tile creatures, variable movement cost terrain and path caching.
That is our goal.  Nothing more, nothing less.  Considering that using designations both backs us into a corner and creates more headaches than performance gains, I don't see the value in it.
And I'm voicing my concerns with that approach in the very same thread, where else? Trying to be useful to an unknown number of hypothetical future games might prove to be just as constricting.

As Footkerchief says, there will be room for both general and DF-specific code. There's no reason to exclude DF-specific elements from the start. Even better, those elements can grow into concepts that are universally useful.

Solving general goals from a general viewpoint has been attempted by a lot of people already. If you want to do useful work in that area, the first months will encompass looking for papers, projects and studying what your predecessors did and what their problems were. Otherwise you're just going to replicate that in a laborious way, and that's not useful.  If after that enough people are still able/interested to make more improvements, great! I just suspect that the greatest payoff in FPS improvement will come from DF-specific code.
Logged
Dwarf Fortress cured my savescumming.

Sowelu

  • Bay Watcher
  • I am offishially a penguin.
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #153 on: October 20, 2009, 05:11:29 am »

Solving general goals from a general viewpoint has been attempted by a lot of people already. If you want to do useful work in that area, the first months will encompass looking for papers, projects and studying what your predecessors did and what their problems were.

You are my most favorite person EVER.  *pastes a gold star onto*

Seriously, people seem to get all attracted to the romantic sides of programming (making a grand useful library), without realizing that solving hard problems is 99% good algorithms and 1% translation into code.  Computer science is not coding--it's figuring out the best way to solve problems.

However, I -do- like the idea of a project to create a pathfinding library like this.  It's hideously insane to try and do room-based pathfinding for arbitrary rooms...BUT, if you dump months of effort on the problem, you might actually come up with a good solution (and you might end up with a paper of your own to publish!).

Just, people, please realize.  Doing this well could be a thesis-worthy project.  And an alarming number of thesis papers are on subjects that don't turn out to be useful or productive at all.  Go after it from an academic standpoint, NOT a coder standpoint... I would love to see what turns up!
Logged
Some things were made for one thing, for me / that one thing is the sea~
His servers are going to be powered by goat blood and moonlight.
Oh, a biomass/24 hour solar facility. How green!

Impaler[WrG]

  • Bay Watcher
  • Khazad Project Leader
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #154 on: October 20, 2009, 06:26:01 am »

It seems we're going through a "question the premise of everything" phase: first it was the programming language, now the focus of the project in general.  I'm glad the mission statement got tossed around in any event, so people at least know what they're rebelling against  ;D

A library is really no more difficult to program than anything else, and we really have no choice: we don't have access to Dwarf Fortress's code, and the best way to make the library usable for DF is to make it universally usable with a well-designed API.  We've already got a slew of planned features geared towards DF's needs, such as multi-tile creatures and path caching.  If we find ways to add more DF-friendly features, that's a plus and we would definitely want to look into them.  But people must remember that a lot of "pathing improvements" are really about WHAT things to request paths or estimates to.  The long-awaited 'Burrows' concept is like this: it's going to restrict dwarves to working within a defined area, and it's entirely 'client side' from the perspective of the pathing engine, since it nips expensive path requests in the bud.  Keep this concept in mind and ask yourself whether a speculative feature is really client or server before saying we need it.

Also, this whole debate about using room data from DF is entirely moot until someone actually finds a way to extract it from DF; in any event we have to make the system work without it first, as has been pointed out several times already.

As for research and what kind of initial implementation we're going to make: we know it's going to be a Hierarchical A* implementation as described in the earlier linked examples.  I'm all for lifting as much design & code as we (legally) can from that example as well.  In fact, I'd like to see the code team discuss more of the high-level details of the implementation plan and move the debate onto the details of that.  On a related note, I found another nice paper with a nice method of zoning: it uses a two-tier system, using a grid to divide the map into square sectors; the sectors are then subdivided into one or more contiguously connected areas called regions, which are tested for connectivity to each other and become the unit for A* pathing as in the previous examples.  The sectors make it easy to translate requested grid coordinates into the start zone for the path.  Here's the link; it's a PDF slide show.

http://www.cs.ualberta.ca/~nathanst/talks/mmabstraction_talk.pdf

Edit:  Found the white paper for the above slides; much more in-depth, of course
http://www.cs.ualberta.ca/~nathanst/papers/mmabstraction.pdf
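
To make that two-tier idea concrete, here's a rough sketch of the sector/region labelling in code.  This is purely illustrative (a 2D slice for brevity, made-up names, an arbitrary sector size), not code from the paper or from any repository; the real thing would be 3D and would also record which regions in neighbouring sectors connect to which.

Code: [Select]
// Purely illustrative sketch of the two-tier sector/region idea from the
// slides.  regionId must be the same size as the map and initialised to -1
// before the first call.
#include <cstdint>
#include <queue>
#include <utility>
#include <vector>

constexpr int SECTOR = 16;   // arbitrary sector edge length for the example

struct Map {
    int w, h;
    std::vector<uint8_t> passable;                       // row-major, 1 = walkable
    bool at(int x, int y) const { return passable[y * w + x] != 0; }
};

// Label every passable tile inside one sector with a region id via flood fill.
// Tiles sharing an id are mutually reachable without leaving the sector; the
// ids become the nodes of the abstract graph that high-level A* runs on.
int labelSectorRegions(const Map& m, int sx, int sy, std::vector<int>& regionId)
{
    const int x0 = sx * SECTOR, y0 = sy * SECTOR;
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    int nextId = 0;
    for (int y = y0; y < y0 + SECTOR && y < m.h; ++y)
        for (int x = x0; x < x0 + SECTOR && x < m.w; ++x) {
            if (!m.at(x, y) || regionId[y * m.w + x] >= 0) continue;
            const int id = nextId++;                     // new region found
            std::queue<std::pair<int, int>> open;
            open.push({x, y});
            regionId[y * m.w + x] = id;
            while (!open.empty()) {
                auto [cx, cy] = open.front();
                open.pop();
                for (int d = 0; d < 4; ++d) {
                    const int nx = cx + dx[d], ny = cy + dy[d];
                    if (nx < x0 || ny < y0 || nx >= x0 + SECTOR || ny >= y0 + SECTOR) continue;
                    if (nx >= m.w || ny >= m.h) continue;
                    if (!m.at(nx, ny) || regionId[ny * m.w + nx] >= 0) continue;
                    regionId[ny * m.w + nx] = id;
                    open.push({nx, ny});
                }
            }
        }
    return nextId;                                       // regions in this sector
}

Translating a requested grid coordinate into its start region is then just (x / SECTOR, y / SECTOR) plus a table lookup, which I take to be the point of the square sectors.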
« Last Edit: October 20, 2009, 07:37:24 am by Impaler[WrG] »
Logged
Khazad the Isometric Fortress Engine
Extract forts from DF, load and save them to file and view them in full 3D

Khazad Home Thread
Khazad v0.0.5 Download

Silverionmox

  • Bay Watcher
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #155 on: October 20, 2009, 07:23:41 am »

Those things have to be clear from the start. So there'll be three areas of attention:
1. Pure pathfinding, GI
2. DF-specific pathfinding optimizations, very much dependent on what Toady is comfortable with revealing from his code
3. Non-pathfinding tweaks that reduce pathfinding demands.

1 will be the main focus of the coding, 2 is not even an option yet if ever, and 3 are just suggestions as normal.

Any attempts or success in recruiting elsewhere for #1? Two coders will not suffice for the long term, if only for afk issues.
Logged
Dwarf Fortress cured my savescumming.

Starver

  • Bay Watcher
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #156 on: October 20, 2009, 09:21:35 am »

http://www.cs.ualberta.ca/~nathanst/talks/mmabstraction_talk.pdf
Interesting.  Read through that.  Have also glanced at the white paper link, but not been able to absorb it all in the few minutes I can spend right now.

Is there an answer in the latter (or the former, if I missed it) about how one ensures that "all regions connect to all other regions"?  Complex defensive structures could quite easily not just bar pathing from local location 1 to local location 2 (including by exiting the region and then heading back in) but shut half the map off from the other half.  Not usually an issue in a game that autogenerates a landscape with isolated pockets of disconnection, but here we'll be dealing with players who, if they aren't shutting whole regions off from each other (vertically as well as horizontally) merely through defensive play, are full-on MegaProjecting themselves a Great Wall of MountainHomes dividing the map in two, or flooding the surrounds with (to most creatures) impassable lava lakes, creating wide expanses of impassability around little pockets of free but isolated movement.

The visual given seems to suggest a regular region layout, but I could see the need to shuffle those regional borders around in a DF map if you must have somewhere-in-every-region routes to somewhere-in-every-other-region: extend 'tongues' of region if need be, and in fact do much the opposite of the aim of having everywhere within a regional jurisdiction trivially connected, handling the (im)passability between regions on a level further up the chain.  (Yes, the sub-regional, possibly logically isolated elements then take on that aspect, but with transient connectivities I think it makes more sense to have connected (however transient) elements become themselves a region of a higher order, and have those regions' connections to similarly ordered regions be handled by the next order up.)


BTW, not that my own ideas are going in this direction, but could you see a major problem with taking a roughly calculated route, far from optimal (e.g. it could easily be /\/\/\/\/\), subdividing it into segments and then assessing the optimal pathing between the centres of each segment as mini-challenges (all of them resource-light, even combined), rinsing and repeating for a loop or three until the inefficiencies of the route are smoothed out (becoming __________, in this stupidly simple example)?  On one cycle over a 100-step line, 49 quite trivial "two steps at a time" tests would highlight kinks in an unnecessarily bendy route (but one necessarily and unavoidably bendy around obstacles), and then paying attention to the areas of the route neighbouring that 'betterment' might help.  (Of course, the ultimate counter-example is where, from the start point, one has a choice of stepping left through one obstacle course or right through another, the exit of each being straight onto the finish point.  The 'shrinking' algorithm would never have a chance to 'snap' the route from one side to the other to see if that were more optimal.  Or even to see that it's not an obstacle course at all.  Which is a bit of a downer.  Good job that's not my chosen approach, eh? :))
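
(For concreteness, a rough sketch of that segment-and-re-path idea below.  It's only my illustration: it assumes a path is just a vector of points, the 'localSearch' helper is hypothetical, and it shares exactly the weakness described above, in that it can only smooth the corridor it was given, never snap to a different one.)

Code: [Select]
// Illustrative sketch only of the segment-and-re-path idea; "localSearch" is a
// hypothetical cheap, bounded search between two nearby points (returning a
// path that includes both endpoints), not an existing function.
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

struct Point { int x, y, z; };

std::vector<Point> localSearch(const Point& from, const Point& to);  // assumed helper

// Repeatedly re-path short hops between waypoints 'span' steps apart,
// splicing the cheaper local results back into the route.
std::vector<Point> refine(std::vector<Point> path, std::size_t span, int rounds)
{
    if (path.size() < 3 || span < 2) return path;
    for (int r = 0; r < rounds; ++r) {
        std::vector<Point> out;
        std::size_t i = 0;
        while (i + 1 < path.size()) {
            const std::size_t j = std::min(i + span, path.size() - 1);
            std::vector<Point> hop = localSearch(path[i], path[j]);
            if (hop.empty())                              // local search failed: keep the old stretch
                hop.assign(path.begin() + i, path.begin() + j + 1);
            out.insert(out.end(), hop.begin(), hop.end() - 1);  // the next hop re-adds its start point
            i = j;
        }
        out.push_back(path.back());
        path = std::move(out);
    }
    return path;
}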
Logged

Urist McDepravity

  • Bay Watcher
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #157 on: October 20, 2009, 11:57:15 am »

I would agree that the best starting point is a testing framework rather than the actual pathfinding.
It should be some kind of test server with client libraries, or vice versa, but with the ability to run all needed test cases against every algorithm and precisely measure time/memory usage.
It should feed the initial map to the solver, trace the results of preprocessing, then feed tests and map-altering commands.
We have quite a lot of map-alterers, including some very rapid ones (cave-ins, water floods), so cache-based algorithms should adapt to changes easily.
We have some 'flapping' alterers, like doors, which should be a special case as well, I believe (caching algorithms may decide it's worth caching graphs for both states if, for example, a particular door changes its state often. And it could be, if it's linked to a pressure plate and a water-based event generator, or just a lever which is pulled often).
Having a single test framework would let different teams create different algorithms while maintaining compatibility.
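
A sketch of what such a harness might look like; all the names here are invented for illustration, nothing is from DF or from any existing project code:

Code: [Select]
// Each candidate algorithm sits behind one interface; the harness feeds it a
// map, a stream of map-altering commands, and a stream of path queries, and
// times the lot.
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

struct Point { int x, y, z; };
struct MapEdit { Point at; bool nowPassable; };          // cave-in, flood, door toggle...

class PathSolver {
public:
    virtual ~PathSolver() = default;
    virtual void loadMap(const std::vector<uint8_t>& tiles, int w, int h, int d) = 0;
    virtual void applyEdit(const MapEdit& e) = 0;                   // incremental map change
    virtual std::vector<Point> findPath(Point from, Point to) = 0;  // empty = no path found
};

// Run one scripted scenario against one solver and report wall-clock seconds.
// Edits are interleaved with queries so caching solvers have to cope with change.
double runScenario(PathSolver& solver,
                   const std::vector<MapEdit>& edits,
                   const std::vector<std::pair<Point, Point>>& queries)
{
    const auto t0 = std::chrono::steady_clock::now();
    std::size_t nextEdit = 0;
    for (std::size_t q = 0; q < queries.size(); ++q) {
        if (nextEdit < edits.size() && q % 10 == 0)
            solver.applyEdit(edits[nextEdit++]);
        solver.findPath(queries[q].first, queries[q].second);
    }
    const auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}

Memory usage would need platform-specific hooks, but even wall-clock numbers over a shared set of maps and edit scripts would let the different algorithms be compared on equal terms.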
Logged

Combatjuan

  • Bay Watcher
    • View Profile
    • http://www.isoftdata.com
Re: Anouncing The PathFinder Project
« Reply #158 on: October 20, 2009, 12:46:50 pm »

I'm interested in helping with this project especially since I am in the process of making a game that would make good use of it.  And I think that lots of the needs of DF intersect with my needs (dynamic, 3d, tile-based world, multiple movement types, possibility of multi-tile creatures, etc...).

I do not, unfortunately, have much time available to me in the next two weeks so I certainly can't do much in the way of maintaining and setting up a test framework for all this (at least until that time).  Someone posted a suggested API a few pages back that seemed quite reasonable to me.

I am convinced that the best approach to this is to construct a test interface and an API and start coding and let the competing ideas compete on real tests.  Let's see which regioning turns out to be most efficient.  Let's see if those who want a more general solution can write a fast such solution in a reasonable amount of time and if those who want a specific solution can do so while still making it sufficiently extensible/adaptable.  I know of no other way to answer such questions.

And then I'll go and be a hypocrite and add that I don't have time to get a github going or an API spec or anything.  And I'll also offer this consideration which I don't think has yet been mentioned:
 * I believe that the map is broken up into 48x48xMAX_HEIGHT tile sectors and that these sectors are dynamically loaded (in adventure mode) as one moves over the map.  It might make sense to confine our higher-level regions to being within a sector, since these may need to be culled from the tree much like doors/drawbridges (the above-posted Dragon Age PDF article actually took this approach anyway).  And relatedly, it also sounds like Toady is making some changes to how the sky is handled, as well as various underground layers, though I doubt they will affect things significantly.
Logged

Sunken

  • Bay Watcher
  • Wabewalker
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #159 on: October 20, 2009, 02:53:21 pm »

BTW, not that my own ideas are going in this direction, but could you see a major problem with taking a roughly calculated route, far from optimal (e.g. it could easily be /\/\/\/\/\), subdividing it into segments and then assessing the optimal pathing between the centres of each segment as mini-challenges (all of them resource-light, even combined), rinsing and repeating for a loop or three until the inefficiencies of the route are smoothed out (becoming __________, in this stupidly simple example)?  On one cycle over a 100-step line, 49 quite trivial "two steps at a time" tests would highlight kinks in an unnecessarily bendy route (but one necessarily and unavoidably bendy around obstacles), and then paying attention to the areas of the route neighbouring that 'betterment' might help.  (Of course, the ultimate counter-example is where, from the start point, one has a choice of stepping left through one obstacle course or right through another, the exit of each being straight onto the finish point.  The 'shrinking' algorithm would never have a chance to 'snap' the route from one side to the other to see if that were more optimal.  Or even to see that it's not an obstacle course at all.  Which is a bit of a downer.  Good job that's not my chosen approach, eh? :))
I think you put your finger on it. In formal terms: a simulated annealing approach, sensitive to local minima.
Logged
Alpha version? More like elf aversion!

numerobis

  • Bay Watcher
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #160 on: October 20, 2009, 04:23:10 pm »

Lots of people saying "wouldn't it be nice if there were a testing framework."

Not lots of people discussing why what I posted isn't going to cut it.

Anyway, I updated it, new URL.  I know nothing about good places online to put code repos.  I'm using mercurial for this.

README:

read.h/.cpp : read in a grid from a file
point.h : x/y/z
grid.h : store the grid.  No information except whether squares are passable.
graph.h : Two kinds of graphs: adjacency lists, and a thin wrapper over a grid.
heuristics.h : an example of a heuristic.  A particularly simple example.
astar.h : an example of a search algorithm.  This one is a particularly bad implementation of A*.  Works on any graph, and any heuristic that knows how to handle vertices of that graph.
main.cpp : put it all together.  Do 500 random searches of A* on the L1 heuristic and grid graphs.  Report "performance" -- the number of times we touched a grid node.

Requires boost.  The makefile is set up for the fink package manager on OSX: it passes -I /sw/include.  To set it up for macports, use /opt/local/include.  For linux, leaving it as-is won't cause problems.  For cygwin, I have no idea.

The interface a graph presents is just: (a) given a node, return an iterator to the neighbours of the node.  (b) given a node, return the weight of the node (the cost of entering it, from anywhere).  We probably want (c) given a pair of nodes, return the weight of that edge, if any edge exists.  I didn't implement that, it's easy to add.

The interface a grid presents is just: given a point (x/y/z), return whether that point is in bounds, and passable.  We probably also want: given two points, return whether you can go from one to the other (otherwise, we can't represent floors and stairs properly), and who can enter the square, and so on.  Plenty to add here; upon changing that, we'd need to change the GridGraph implementation also to use the edge-based passability definition.

The file format is: x/y/z on the first line, saying how big the map is.  Then the maps, one per level.  Lines starting with ";;" are comment lines (since # often means wall).  With a grid that supports floors and stairs (rather than just open space or wall), it will be easy to add parsing for < X > versus . versus _.  No support in the grid or the file format yet for node weights (though there is in the graph structure).

I have no file format yet for searches to do, nor for dynamic updates.  The former is easy to deal with.  For the latter, I feel that so far, even just the static case is worth experimenting with.
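
For anyone who hasn't pulled the code yet, the grid and graph interfaces described above look roughly like this.  This is only a paraphrase for the thread (a plain vector instead of the neighbour iterators, no Boost), not the actual headers:

Code: [Select]
// Paraphrase of the interfaces described above, not the repository's headers.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Point { int x, y, z; };

// grid.h in spirit: no information except whether squares are passable.
class Grid {
public:
    Grid(int w, int h, int d) : w_(w), h_(h), d_(d), cells_(std::size_t(w) * h * d, 1) {}
    bool inBounds(const Point& p) const {
        return p.x >= 0 && p.y >= 0 && p.z >= 0 && p.x < w_ && p.y < h_ && p.z < d_;
    }
    bool passable(const Point& p) const { return inBounds(p) && cells_[index(p)] != 0; }
    void setPassable(const Point& p, bool v) { cells_[index(p)] = v ? 1 : 0; }
private:
    std::size_t index(const Point& p) const { return (std::size_t(p.z) * h_ + p.y) * w_ + p.x; }
    int w_, h_, d_;
    std::vector<uint8_t> cells_;
};

// graph.h in spirit: a thin wrapper presenting the grid as a graph.
// (a) neighbours of a node, (b) cost of entering a node; (c) edge weights
// would be the natural extension for floors and stairs.
class GridGraph {
public:
    explicit GridGraph(const Grid& g) : grid_(g) {}
    std::vector<Point> neighbours(const Point& p) const {
        static const int d[6][3] = {{1,0,0},{-1,0,0},{0,1,0},{0,-1,0},{0,0,1},{0,0,-1}};
        std::vector<Point> out;
        for (auto& v : d) {
            Point q{p.x + v[0], p.y + v[1], p.z + v[2]};
            if (grid_.passable(q)) out.push_back(q);
        }
        return out;
    }
    double nodeWeight(const Point&) const { return 1.0; }  // uniform cost for now
private:
    const Grid& grid_;
};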
Logged

sir monko

  • Bay Watcher
    • View Profile
    • wsa
Re: Anouncing The PathFinder Project
« Reply #161 on: October 21, 2009, 03:18:49 am »

hi all, long time no see. unfortunately i'll neither be able to invest a lot of time in this project, nor am i able to contribute code (heavens, C++! halp!). moreover, i'm not very skilled in the arcane arts of algorithmicism, optimifluxcompensating and low-level/close-to-the-metal hacking.

but after skimming this thread, here's my opinion:

dreiche2 already mentioned it, and i wholeheartedly agree: the single most important thing would be to provide a testing and benchmarking framework, so a contributor would have a rough clue if his improvements actually work. this may be (a lot!) harder than it sounds; benchmarking is an art of its own. a careful selection of maps, paths and populations would be needed, so solutions wouldn't specialize too much. long paths, short paths, unreachable destinations, ... ideally, this framework should also account for changing environments.

the closer it'd be to the real thing the more accurate, but also less usable. *sigh* benchmarking is hard.

at least the benchmarking framework wouldn't have to be written in C++. use something simpler. java (noooo!), python, javascript, Lua, whatever. it's about the idea, not the implementation. absolute performance doesn't really matter at this point.

i don't even know how pathfinding is done currently; afaik there's A* calculated on-demand, stored in the dwarf's memory and followed until it's invalidated because the dwarf runs into an obstacle that wasn't there when the path was calculated. if no path can be found the algorithm terminates after a certain number of iterations to avoid flooding and uses the most promising solution found until then.

this approach is actually pretty nice, really. most importantly, it's simple. also, by storing and evaluating-on-the-go it allows for inaccuracy. on the one hand that looks realistic (why should the dwarf know a wall on the other side of the fortress vanished since he was there a week ago?), on the other hand it greatly reduces complexity.

i already described what i'd do about it: caching. just store the already calculated paths and keep them in memory. of course, this adds a bit of complexity, but not as much as, say, machine learning (which sounds nice, but i'm afraid it may be a bit too hardcore) or automatic zoning and waypointing or future prediction magic.


how path caching works (rough code sketch after the steps):

1. a dwarf needs a path between A and B
2. look at the cache if the path is already there
  * if not, calculate as usual
  * if it is, take it from there, and increment some use-counter
3. follow the path; if it's blocked, mark for garbage collection.

* every X cycles, garbage collect:
  * throw out old and invalidated paths
  * if memory's scarce, throw out the least-used paths
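
a minimal sketch of those steps, assuming paths are vectors of points and some existing findPath() does the real search. the names and the very crude eviction policy are mine, not anything from DF or the project:

Code: [Select]
// illustrative only: a path cache keyed on (from, to); eviction is deliberately crude.
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

struct Point { int x, y, z; };
struct PathKey { Point from, to; };

struct PathKeyHash {
    std::size_t operator()(const PathKey& k) const {
        auto mix = [](std::size_t h, int v) { return h * 1000003u + std::size_t(uint32_t(v)); };
        std::size_t h = 1469598103u;
        h = mix(h, k.from.x); h = mix(h, k.from.y); h = mix(h, k.from.z);
        h = mix(h, k.to.x);   h = mix(h, k.to.y);   h = mix(h, k.to.z);
        return h;
    }
};
struct PathKeyEq {
    bool operator()(const PathKey& a, const PathKey& b) const {
        return a.from.x == b.from.x && a.from.y == b.from.y && a.from.z == b.from.z
            && a.to.x   == b.to.x   && a.to.y   == b.to.y   && a.to.z   == b.to.z;
    }
};

std::vector<Point> findPath(Point from, Point to);    // the existing solver (assumed)

class PathCache {
public:
    struct Entry { std::vector<Point> path; unsigned uses; bool stale; };

    // step 2: look in the cache first, fall back to a full search on a miss
    const std::vector<Point>& get(Point from, Point to) {
        const PathKey key{from, to};
        auto it = cache_.find(key);
        if (it == cache_.end() || it->second.stale)
            it = cache_.insert_or_assign(key, Entry{findPath(from, to), 0, false}).first;
        ++it->second.uses;                            // increment the use counter
        return it->second.path;
    }
    // step 3: the path turned out to be blocked; mark it for collection
    void invalidate(Point from, Point to) {
        auto it = cache_.find(PathKey{from, to});
        if (it != cache_.end()) it->second.stale = true;
    }
    // every X cycles: drop stale paths and, if memory is tight, rarely used ones
    void collect(unsigned minUses) {
        for (auto it = cache_.begin(); it != cache_.end(); )
            if (it->second.stale || it->second.uses < minUses) it = cache_.erase(it);
            else ++it;
    }
private:
    std::unordered_map<PathKey, Entry, PathKeyHash, PathKeyEq> cache_;
};

whether the lookup and bookkeeping actually beat re-running A* is exactly the kind of question the benchmarking framework discussed above should answer.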

the pros/cons/limitations of path caching, off the top of my head:

  • ~ trades memory for speed
  • + scalable: you can control the cache size. the more memory, the more cache.
  • + optional (graceful degradation)
      * no memory available for caching? no problem - it then behaves exactly as it does now.
      * also, this could be a bit of a limitation, because it may not speed up the machines that need it most. though i don't really know; it may very well be that path caching speeds up even those, depending on how it's influenced by virtual memory/page swapping/etc.
  • + algorithm independent. doesn't care if A* or HHA* or whatever runs in the background. in fact, the current A* implementation should work without much adaptation.
  • - in the worst case, if no path is used more than once, there would be a slight cache-lookup penalty. this could be reduced by data structures optimized for fast lookup (hashes, trees, ...)
  • possibility for partial paths: if one path is a sub-path of another one, use this one
  • possible to implement as part of some kind of "pathfinding-manager" that also handles threading/concurrency, alternative algorithms (one for short paths, different one for long paths, ...)


finally, i already implemented path caching in a very simplified manner to see if it's actually possible.

my findings (if i remember correctly):
  • caching is possible without adding too much complexity
  • in my very simplified environment it worked very well. between 50% and 90% cache hits
  • i have no clue if it would work in an environment as complex as DF (my guess: not as good, but still)

back to work now.
Logged

Sunken

  • Bay Watcher
  • Wabewalker
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #162 on: October 21, 2009, 03:39:08 am »

I doubt if you'd see many cache hits for DF on a tile-by-tile level, but it might be more practical on a higher abstraction level such as rooms or zones. The cached path might be suboptimal in that case, but one could live with that.
Logged
Alpha version? More like elf aversion!

Impaler[WrG]

  • Bay Watcher
  • Khazad Project Leader
    • View Profile
Re: Anouncing The PathFinder Project
« Reply #163 on: October 21, 2009, 04:15:41 am »

Any successful caching system will definitely need to work at higher levels in the abstraction hierarchy; if done properly, I expect >99% hits at a high abstraction level if the cache size is north of a thousand.
Logged
Khazad the Isometric Fortress Engine
Extract forts from DF, load and save them to file and view them in full 3D

Khazad Home Thread
Khazad v0.0.5 Download

sir monko

  • Bay Watcher
    • View Profile
    • wsa
Re: Anouncing The PathFinder Project
« Reply #164 on: October 21, 2009, 05:00:43 am »

I doubt if you'd see many cache hits for DF on a tile-by-tile level, but it might be more practical on a higher abstraction level such as rooms or zones. The cached path might be suboptimal in that case, but one could live with that.

one:

the current path is suboptimal too. a* doesn't guarantee the best path, and because it's stored after calculating (instead of recalculated after every single step) it lets dwarves run into paths that became blocked in the meantime too.

of course cache hits would work *even better* on a higher abstraction level, but i fear those abstractions are actually very hard to implement and possibly very costly to manage in an ever-changing map.  if someone manages to implement it successfully: congratulations, you have probably solved most of DF's performance problems.

but i'm not convinced. actually, i fear the added complexity would be a major show-stopper.

two:

my test ran with paths between (randomized) points of interest (POI) and arbitrary locations. i thought of POIs as workshops, piles, etc.
i assume that most movements are not between quasi-random destinations (e.g. a hunter wandering around until he finds game) but between a few highly frequented locations.

Any successful caching system will definitely need to work at higher levels in the abstraction hierarchy; if done properly, I expect >99% hits at a high abstraction level if the cache size is north of a thousand.

depending on the abstraction, 99% hits is not unreasonable. i assume the number "1000" is just speculation.

in my humble opinion, per-tile path caching (with some tricks) and without further abstraction layers seems like the most reasonable approach, as it doesn't add a lot of complexity and i'm sure it will increase performance.

summary:

as always, we have rival forces at play:

+ more cache = more hits = less accuracy = higher lookup cost = more memory needed

* more hits are great!
* less accuracy doesn't really matter IMO, as paths don't stay stale for long: they're garbage collected as soon as they're invalidated. in the worst case a dwarf doesn't use a new shortcut for a few minutes.
* higher lookup cost: ha, this is what benchmarking would tell us. to be honest, i don't think lookup would be too expensive - after all, it's not much more than a hash-table lookup (depending on the implementation details)

* more memory needed: scales, may be restricted.  additionally, i don't even expect memory cost to be very high. a cached path of, say, n steps may cost n+c bytes, where c is constant (store directions, not coordinates), and the steps can even easily be RLE-compressed (rough sketch below).
so, assuming paths use an average of ~200 bytes, you could store over 5k paths per megabyte (yes, that's mega, not giga).
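
to illustrate (back-of-the-envelope code only, nothing real): one byte per step, run-length encoded as (direction, count) pairs.

Code: [Select]
// back-of-the-envelope illustration of "store directions, not coordinates".
// assumes adjacent path points differ by at most 1 per axis.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Point { int x, y, z; };

std::vector<uint8_t> encodePath(const std::vector<Point>& path)
{
    std::vector<uint8_t> out;                     // pairs of (step byte, run length)
    for (std::size_t i = 1; i < path.size(); ++i) {
        const uint8_t step = uint8_t((path[i].x - path[i-1].x + 1)        // 2 bits per axis: 00=-1, 01=0, 10=+1
                                   | (path[i].y - path[i-1].y + 1) << 2
                                   | (path[i].z - path[i-1].z + 1) << 4);
        if (!out.empty() && out[out.size() - 2] == step && out.back() < 255)
            ++out.back();                         // extend the current run
        else {
            out.push_back(step);                  // start a new run
            out.push_back(1);
        }
    }
    return out;
}

a long straight corridor collapses to a couple of bytes this way, so the ~200 bytes per path assumed above looks comfortable.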


now, the questions are:
a) are the costs of cache management and lookup smaller than those of an a*-recalculation?
b) does this increase performance of ... YAY LUNCH BREAK WOOOOOOO!

b) does this performance increase justify the added complexity?
c) does it work on systems with less memory?

update: back from lunch
« Last Edit: October 21, 2009, 05:37:58 am by sir monko »
Logged