@Andeerz
Just to give you and anyone else some supah-useful info as a CS major (if you're at all interested): pathfinding literally translates to traversing a graph of nodes in an attempt to find a path from one node to another. Given N nodes, the worst-case number of operations (of arbitrary size) is roughly N^2 for the standard approach, just due to the way the algorithm works out -- it's thoroughly studied in computer science, and short of fancier data structures (see the EDITs at the bottom), that's about the best anyone's come up with. It's generally much less than this, say if a dwarf is only moving to a nearby cell, but the worst case gives both an idea of how expensive the algorithm is and a decent approximation for whenever dwarves travel far away (all the time, I think).

Mainly, that polynomial complexity prevents DF from doing it in the very straightforward manner (the straightforward manner being each cell considered a node in the graph... if there are 10,000 cells in an area, you couldn't possibly recalculate that every tick without tremendous lag, because it'd be 10,000 * 10,000 operations worst case). So DF's solution would be to redefine what a "node" is -- suppose DF can break the game world into "zones" (rooms, hallways, etc.); then each zone, rather than each cell in it, comprises the graph -- instead of 10,000 nodes, you get maybe 50! I imagine DF is already doing this or something better, and Toady has likely poured tons of time into optimizing it.

This is an extremely difficult problem, one that he's probably already spent months on, and it's likely the algorithm couldn't be sped up further without significantly changing a game mechanic (like allowing dwarves to teleport through walls, thereby eliminating the need for pathfinding altogether -- you could mark some zones, like unexplored ones, as "unenterable to all dwarves" so dwarves never teleport there, and that check would be an instant, constant-time lookup).
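To make the zone idea a little more concrete, here's a toy Python sketch (my own illustration, definitely not DF's actual code): it lumps walkable cells into fixed-size zones, builds the tiny zone graph, and pathfinds over that instead of the raw grid. Real zones would follow rooms and corridors rather than fixed blocks, but the shape of the idea is the same.

```python
# Toy sketch of the "zones instead of cells" idea -- my own illustration,
# not DF's actual code.  '#' = wall, '.' = walkable floor.
from collections import deque

GRID = [
    "..#..",
    "..#..",
    ".....",
    "..#..",
]

def build_zones(grid, zone_size=2):
    """Assign each walkable cell to a zone.  Here a zone is just a fixed
    zone_size x zone_size block; real zones would follow rooms/corridors."""
    zones = {}
    for y, row in enumerate(grid):
        for x, ch in enumerate(row):
            if ch == '.':
                zones[(x, y)] = (x // zone_size, y // zone_size)
    return zones

def zone_graph(zones):
    """Connect two zones whenever a pair of adjacent walkable cells
    straddles them -- this tiny graph is what we actually search."""
    edges = {}
    for (x, y), z in zones.items():
        for neighbour in ((x + 1, y), (x, y + 1)):
            nz = zones.get(neighbour)
            if nz is not None and nz != z:
                edges.setdefault(z, set()).add(nz)
                edges.setdefault(nz, set()).add(z)
    return edges

def zone_path(edges, start, goal):
    """Plain BFS over zones: a handful of nodes instead of thousands of cells."""
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # goal zone unreachable

zones = build_zones(GRID)
edges = zone_graph(zones)
print(zone_path(edges, zones[(0, 0)], zones[(4, 3)]))
# -> [(0, 0), (0, 1), (1, 1), (2, 1)] : four zones crossed, cells never touched
```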
If by any chance Toady hasn't done something similar yet, I was thinking a great way to speed up the pathfinding would be to have "graphs of graphs of graphs of...", with however many levels as determined by the current number of total nodes. Partitioning the graph into those higher-level nodes would be fairly expensive (N^2 * log2(N), I think... check http://en.wikipedia.org/wiki/Graph_partition), but the beauty is that it only has to be recalculated infrequently. You could think of the graph's layers as "regions -> locations -> structures -> rooms -> cells", though the number of layers above "cells" would actually be determined dynamically as the fortress took shape. With a tree-like structure, I think traversing the graph would change from N^2 to roughly (N/B)^2 / log_B(N), where N is your node count and B is your number of nodes per partition. With N = 10,000 and B = 25, you'd go from 100,000,000 operations to about 56,000! In, uh, non-geek terms, it's the difference between searching through neatly organized folders and rummaging through folders scattered all over the floor.
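Here's the back-of-the-envelope arithmetic behind those numbers (the formula is just my rough estimate above, not a proven bound):

```python
# Rough arithmetic behind the estimate above -- not a proven bound, just
# comparing a flat cell-level search with one extra layer of partitioning.
from math import log

N = 10_000   # walkable cells
B = 25       # cells per partition

flat = N ** 2                                 # ~100,000,000 ops worst case
layered = (N / B) ** 2 / (log(N) / log(B))    # (N/B)^2 / log_B(N)

print(f"{flat:,.0f}")      # 100,000,000
print(f"{layered:,.0f}")   # ~55,900, i.e. about 56,000
```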

I hope that wasn't too jargony

As far as an "accessibility map" goes, I think the premise translates roughly to "caching" frequently calculated paths (I might have misunderstood). It'd only work if the number of nodes could be cut down to some degree, like in a hierarchical graph system (otherwise, caching every possible path would take on the order of (cell count)^2 entries, so with tens of thousands of cells you wouldn't have enough RAM). Basically, caching paths for every room combination is feasible, but caching paths between every cell combination isn't. So the aforementioned zone system would work well with it, and that super-meta-graph thing I kludged together would work REALLY well with it!
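A rough sketch of what that caching might look like, reusing the zone idea from above (the zone graph here is a made-up stand-in, and the function names are mine, not anything from DF):

```python
# Rough sketch of caching zone-to-zone paths.  The zone graph is a
# hardcoded stand-in; ~50 zones means at most ~2,500 cached pairs,
# whereas cell-to-cell caching would need ~(cell count)^2 entries.
from collections import deque
from functools import lru_cache

ZONE_EDGES = {
    "barracks": {"hallway"},
    "hallway":  {"barracks", "dining", "stairs"},
    "dining":   {"hallway"},
    "stairs":   {"hallway"},
}

@lru_cache(maxsize=None)
def cached_zone_path(start, goal):
    """BFS over zones, memoized so repeated requests are free."""
    frontier, seen = deque([(start,)]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in ZONE_EDGES.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + (nxt,))
    return None

print(cached_zone_path("barracks", "dining"))   # computed once
print(cached_zone_path("barracks", "dining"))   # served from the cache
# After digging/construction changes the map, the cache is stale:
# cached_zone_path.cache_clear()
```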
Hope that wasn't too rambly -- just thought it might be good to give an idea of what kinda stuff would have to go into this.

I really like the idea (I hope I didn't give the impression otherwise); I just thought it'd be good for people to know what they were asking for.

EDIT: On the off chance Toady reads this, apparently Fibonacci heaps are good with Dijkstra's shortest-path algorithm:
http://en.wikipedia.org/wiki/Fibonacci_heap. You should add one if possible: you'll totally feel like a badass if you do.
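For anyone curious, this is roughly the shape of heap-based Dijkstra (Python's heapq is a binary heap, not a Fibonacci heap, so treat this as the general structure rather than the Fibonacci version):

```python
# Dijkstra's algorithm with a priority queue.  Python's heapq is a binary
# heap; swapping in a Fibonacci heap would make decrease-key cheaper but
# the overall structure stays the same.
import heapq

def dijkstra(graph, start):
    """graph: {node: {neighbor: edge_cost}}.  Returns shortest distances."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale entry; heapq has no decrease-key
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

graph = {"a": {"b": 1, "c": 4}, "b": {"c": 2}, "c": {}}
print(dijkstra(graph, "a"))   # {'a': 0, 'b': 1, 'c': 3}
```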
EDIT again: Apparently the choice of heap also depends on how "dense" the graph is. You could check the density at runtime and swap between Fibonacci heaps (for dense graphs) and binary heaps (for sparse ones) depending on the situation.
http://stackoverflow.com/questions/504823/has-anyone-actually-implemented-a-fibonacci-heap-efficiently (I happen to be on the subject of heap selection for a research project, haha)
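Just to make that concrete, a trivial sketch of the density check (the 0.5 cutoff and the heap names are completely made up, purely for illustration):

```python
# Purely illustrative: pick a heap type based on graph density.
# The 0.5 threshold and the heap names are made up for the example.
def pick_heap(num_nodes, num_edges):
    density = num_edges / (num_nodes * (num_nodes - 1) / 2)  # fraction of possible edges
    return "FibonacciHeap" if density > 0.5 else "BinaryHeap"

print(pick_heap(num_nodes=50, num_edges=100))    # sparse zone graph -> BinaryHeap
print(pick_heap(num_nodes=50, num_edges=1000))   # much denser graph -> FibonacciHeap
```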