
Author Topic: DFhack metrics plugin.  (Read 202 times)

billw

DFhack metrics plugin.
« on: August 11, 2018, 08:03:33 pm »

After struggling with stress issues in my latest fort and being unable to tell if my attempted remedies were having any effect, I decided to write a plugin that dumps stress levels so I can see their trend-lines over time.

If this is of interest to other people (perhaps something else that does this already exists, but I didn't find it), I can put in a PR to DFhack to get it added. It isn't trivial to use, though: it requires knowledge of Elasticsearch and Kibana to get useful output from it (although I might add alternative outputs like Excel or something).
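For the curious, the data it gathers is roughly the following (a Lua sketch of the sampling side only, not the actual plugin code, which is C++; the stress field name may vary between DF/df-structures versions):

Code:
-- sketch: collect current stress for every citizen
local rows = {}
for _, unit in ipairs(df.global.world.units.active) do
    if dfhack.units.isCitizen(unit) and unit.status.current_soul then
        table.insert(rows, {
            year   = df.global.cur_year,
            tick   = df.global.cur_year_tick,
            id     = unit.id,
            name   = dfhack.TranslateName(dfhack.units.getVisibleName(unit)),
            stress = unit.status.current_soul.personality.stress_level,
        })
    end
end
-- each row then gets pushed out (to Elasticsearch, in my case)
print(('sampled %d citizens'):format(#rows))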

/edit:

Here is an example showing what happens when you burrow all your dwarves in the dining room with a dead body and lock the doors.
Note some of the data series terminating abruptly.
[Spoiler: screenshot of the stress trend lines]
Also, interestingly, the only person not affected is my queen; in fact, she is getting happier by the minute:
[Spoiler: screenshot of the queen's stress trend]

« Last Edit: August 12, 2018, 07:54:07 am by billw »

Fleeting Frames

Re: DFhack metrics plugin.
« Reply #1 on: August 12, 2018, 04:08:22 am »

What format is the dump, btw?
Something like the .csv that digfort uses is pretty easy: just values separated by ';'.
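For example, one line per dwarf per sample, something like this (made-up values, just to show the shape):

Code:
year;tick;unit_id;stress
252;100800;4242;25000
252;100800;4243;-8000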

billw

Re: DFhack metrics plugin.
« Reply #2 on: August 12, 2018, 06:08:54 am »

Currently it only outputs to Elasticsearch, which gives live output and fairly good graphing, analysis, and search functions. If there is something more appropriate and lighter weight I would be interested. This plugin is mostly useful to me because the output is live; dumping to a csv or whatever would not give a live output (unless there is some way to do that I am not aware of?). I could still add that if there is significant interest, but I wouldn't be using it personally.
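For reference, each push is just an HTTP index request carrying a small JSON document, roughly along these lines (the index/type and field names here are only placeholders, and the exact form depends on the ES version):

Code:
POST http://elk-host:9200/df-metrics/doc
Content-Type: application/json

{
  "timestamp": "2018-08-12T06:00:00Z",
  "df_year": 252,
  "df_tick": 100800,
  "unit_id": 4242,
  "stress": 25000
}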

Fleeting Frames

Re: DFhack metrics plugin.
« Reply #3 on: August 12, 2018, 08:32:31 am »

Aha, I see. I have not used the display tools you mention before, though a brief search showed they are open source. I didn't catch that it was real-time updating, for instance.

There's also a third option of publishing separately, like cavern keeper does. I can certainly see some potential for this family of tools; many times I've run checks on how much someone's attributes and gained experience changed over a time period, but that would be too much trouble to do by hand for everyone.

I can't say whether I'd use them any time soon, however.

And yeah, dumping to csv would be pretty static; afaik at least major tools like Excel and LibreOffice only read a snapshot of the file at the moment of opening.

fortunawhisk

Re: DFhack metrics plugin.
« Reply #4 on: August 12, 2018, 08:26:46 pm »

Intriguing! I'd be interested to see how you did it (plugin vs lua vs ...?).

In terms of other tools, it depends on what part of your pipeline you want to replicate:
- DwarfMonitor is probably the closest to the whole package, but doesn't do exactly what you're doing. http://dfhack.readthedocs.io/en/stable/docs/Plugins.html#id276
- The dfhack script "repeat" gives you the ability to call a script on a consistent timed basis. https://dfhack.readthedocs.io/en/stable/docs/_auto/base.html#repeat
- Dwarf Therapist gives you the ability to see everything for everyone, but has no historical record or output options (afaik).

A couple of questions:
- Is the elasticsearch instance local or remote?
- How does the plugin pass multiple values? A single http push, multiple http pushes...?
- Any thoughts on how you'd convert timestamps to dwarf fortress timestamps? For instance, 1300-1330 -> Granite 01-28, Year 5.

billw

Re: DFhack metrics plugin.
« Reply #5 on: August 13, 2018, 03:19:44 am »

I also use the tools you mentioned, so it's good to get some confirmation that I didn't miss something obvious, thanks!
In fact I use the repeat script with this system for the timing at the moment: it just does a one-off dump of the data to ES when called, and I schedule it with repeat to run once a day.
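Concretely, the scheduling is just something along these lines in the DFHack console (check repeat's own help for the exact flags; the command name is a placeholder for whatever the plugin registers):

Code:
repeat -name stress-metrics -time 1 -timeUnits days -command [ metrics dump ]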
I used docker to set up ELK (https://hub.docker.com/r/sebp/elk/). I am running it on a second PC on the same network, but it could easily be remote or local, I think; the data volume is very low.

I use a single http session (I added libcurl to dfhack) and a separate push for each statistic. I couldn't see offhand how to push multiple docs to ES in one go, although I seem to remember that it is possible.

I did think of trying to use DF time as the timestamps, but given how ES and Kibana are designed it seemed like it would be going against the grain, so I just use real time and added DF time as separate fields. It has worked okay so far. I might try it out though, as I already wrote the code to do it.
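The conversion itself is just calendar arithmetic; roughly this kind of thing (a sketch, not my exact code; it uses the globals DFHack exposes, with 1200 ticks per fortress-mode day and twelve 28-day months):

Code:
-- rough sketch: turn the current DF date into a readable stamp
local months = {'Granite','Slate','Felsite','Hematite','Malachite','Galena',
                'Limestone','Sandstone','Timber','Moonstone','Opal','Obsidian'}
local tick  = df.global.cur_year_tick          -- ticks since the start of the year
local day   = math.floor(tick / 1200) % 28 + 1
local month = math.floor(tick / (1200 * 28)) + 1
print(('%s %02d, Year %d'):format(months[month], day, df.global.cur_year))

And for the record, I think the multi-doc thing I was half-remembering is Elasticsearch's _bulk endpoint: newline-delimited JSON posted with Content-Type: application/x-ndjson, an action line followed by a source line per document (index/type and field names here are placeholders):

Code:
POST http://elk-host:9200/_bulk
Content-Type: application/x-ndjson

{"index":{"_index":"df-metrics","_type":"doc"}}
{"df_year":252,"df_tick":100800,"unit_id":4242,"stress":25000}
{"index":{"_index":"df-metrics","_type":"doc"}}
{"df_year":252,"df_tick":100800,"unit_id":4243,"stress":-8000}

(The body has to end with a trailing newline.)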

//edit: I might switch to using python, plotly and dash instead of ELK (https://medium.com/@plotlygraphs/introducing-dash-5ecf7191b503), as it is easier to distribute and more appropriate for non-"real world" metrics, i.e. where not everything is tied to a real-world timestamp.
« Last Edit: August 13, 2018, 06:00:54 am by billw »