DF General Discussion / Re: Advanced Tweaking (Warning: !!SCIENCE!!)
« on: September 25, 2010, 08:53:22 pm »
Oh no, I understand entirely. It's just that some tweaks only make a difference at high FPS, and some only at low. Sadly, FPS is the only benchmark available that's relevant to DF. Synthetic benchmarks can be DRASTICALLY inaccurate on the grounds that they behave nothing like the real application.
Rest assured, however, that each tweak was tested across no fewer than ten (and sometimes more) different maps of varying size, age, clutter, and activity, at the exact same frames and time periods. (I.e., I didn't test, save, then load and test again. They were all tested on neutral ground.)
I tried to only post those tests that were relevant, in FPS, on a fixed scale. E.g., restarting made the same maps jump by a factor of ten or more.
Others scaled uniformly, such as the CPU core clocks: the FPS change was proportional to the frequency. E.g., 1000 went to 1200, 100 went to 120, 10 went to 12.
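If anyone wants to sanity-check their own overclock against that, the relationship is simple enough to write down. Here's a tiny Python sketch of it; the clock figures are made up purely for illustration, not from my tests:

# Assumption: FPS scales proportionally with CPU clock, as observed above.
def predicted_fps(fps_before, clock_before_mhz, clock_after_mhz):
    return fps_before * (clock_after_mhz / clock_before_mhz)

# e.g. a hypothetical 20% overclock, 3000 MHz -> 3600 MHz:
print(predicted_fps(1000, 3000, 3600))  # 1200.0
print(predicted_fps(100, 3000, 3600))   # 120.0
print(predicted_fps(10, 3000, 3600))    # 12.0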
Also, one interesting note: I could not embark on an area larger than 42x42 on anything but a dedicated-core, ganged and optimized setup. It ran well below 1 FPS. To call it merely crippled would be an understatement. After an hour, the dwarves had moved the twelve spaces needed to begin mining. X.x
Edit:
This is also why I completely omitted what the various driver versions did: they showed no clear and substantiated link, and no appreciable differences.
I also omitted any differences below 200 FPS. Going from 150 to 200 isn't really pertinent, and 200 to 400 is as simple as designating mining. I made a strong effort to include only findings that made a reasonably significant improvement.
VRAM, for example, made a big G_FPS difference at extreme FPS (1000+), but no one realistically plays at that. In real forts, it made absolutely no difference to FPS or G_FPS; certainly not enough to be worth tweaking for.
I follow the '2% rule': if a tweak didn't, under heavy load, improve FPS by an average of 2% or better at a very severely lagging fort (10 FPS), I didn't bother to report it here.
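If anyone wants to apply the same filter to their own numbers, the rule is just an average relative gain compared against a 2% cutoff. A rough Python sketch, with readings that are entirely made up just to show the math:

# Sketch of the '2% rule': average relative FPS gain across heavy-load samples.
def passes_two_percent_rule(fps_baseline, fps_tweaked, threshold=0.02):
    gains = [(t - b) / b for b, t in zip(fps_baseline, fps_tweaked)]
    return sum(gains) / len(gains) >= threshold

# At a severely lagging fort (~10 FPS), 2% is only about 0.2 FPS:
baseline = [10.0, 9.8, 10.2]   # hypothetical readings
tweaked  = [10.3, 10.0, 10.4]  # hypothetical readings
print(passes_two_percent_rule(baseline, tweaked))  # True (~2.3% average gain)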
So, take all my research with a grain of rock salt, but rest assured that although it may not be as solid as Granite, it's as valuable as Limestone.