Thursday, August 5, 2021

Timing Short Durations

I don't have time for a long post (HA!), but I wanted to add a pointer to https://github.com/fordsfords/nstm ("nstm" = "Nano Second Timer"). It's a small repo that provides a nanosecond-precision timestamp portably across MacOS, Linux, and Windows.

Note that I said precision, not resolution. I don't know of an API on Windows that gives nanosecond resolution. The one Microsoft says you should use (QueryPerformanceCounter()) always returns "00" as the last two decimal digits; i.e., it has 100-nanosecond resolution. They warn against using "rdtsc" directly, although I wonder whether most of their arguments still apply. I would love to hear if anybody knows of a Windows method of getting nanosecond-resolution timestamps that is reliable and efficient.
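
For reference, here's a minimal sketch of the QueryPerformanceCounter() approach (not necessarily how nstm does it). On many systems the reported counter frequency is 10 MHz, which is exactly where that 100-nanosecond resolution comes from:

/* Sketch of a Windows nanosecond-unit timestamp via QueryPerformanceCounter().
 * Not taken from nstm; just shows the scaling and where 100 ns granularity
 * comes from when the counter frequency is 10 MHz. */
#include <windows.h>
#include <stdint.h>
#include <stdio.h>

static uint64_t win_timestamp_ns(void)
{
  LARGE_INTEGER freq, ticks;
  /* A real implementation would query the frequency once and cache it. */
  QueryPerformanceFrequency(&freq);   /* ticks per second (often 10,000,000) */
  QueryPerformanceCounter(&ticks);

  /* Split into whole seconds and remainder to avoid 64-bit overflow when
   * scaling ticks to nanoseconds. */
  uint64_t sec = (uint64_t)ticks.QuadPart / (uint64_t)freq.QuadPart;
  uint64_t rem = (uint64_t)ticks.QuadPart % (uint64_t)freq.QuadPart;
  return sec * 1000000000ULL + rem * 1000000000ULL / (uint64_t)freq.QuadPart;
}

int main(void)
{
  printf("%llu\n", (unsigned long long)win_timestamp_ns());
  return 0;
}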

One way to measure a short-duration "thing" is to time doing the "thing" a million times (or whatever) and take an average. One advantage of this approach is that taking a timestamp itself takes time; i.e., making the measurement changes the thing you are measuring. So amortizing that cost over many iterations minimizes its influence.
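
Here's a minimal sketch of that approach. It uses clock_gettime(CLOCK_MONOTONIC) rather than nstm so it stands alone, and thing_to_measure() is just a hypothetical stand-in for whatever you're timing:

/* Time a million iterations and report the average (Linux/MacOS). */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

#define ITERATIONS 1000000

static uint64_t timestamp_ns(void)
{
  struct timespec ts;
  clock_gettime(CLOCK_MONOTONIC, &ts);
  return (uint64_t)ts.tv_sec * 1000000000ULL + (uint64_t)ts.tv_nsec;
}

static volatile int sink;  /* keeps the compiler from optimizing the loop away */

static void thing_to_measure(void)
{
  sink++;  /* hypothetical short operation being timed */
}

int main(void)
{
  uint64_t start_ns = timestamp_ns();
  for (int i = 0; i < ITERATIONS; i++) {
    thing_to_measure();
  }
  uint64_t end_ns = timestamp_ns();

  /* Only two timestamps are taken, so their cost is amortized
   * over all ITERATIONS executions of the thing being measured. */
  printf("average: %.2f ns per iteration\n",
         (double)(end_ns - start_ns) / ITERATIONS);
  return 0;
}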

But sometimes, you just need to directly measure short things. Like if you are histogramming them to get the distribution of variations (jitter).
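
Here's a minimal sketch of that, timing each call individually and bucketing the durations into bins. The bin width and thing_to_measure() are hypothetical choices, and note that each sample now includes the timestamp overhead, which is exactly why the amortizing approach above is attractive when you only need an average:

/* Histogram individual measurements to see the jitter (Linux/MacOS). */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

#define SAMPLES   100000
#define BIN_NS    10    /* histogram bin width in nanoseconds */
#define NUM_BINS  100   /* durations >= BIN_NS*NUM_BINS land in the last bin */

static uint64_t timestamp_ns(void)
{
  struct timespec ts;
  clock_gettime(CLOCK_MONOTONIC, &ts);
  return (uint64_t)ts.tv_sec * 1000000000ULL + (uint64_t)ts.tv_nsec;
}

static volatile int sink;

static void thing_to_measure(void) { sink++; }  /* hypothetical operation */

int main(void)
{
  static uint64_t hist[NUM_BINS];  /* zero-initialized */

  for (int i = 0; i < SAMPLES; i++) {
    uint64_t start_ns = timestamp_ns();
    thing_to_measure();
    uint64_t dur_ns = timestamp_ns() - start_ns;  /* includes timestamp cost */
    uint64_t bin = dur_ns / BIN_NS;
    if (bin >= NUM_BINS) bin = NUM_BINS - 1;
    hist[bin]++;
  }

  for (int b = 0; b < NUM_BINS; b++) {
    if (hist[b] != 0) {
      printf("%4d-%4d ns: %llu\n", b * BIN_NS, (b + 1) * BIN_NS - 1,
             (unsigned long long)hist[b]);
    }
  }
  return 0;
}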

I put some results here: https://github.com/fordsfords/fordsfords.github.io/wiki/Timing-software