...
Nanosecond resolution is a million times finer than millisecond resolution. Using a quartz-based reference with accuracy on the order of 500 ms doesn't seem worth the effort when a call to the system clock can obtain time at 1.78 ns resolution. That's a difference of, what, 280 million times?
Given that any modern PC uses NTP time syncing (Microsoft, Apple, or Linux, take your pick), nanosecond-level time accuracy is available on any modern PC. I don't understand why the timing software queries the sound card clock as a time reference, since the sound card clock is uncalibrated. It also doesn't make sense to me that the software expects a user to run a quartz-based calibration procedure against an external time source (a watch) when polling the system clock would provide a calibration reference that is accurate to the nanosecond level and presumably hundreds of millions of times more accurate. Polling the sound card for timing seems like it would give a worse result than polling the system clock on any NTP-synced PC.
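For concreteness, "polling the system clock" on a POSIX system would look roughly like the minimal sketch below; what it demonstrates is nanosecond resolution, not nanosecond accuracy, which turns out to be the crux of the answer that follows.

    /* Minimal sketch: the POSIX system clock can be *read* at nanosecond
     * resolution, but the value is only as accurate as whatever NTP has
     * disciplined it to. */
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_REALTIME, &ts);
        printf("%lld.%09ld\n", (long long)ts.tv_sec, ts.tv_nsec);
        return 0;
    }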
I have to assume that I'm misunderstanding something. Thanks for your time.
Thanks for your comments, and to all those who have answered already. I don't have much to add to their excellent responses, but maybe I can clear up some misunderstandings.
The system clock of a modern PC, when disciplined via NTP, would time mechanical watches adequately, no question about that. However, let me point out that the situation is not precisely as the lines above imply. From a typical home network connected via DSL, one might have a 30 ms round-trip time to a geographically close internet node (just try ping google.com). This is roughly the inherent uncertainty of any internet-based synchronization mechanism. By averaging several samples and assuming that the round trip is symmetric (which a DSL connection often is not), NTP might bring the uncertainty down to a few milliseconds. Nanoseconds, however, no way. It's like measuring lengths with a stick that has marks every 30 mm: with a bit of luck, one might guess the nearest millimeter, but nanometers? Quoting from
ntp.org: Home of the Network Time Protocol (I'll assume they know their stuff).
The typical accuracy on the Internet ranges from about 5ms to 100ms, possibly varying with network delays. A recent survey suggests that 90% of the NTP servers have network delays below 100ms, and about 99% are synchronized within one second to the synchronization peer.
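To make that uncertainty concrete, here is a minimal sketch (plain C, not tg code) of the standard NTP on-wire calculation, with made-up timestamps for a ~30 ms round trip. The point is that NTP must assume the two legs of the trip are equal, so the offset can be wrong by up to half the round-trip delay:

    /* Standard NTP on-wire offset/delay calculation (RFC 5905 style).
     * t0/t3 are read from the client clock, t1/t2 from the server.
     * Timestamps below are invented for illustration. */
    #include <stdio.h>

    int main(void)
    {
        double t0 = 0.0000;  /* client sends request  */
        double t1 = 0.0150;  /* server receives it    */
        double t2 = 0.0151;  /* server sends reply    */
        double t3 = 0.0300;  /* client receives reply */

        double delay  = (t3 - t0) - (t2 - t1);         /* round-trip delay */
        double offset = ((t1 - t0) + (t2 - t3)) / 2.0; /* clock offset     */

        /* If the path is asymmetric, the offset can be off by up to
         * delay/2, i.e. by milliseconds on a DSL line -- never by
         * nanoseconds. */
        printf("delay  = %.1f ms\n", delay * 1e3);
        printf("offset = %.2f ms (error up to %.1f ms if asymmetric)\n",
               offset * 1e3, delay / 2.0 * 1e3);
        return 0;
    }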
Yet, even though it is not nanosecond-accurate, I believe that the system clock of a PC is a decent reference for our task of timing watches. The real problem, for me, is interfacing with the system clock. There are two obstacles.

The first and largest one is that I get sound samples, obviously, through the sound card (which, nowadays, is often integrated into some chip on the mainboard). This particular piece of hardware has, in general, its own clock, so this is the clock that times the samples, and, by necessity, this is the clock that I have to characterize. To handle sound in a cross-platform, portable manner, I rely on a library called portaudio, used, among others, by the Audacity folks. This library hands tg a bunch of samples every now and then: it's up to the library itself to decide precisely when. Portaudio, in turn, receives the audio from whatever underlying infrastructure the operating system might have, and the OS receives it from the hardware. There is an unpredictable and unspecified delay at each step of this chain, so I cannot know, at any given moment, when the sound samples that tg just received were actually collected.

Then, if I had to correlate these samples with the system clock, I would have another bunch of uncertainties coming from that side, even though it is generally more reliable. For instance, you can see on your very machine that the ntp daemon resets the clock every ten minutes: how could tg know when the clock is being reset on any given system? Surely there will be machines that set the time just once when they are powered on, or some that never do it at all (consider that tg has been installed on Raspberry Pis): for timing purposes this is the same, and these computer clocks are often quite bad if not disciplined by NTP. In conclusion, I would have to embed a little NTP client inside tg just to get a reliable time source, and even then I could not reliably correlate it with the sound card clock anyway.
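For the curious, the delivery end of that chain looks roughly like the sketch below. This is illustrative only, not tg's actual code; the rates, buffer handling and callback body are assumptions, and error checking is omitted:

    #include <stdio.h>
    #include "portaudio.h"

    /* Portaudio calls this whenever it decides a buffer is ready; the
     * exact moment is up to the library, the OS and the hardware. */
    static int on_audio(const void *input, void *output,
                        unsigned long frames,
                        const PaStreamCallbackTimeInfo *timeInfo,
                        PaStreamCallbackFlags statusFlags, void *userData)
    {
        (void)input; (void)output; (void)statusFlags; (void)userData;
        /* timeInfo->inputBufferAdcTime estimates when these samples hit
         * the ADC, but with unspecified accuracy; printing here is not
         * real-time safe and is done only for the sketch. */
        printf("got %lu frames (adc time estimate %.6f)\n",
               frames, timeInfo->inputBufferAdcTime);
        return paContinue;
    }

    int main(void)
    {
        PaStream *stream;
        Pa_Initialize();
        /* one input channel, no output, 44.1 kHz, buffer size chosen
         * by the library itself */
        Pa_OpenDefaultStream(&stream, 1, 0, paFloat32, 44100,
                             paFramesPerBufferUnspecified, on_audio, NULL);
        Pa_StartStream(stream);
        Pa_Sleep(5000);          /* capture for five seconds */
        Pa_StopStream(stream);
        Pa_CloseStream(stream);
        Pa_Terminate();
        return 0;
    }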
My solution has been to use a time signal that is fed directly through the sound card, so the correlation is effected simply by timing the reference clock with the very device that I want to characterize. This is the direct, simple solution, and, I believe, rather robust. I do not say that one cannot use the system clock plus NTP; I say that I cannot do it reliably, in a manner that works cross-platform on most setups.
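The arithmetic behind that calibration is simple enough to sketch: count samples between ticks of the reference signal, and the sound card clock rates itself against the reference. The numbers below are simulated (a hypothetical sound card running 10 ppm fast), not tg's implementation:

    #include <stdio.h>
    #include <math.h>

    #define NOMINAL_RATE 44100.0   /* what the sound card claims, Hz */
    #define REF_SECONDS  1000      /* how long we average the ticks  */

    int main(void)
    {
        /* 10 ppm fast: each reference second spans slightly more than
         * 44100 samples; averaging over many seconds resolves the
         * deviation to a fraction of a sample. */
        double true_rate = NOMINAL_RATE * (1.0 + 10e-6);
        long first_tick = 0;
        long last_tick  = (long)llround(true_rate * REF_SECONDS);

        double per_sec = (double)(last_tick - first_tick) / REF_SECONDS;
        double ppm = (per_sec - NOMINAL_RATE) / NOMINAL_RATE * 1e6;
        printf("sound card clock deviates by %.2f ppm from reference\n",
               ppm);
        return 0;
    }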
Edit. I understand that the 0.5 s/d accuracy of a quartz watch might sound less than impressive, but that is just the level of uncertainty of the rest of the system. Consider that 0.5 s/d is about 6 ppm, or roughly 0.1 ms in 20 s. Tg integrates samples over a period of 16 s, and one cannot realistically ask for better than 0.1 ms, corresponding to 10 kHz, from the audio, which is cut at 20 kHz by the anti-aliasing filter anyway.
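For completeness, here is that error budget computed explicitly; the inputs are the figures from this post, and the code is plain arithmetic, nothing tg-specific:

    #include <stdio.h>

    int main(void)
    {
        double err_s_per_day = 0.5;                   /* quartz watch   */
        double ppm = err_s_per_day / 86400.0 * 1e6;   /* ~5.8 ppm       */
        double window = 16.0;                         /* tg window, s   */
        double err_s = ppm * 1e-6 * window;           /* ~0.09 ms       */
        printf("%.1f ppm -> %.3f ms over %.0f s\n",
               ppm, err_s * 1e3, window);
        return 0;
    }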