- Microsecond time from the internal system clock, used in the Web Performance API.
- The “system-level” time of milliseconds and Unix time, on which the others ultimately depend.
- Timer functions such as setInterval, which typically set delays of milliseconds to seconds (a short sketch follows this list).
- “Human-readable” date formats, such as “January 1, 2015”.
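For instance, a minimal sketch of that kind of timer delay (the two-second interval here is an arbitrary value, purely for illustration):

```javascript
// Log a message every 2,000 milliseconds (two seconds)
const timer = setInterval(() => {
  console.log("tick");
}, 2000);

// Stop the interval after ten seconds
setTimeout(() => clearInterval(timer), 10000);
```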
Almost every expression of time on a digital device is derived from the microsecond “tick-tock” of a chip, which keeps a running count from the so-called Unix epoch: the number of milliseconds that have passed since midnight, January 1, 1970 UTC. You can see this number for yourself in a browser: open up a console window and type:
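```javascript
Date.now()
```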
The number you’ll get back after hitting Enter will look something like this:

> 1413067397613
This ever-increasing number, also known as the Unix timestamp, is the means by which most computer time is measured: everything else – time in the future or past, and the difference between dates – is resolved by converting to and from this format, after taking into account factors such as time zones and leap seconds.
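As a rough sketch of that round-trip in plain JavaScript, using the example timestamp from above:

```javascript
// A millisecond timestamp can be turned into a Date object…
const then = new Date(1413067397613);
console.log(then.toUTCString()); // "Sat, 11 Oct 2014 22:43:17 GMT"

// …and any Date can be reduced back to milliseconds since the epoch
console.log(then.getTime());     // 1413067397613

// Differences between dates are just arithmetic on these numbers
const now = new Date();
console.log(now - then);         // milliseconds elapsed since the example timestamp
```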
High-resolution timing (to one millionth of a second) is also available via the Web Performance API, although that precision is excessive for most purposes.
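A minimal sketch of that high-resolution timer, via the standard performance.now() method:

```javascript
// performance.now() returns milliseconds since the page's time origin,
// with a fractional part that extends to microseconds
const start = performance.now();

for (let i = 0; i < 1e6; i++) {} // something to measure

const elapsed = performance.now() - start;
console.log(`Loop took ${elapsed} ms`); // e.g. "Loop took 2.634999995470047 ms"
```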
Enjoy this piece? I invite you to follow me at twitter.com/dudleystorey to learn more.