
Time to Decade

Geoffrey Challen // 2020.8.0

To work with data, computers require that it be digitized: converted to one or more numbers. Time is no exception. Time is a human concept, and so for computers to understand it we have to digitize it.

Humans can and have also digitized time. Like location, systems for temporal measurement predate computers by thousands of years. And computers can and do manipulate times to study events that occurred long before computers existed.

But computer scientists have also established a way of measuring time uniquely appropriate for use by computers: counting time since January 1, 1970, a starting point known as the UNIX epoch. Development of the UNIX family of operating systems began in 1969, so this way of measuring time corresponds with the advent of early computer systems.

Counting seconds since 1970 is a fairly coarse way of measuring time. (If we want to be more precise, we can count milliseconds or even nanoseconds since 1970, although those values are larger and take up more space in memory.)
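As a quick sketch of what those counts look like in Java, here is how we might read the current UNIX time at both granularities using the standard java.time and System APIs (the class name is just for illustration):

import java.time.Instant;

public class EpochCounts {
    public static void main(String[] unused) {
        // Current UNIX time as a count of seconds since 1970
        long seconds = Instant.now().getEpochSecond();
        // The same moment counted in milliseconds: a much larger number
        long milliseconds = System.currentTimeMillis();
        System.out.println(seconds);
        System.out.println(milliseconds);
    }
}

The millisecond count is roughly 1000 times larger than the second count, which is why it needs more space to store.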

Any count of seconds since 1970 can be converted into a more human-readable timestamp (like "10:05AM EST on January 17th, 2009"), and any human-readable timestamp can also be converted into seconds since 1970 (the previous timestamp is 1232204700 in UNIX time).
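For example, a sketch of that round trip in Java might look like this, using java.time (again, the class name is just for illustration):

import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class EpochConversion {
    public static void main(String[] unused) {
        // From UNIX time to a human-readable timestamp in Eastern time
        Instant instant = Instant.ofEpochSecond(1232204700L);
        ZonedDateTime eastern = instant.atZone(ZoneId.of("America/New_York"));
        System.out.println(eastern); // 2009-01-17T10:05-05:00[America/New_York]

        // And back again: from a timestamp to seconds since 1970
        System.out.println(eastern.toInstant().getEpochSecond()); // 1232204700
    }
}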

Given a UNIX timestamp we can also determine things about what was happening at that time. Let's write a snippet of code that, given a Long timestamp stored in currentTime, prints 00s if it's in the 2000s (the aughts), 10s if it's in the 2010s, and 20s if it's in the 2020s. (Note that UNIX time is always in the UTC timezone.)
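Here is one possible sketch, assuming currentTime holds a count of seconds since 1970 (not milliseconds) and using java.time to pull out the year in UTC; the example value is the timestamp from above:

import java.time.Instant;
import java.time.ZoneOffset;

public class TimeToDecade {
    public static void main(String[] unused) {
        // Assume currentTime holds seconds since 1970; this value is 2009-01-17
        Long currentTime = 1232204700L;

        // Convert to a year in UTC, since UNIX time is always UTC
        int year = Instant.ofEpochSecond(currentTime)
            .atZone(ZoneOffset.UTC)
            .getYear();

        if (year >= 2000 && year < 2010) {
            System.out.println("00s");
        } else if (year >= 2010 && year < 2020) {
            System.out.println("10s");
        } else if (year >= 2020 && year < 2030) {
            System.out.println("20s");
        }
    }
}

Converting the timestamp to a year first keeps the decade check to a few simple comparisons, rather than working out the second boundaries of each decade by hand.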