Time Converter

Convert between seconds, minutes, hours, days, weeks, months, years, and sub-second units like milliseconds and nanoseconds. Enter a value and pick your units.


How Humans Figured Out Time

Keeping track of time is something we take completely for granted, but getting here took thousands of years of observation and argument. The earliest time measurement was simple: sunrise, sunset, and the phases of the Moon. Hunter-gatherer societies didn't need much more than that. Agriculture changed things. Once people started planting crops, they needed to predict seasons, which meant tracking the Sun's position across the sky over the course of a year.

The Egyptians divided daylight into 12 parts and nighttime into 12 parts, giving us the 24-hour day. The catch was that those hours weren't equal — summer daylight hours were longer than winter ones. Fixed-length hours didn't become standard until mechanical clocks arrived in medieval Europe.

Minutes and seconds came from the Babylonians, who used a base-60 number system. Why 60? Because it's divisible by 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30 — more factors than any smaller number. That flexibility made fractions easier to work with in an era before decimals existed. We inherited their divisions: 60 minutes in an hour, 60 seconds in a minute. The system is over 3,000 years old and shows no sign of being replaced.

The second itself has been redefined multiple times. Originally it was 1/86,400 of a mean solar day. In 1967, the international scientific community redefined it based on the vibrations of a cesium-133 atom — specifically, 9,192,631,770 oscillations of the radiation emitted during a particular atomic transition. That's absurdly precise, but GPS satellites, telecommunications networks, and financial trading systems all depend on that kind of accuracy.

Calendar Quirks: Why Months and Years Are Messy

Converting between months, days, and years sounds straightforward until you actually try to do it precisely. The fundamental problem is that the Earth's orbit around the Sun doesn't take a neat, round number of days. A tropical year — the time from one spring equinox to the next — is approximately 365.2422 days. Not 365. Not 365.25. Roughly 365.2422.

Julius Caesar's calendar reform in 46 BCE handled this by adding a leap day every four years, which assumes 365.25 days per year. Close, but not exact. Over centuries, the calendar drifted. By the 1500s, the spring equinox was arriving about 10 days earlier than the calendar said it should. Pope Gregory XIII fixed this in 1582 by skipping 10 days (October 4 was followed by October 15) and tweaking the leap year rule: century years are only leap years if divisible by 400. So 1900 was not a leap year, but 2000 was. This Gregorian calendar is what most of the world uses now, and it's accurate to within one day every 3,236 years.
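The Gregorian leap year rule is easy to express in code. Here is a minimal sketch in Python (the function name is ours, not part of any particular library):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is a leap year, except
    century years, unless the century is divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The century exception in action:
print(is_leap_year(1900))  # False — century not divisible by 400
print(is_leap_year(2000))  # True  — divisible by 400
print(is_leap_year(2024))  # True  — ordinary 4-year case
```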

Months are even messier. They range from 28 to 31 days with no discernible pattern — you just have to memorize which is which, or use the knuckle trick your second-grade teacher taught you. The average month length is 30.4375 days (365.25 divided by 12), which is what most time converters use. But any specific month could be 28, 29, 30, or 31 days. If someone tells you something is "three months away," that could mean anywhere from 89 to 92 days depending on which months are involved. This imprecision is baked into the system, and there's no fixing it without redesigning the calendar from scratch — which has been proposed dozens of times and never adopted.

Time Zones: A Surprisingly Recent Invention

Before the railroad, every town set its clocks by local solar noon — when the Sun was highest in the sky. That meant the clocks in a city 50 miles west ran a few minutes behind yours, and those in a city 50 miles east a few minutes ahead. Nobody cared, because nobody traveled fast enough for it to matter.

Railroads made this chaos unbearable. Train schedules had to list departure times for every station, and each station was on its own local time. In the 1840s, British railways started using Greenwich Mean Time across their entire network, which the public grumbled about but eventually accepted. The United States didn't standardize until 1883, when the railroads carved the country into four time zones on their own authority. Congress didn't make it official until 1918.

The international system of 24 time zones, centered on the prime meridian at Greenwich, was formalized at a conference in Washington in 1884. But the neat 15-degree slices on a globe bear little resemblance to actual time zone boundaries. China spans five geographic time zones but uses only one. India uses a single zone offset by half an hour (UTC+5:30). Nepal is offset by 45 minutes (UTC+5:45). Australia has three major zones, and parts of it observe daylight saving time while neighboring parts don't.

Daylight saving time adds another layer of complexity. Not all countries observe it. Those that do don't all switch on the same dates. Arizona doesn't observe it, but the Navajo Nation within Arizona does, while the Hopi reservation inside the Navajo Nation doesn't. If you're writing software that deals with time zones, you learn to fear these edge cases. The IANA time zone database, which most computers rely on, contains over 300 unique zones precisely because political boundaries make time far more complicated than simple astronomy.

Computing and Time Precision

Computers measure time in ways that would have seemed bizarre a century ago. Most modern systems track time as the number of seconds (or milliseconds, or nanoseconds) elapsed since midnight on January 1, 1970, UTC. This is called Unix time or epoch time, and it's the backbone of how nearly every server, database, and programming language handles dates and times internally.

A Unix timestamp of 1,000,000,000 corresponded to September 9, 2001, at 01:46:40 UTC. The timestamp 2,000,000,000 will arrive on May 18, 2033. Systems that store timestamps as 32-bit signed integers will overflow on January 19, 2038 — a problem sometimes called the "Year 2038 problem" or "Y2K38." At that point, the counter wraps around to a negative number, which the system interprets as a date in December 1901. Most modern systems have already migrated to 64-bit timestamps, which won't overflow for about 292 billion years, so this is mainly a concern for embedded systems and legacy hardware.
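The overflow behavior is straightforward to demonstrate. Below is a Python sketch; the cutoff values come from the range of a 32-bit signed integer, not from Python itself (Python integers don't overflow):

```python
import datetime

UTC = datetime.timezone.utc

# One billion seconds after the epoch
print(datetime.datetime.fromtimestamp(1_000_000_000, tz=UTC))
# 2001-09-09 01:46:40+00:00

# The largest value a 32-bit signed counter can hold
max32 = 2**31 - 1
print(datetime.datetime.fromtimestamp(max32, tz=UTC))
# 2038-01-19 03:14:07+00:00

# One second later the counter wraps to the most negative value,
# which reads as a date 68 years *before* the epoch
wrapped = -2**31
print(datetime.datetime.fromtimestamp(wrapped, tz=UTC))
# 1901-12-13 20:45:52+00:00
```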

At the sub-second level, precision matters enormously. High-frequency trading systems operate on microsecond timescales. A microsecond is one millionth of a second — light only travels about 300 meters in that time. Some trades are executed in nanoseconds, one billionth of a second. GPS satellites need clocks accurate to within about 10 nanoseconds; a 100-nanosecond error translates to roughly 30 meters of positioning error on the ground.

Even your web browser cares about millisecond precision. JavaScript measures animation frames in milliseconds, and a smooth 60 frames-per-second animation has a budget of about 16.67 milliseconds per frame. Exceed that and you get visible stuttering. Video games operate under the same constraint. Database queries that take more than a few hundred milliseconds feel sluggish to users. Our tolerance for delay has gotten remarkably thin as technology has gotten faster, which makes precise time measurement not just a scientific concern but an everyday engineering requirement.

Time Conversion

Converted Value = Input Value × Conversion Factor

Time conversions normalize the input to a base unit (seconds), then convert to the target unit. For example, 24 hours equals 86,400 seconds (24 × 3,600), and converting to minutes means dividing by 60, giving 1,440 minutes. Months use an average of 30.4375 days (365.25 / 12) and years use 365.25 days to account for leap years.

Where:

  • Input Value = The numeric time value to convert
  • Conversion Factor = The ratio between the source and target time units
  • Converted Value = The result expressed in the target unit
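The normalize-to-seconds approach can be sketched in a few lines of Python. The table of factors uses the same averages described above (30.4375-day months, 365.25-day years); the unit names and function are illustrative, not the converter's actual implementation:

```python
# Seconds per unit, using the averages this article describes
SECONDS_PER = {
    "nanosecond":  1e-9,
    "microsecond": 1e-6,
    "millisecond": 1e-3,
    "second":      1.0,
    "minute":      60.0,
    "hour":        3600.0,
    "day":         86400.0,
    "week":        7 * 86400.0,
    "month":       30.4375 * 86400.0,   # 365.25 / 12 days
    "year":        365.25 * 86400.0,
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Normalize to seconds, then divide by the target unit's size."""
    return value * SECONDS_PER[from_unit] / SECONDS_PER[to_unit]

print(convert(24, "hour", "second"))  # 86400.0
print(convert(24, "hour", "minute"))  # 1440.0
```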

Example Calculations

Project Hours to Work Weeks

A project is estimated at 240 hours. How many calendar weeks is that?

One calendar week contains 168 hours (24 hours times 7 days). Dividing 240 by 168 gives 1.4286 calendar weeks. Note that this is calendar time, not work time. For project planning, you would more likely divide 240 by 40 to get 6 standard work weeks. The distinction matters when estimating deadlines versus actual labor hours.
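The arithmetic in this example, as a quick check:

```python
hours = 240

calendar_weeks = hours / (24 * 7)  # 168 hours in a calendar week
work_weeks = hours / 40            # 40-hour standard work week

print(round(calendar_weeks, 4))  # 1.4286
print(work_weeks)                # 6.0
```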

Age in Days

Converting 30 years into days to see roughly how many days a 30-year-old has been alive.

Using the standard average of 365.25 days per year (which accounts for leap years), 30 years equals 30 multiplied by 365.25, giving 10,957.5 days. The exact count depends on which specific years are involved, since leap years add a day. Someone born on March 1, 1994, would have lived through 8 leap days by their 30th birthday on March 1, 2024, for an exact total of 10,958 days. The 365.25 average smooths this out for estimation purposes.
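Python's standard `date` type can confirm the exact count for a birthday like the one in this example:

```python
from datetime import date

# Exact day count for a March 1, 1994 birthday
born = date(1994, 3, 1)
thirtieth = date(2024, 3, 1)
print((thirtieth - born).days)  # 10958 — 30 * 365 plus 8 leap days

# The 365.25 average lands within half a day
print(30 * 365.25)  # 10957.5
```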

Frequently Asked Questions

How many seconds are in a day?

There are exactly 86,400 seconds in a standard day. That's 24 hours multiplied by 60 minutes per hour multiplied by 60 seconds per minute: 24 × 60 × 60 = 86,400. Occasionally, a leap second is added to keep atomic clocks aligned with the Earth's slightly irregular rotation, making certain days 86,401 seconds long. This has happened 27 times since 1972, most recently in December 2016.

Why do the conversions use 365.25 days per year?

The 365.25 figure accounts for leap years, which add an extra day every four years. Three regular years of 365 days plus one leap year of 366 days averages out to 365.25 days per year. The Gregorian calendar is slightly more precise at 365.2425 days, because century years are skipped as leap years unless divisible by 400. For most practical conversions, 365.25 is accurate enough. The difference amounts to less than one day over a 130-year span.

How many days are in a month?

This converter uses 30.4375 days per month, which is 365.25 divided by 12. Real months range from 28 days (February in common years) to 31 days (January, March, May, July, August, October, December). The average provides a reasonable approximation for general conversions, but if you need precision for a specific date range, you should count the actual days between the two dates rather than relying on an average month length.
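Counting the actual days is a one-liner with Python's `date` type. The start dates below are arbitrary examples chosen to show how much "three months" can vary:

```python
from datetime import date

# "Three months" measured from two different start dates
crossing_february = (date(2026, 4, 15) - date(2026, 1, 15)).days
long_months_only = (date(2026, 6, 15) - date(2026, 3, 15)).days
print(crossing_february, long_months_only)  # 90 92

# The fixed average sits in between
print(3 * 30.4375)  # 91.3125
```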

What is a nanosecond?

A nanosecond is one billionth of a second (0.000000001 seconds). Light travels about 30 centimeters — roughly one foot — in a nanosecond. This unit matters because modern processors execute instructions in nanosecond timeframes. A 4 GHz processor completes one clock cycle every 0.25 nanoseconds. GPS accuracy depends on nanosecond-level timing. Network latency between servers in the same data center is measured in microseconds (thousands of nanoseconds). For high-frequency trading, nanoseconds can determine whether a trade executes profitably.

What is Unix time?

Unix time counts the number of seconds that have elapsed since midnight UTC on January 1, 1970 — a moment called the Unix epoch. Right now, the Unix timestamp is in the range of 1.7 to 1.8 billion. Computers use this system because a single number is much easier to store, sort, and compare than a date string like "March 15, 2026 at 2:30 PM EST." To convert a Unix timestamp to a human-readable date, software divides it into years, months, days, hours, minutes, and seconds while accounting for leap years and time zones.
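That round trip — calendar date to timestamp and back — is built into Python's `datetime` module. The date below is a hypothetical example (given in UTC to sidestep time zone offsets):

```python
from datetime import datetime, timezone

# Calendar date -> Unix timestamp
dt = datetime(2026, 3, 15, 14, 30, tzinfo=timezone.utc)  # hypothetical date
ts = int(dt.timestamp())
print(ts)  # 1773585000

# Unix timestamp -> calendar date
back = datetime.fromtimestamp(ts, tz=timezone.utc)
print(back.isoformat())  # 2026-03-15T14:30:00+00:00
```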
