What Is a Unix Timestamp? Epoch Time Explained Simply
Learn what Unix timestamps are, why they start in 1970, the difference between seconds and milliseconds, and how to convert them in JavaScript, Python, and SQL.
If you've ever debugged an API response and spotted a field like created_at: 1714000000, you were looking at a Unix timestamp. It looks like a random number, but it encodes a precise point in time. Once you understand how it works, reading and converting timestamps becomes second nature.
What Is Unix Time and Why Does It Start in 1970?
A Unix timestamp is simply a count of seconds that have elapsed since January 1, 1970, 00:00:00 UTC — a reference point called the Unix epoch. That specific date was chosen by the developers of Unix at Bell Labs in the early 1970s. They needed a fixed starting point close to "now" that was easy to work with, and January 1, 1970 fit the bill. It had no special technical significance — it was just a round, convenient date.
The key property that makes Unix time so useful is that it is timezone-agnostic. The integer 1714000000 means the exact same instant anywhere on Earth, whether you are in Tokyo, New York, or Mumbai. Timezones only come into play when you convert a timestamp into a human-readable date for display. This makes Unix timestamps ideal for storing, sorting, and comparing dates across distributed systems.
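A quick way to see this in Python (using the standard-library `zoneinfo` module, available since Python 3.9; the three zones are arbitrary examples):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1714000000  # one instant, independent of any timezone

print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
# 2024-04-24T23:06:40+00:00
print(datetime.fromtimestamp(ts, tz=ZoneInfo("Asia/Tokyo")).isoformat())
# 2024-04-25T08:06:40+09:00
print(datetime.fromtimestamp(ts, tz=ZoneInfo("America/New_York")).isoformat())
# 2024-04-24T19:06:40-04:00
```

Same integer in, three different wall-clock renderings out: only the display changes, never the instant.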
You encounter Unix timestamps constantly in software: database columns (created_at, expires_at), HTTP headers (Last-Modified), log files, JWT tokens, and API responses from virtually every major platform.
Seconds vs Milliseconds — A Source of Endless Bugs
Here is where most of the confusion comes from. There are two common flavors of Unix timestamp, and mixing them up causes errors that are off by a factor of 1000:
- 10-digit timestamps count in seconds: `1714000000` → April 24, 2024
- 13-digit timestamps count in milliseconds: `1714000000000` → the same moment
Different platforms and languages use different conventions (a defensive normalization sketch follows this list):

- JavaScript's `Date.now()` returns milliseconds (13 digits). So do most browser APIs.
- Python's `time.time()` returns seconds as a float: `1714000000.123456`
- Unix shell `date +%s` returns seconds (10 digits)
- The GitHub API uses seconds in most fields
- The Stripe API uses seconds for `created` fields
- AWS CloudTrail uses milliseconds in `eventTime` epoch fields
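When ingesting timestamps from mixed sources, a common defensive move is to guess the unit from the magnitude. Here is a minimal sketch in Python; the helper name `to_seconds` and the `1e12` cutoff are illustrative choices, not a standard API:

```python
def to_seconds(ts: float) -> float:
    """Normalize a Unix timestamp to seconds.

    Heuristic: second-based timestamps for modern dates have ~10 digits,
    millisecond-based ones have ~13, so 1e12 is a safe dividing line.
    """
    return ts / 1000 if abs(ts) >= 1e12 else ts

print(to_seconds(1714000000))     # 1714000000 (already seconds)
print(to_seconds(1714000000000))  # 1714000000.0 (was milliseconds)
```

This is a heuristic, not a guarantee, but a seconds value will not reach 10^12 for tens of thousands of years.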
How to Convert Timestamps in Code
JavaScript
```javascript
// Current timestamp in milliseconds (JS default)
const nowMs = Date.now() // e.g. 1714000000000
const nowSec = Math.floor(nowMs / 1000) // e.g. 1714000000

// Convert a seconds timestamp to a Date object
const date = new Date(1714000000 * 1000)
console.log(date.toISOString()) // "2024-04-24T23:06:40.000Z"

// Convert a milliseconds timestamp
const dateMs = new Date(1714000000000)
console.log(dateMs.toISOString()) // same result

// Display in a specific timezone
const formatted = new Intl.DateTimeFormat('en-US', {
  timeZone: 'Asia/Kolkata',
  dateStyle: 'full',
  timeStyle: 'long',
}).format(date)
// "Thursday, April 25, 2024 at 4:36:40 AM IST"
```
Python
```python
import time
import datetime
from datetime import timezone

# Current Unix timestamp (seconds)
now = time.time()  # e.g. 1714000000.123

# Convert to datetime (local timezone)
dt = datetime.datetime.fromtimestamp(1714000000)

# Convert to datetime (UTC); note that utcfromtimestamp() is
# deprecated since Python 3.12 in favor of the aware form below
dt_utc = datetime.datetime.utcfromtimestamp(1714000000)
print(dt_utc.isoformat())  # "2024-04-24T23:06:40"

# Timezone-aware (Python 3.2+)
dt_aware = datetime.datetime.fromtimestamp(1714000000, tz=timezone.utc)
```
SQL
```sql
-- MySQL
SELECT FROM_UNIXTIME(1714000000);
-- Returns: 2024-04-24 23:06:40 (in the session's time zone)
SELECT FROM_UNIXTIME(1714000000, '%Y-%m-%d %H:%i:%s UTC');

-- PostgreSQL
SELECT TO_TIMESTAMP(1714000000);
-- Returns: 2024-04-24 23:06:40+00

-- SQLite
SELECT datetime(1714000000, 'unixepoch');
-- Returns: 2024-04-24 23:06:40
```
Common Gotchas
Timezone assumptions. The most frequent bug: code that converts a Unix timestamp assumes the server's local timezone rather than UTC. Always convert to UTC first, then localise for display only.
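In Python, for instance, the safe pattern looks like this (a sketch; `Asia/Kolkata` is just an example display zone):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1714000000

# Step 1: convert to UTC -- unambiguous, independent of server config
utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)

# Step 2: attach a display timezone only at the presentation edge
local_dt = utc_dt.astimezone(ZoneInfo("Asia/Kolkata"))
print(local_dt.isoformat())  # 2024-04-25T04:36:40+05:30
```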
DST transitions. When converting a local datetime to a timestamp, times during Daylight Saving Time transitions can be ambiguous (the same clock time occurs twice). When possible, store and compute in UTC to avoid this entirely.
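Python models this ambiguity explicitly with the `fold` attribute (PEP 495). A sketch using `America/New_York` as an example zone, where clocks fall back on November 3, 2024 and 1:30 AM occurs twice:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")

# The wall-clock time 2024-11-03 01:30 exists twice: once in EDT, once in EST
first = datetime(2024, 11, 3, 1, 30, tzinfo=tz)           # fold=0 -> EDT
second = datetime(2024, 11, 3, 1, 30, fold=1, tzinfo=tz)  # fold=1 -> EST

print(first.timestamp())   # 1730611800.0
print(second.timestamp())  # 1730615400.0 (one hour later)
```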
Off-by-1000 errors. Forgetting to multiply seconds by 1000 in JavaScript (or to divide milliseconds by 1000 in Python) produces dates that are wildly wrong. A seconds value read as milliseconds lands near January 1970; a milliseconds value read as seconds lands tens of thousands of years in the future. If you see a date near January 1, 1970, you forgot to multiply; if you see a date millennia ahead, you multiplied when you should not have.
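The first failure mode is easy to reproduce in a couple of lines:

```python
from datetime import datetime, timezone

ts = 1714000000  # a seconds timestamp

print(datetime.fromtimestamp(ts, tz=timezone.utc))
# 2024-04-24 23:06:40+00:00 -- correct

# Wrong: treating the seconds value as milliseconds squashes
# 54 years into under three weeks
print(datetime.fromtimestamp(ts / 1000, tz=timezone.utc))
# 1970-01-20 20:06:40+00:00
```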
Negative timestamps. Unix timestamps can be negative — they represent moments before January 1, 1970. -1 is December 31, 1969, 23:59:59 UTC. Most systems handle this correctly, but some older databases or 32-bit systems do not.
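A quick check in Python (the Apollo 11 landing is just an illustrative pre-epoch date):

```python
from datetime import datetime, timezone

# -1 is the last second before the epoch
print(datetime.fromtimestamp(-1, tz=timezone.utc))
# 1969-12-31 23:59:59+00:00

# July 20, 1969, 20:17 UTC falls before the epoch,
# so its timestamp is negative
print(datetime(1969, 7, 20, 20, 17, tzinfo=timezone.utc).timestamp())
# -14182980.0
```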
The Year 2038 Problem
On January 19, 2038, at 03:14:07 UTC, 32-bit signed integers that store Unix timestamps will overflow. The maximum value a signed 32-bit integer can hold is 2,147,483,647 — which corresponds to that exact moment. One second later, the counter rolls over to −2,147,483,648, which represents December 13, 1901.
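You can verify the boundary yourself. Python's integers never overflow, so this sketch only demonstrates the arithmetic, not the failure:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647

print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, a 32-bit counter wraps around to -2**31
print(datetime.fromtimestamp(-(2**31), tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```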
Modern 64-bit systems are completely unaffected — a 64-bit timestamp can represent dates approximately 292 billion years from now. But legacy embedded systems (industrial controllers, old network equipment, 32-bit databases), and software compiled with 32-bit time_t, may still be at risk. Several patches to Linux, databases, and programming language runtimes have already been made, but the 2038 problem is still a live concern for systems that cannot be easily upgraded.
Convert Any Timestamp Instantly
Use our free Unix Timestamp Converter to convert any epoch value to a human-readable date in any timezone, or convert any date back to a Unix timestamp. Supports seconds, milliseconds, batch conversion, and a live ticker showing the current epoch time — all in your browser.
Read next: What Is My IP Address and What Does It Reveal About You? — another deep-dive into a number that follows you everywhere online.