A Unix timestamp is the number of seconds that have elapsed since midnight on 1 January 1970 UTC — a moment known as the Unix epoch. It is the universal way computers represent a specific point in time as a plain integer. Right now, the Unix timestamp is somewhere around 1,750,000,000.

Why not just store dates as strings?

Storing dates as human-readable strings like "2026-01-30 14:30:00" creates several problems:

- Ambiguity: without an explicit timezone, "14:30" can refer to a different moment in every region.
- Sorting: strings only sort chronologically if the format is chosen carefully, and mixed formats break sorting entirely.
- Arithmetic: answering "how many seconds between these two dates?" requires parsing, not simple subtraction.
- Inconsistent formats: "01/30/2026", "30/01/2026", and "2026-01-30" all describe the same day.

A Unix timestamp solves all of these. It is a single integer, always in UTC, always sortable by standard numeric sort, and easy to do arithmetic on.
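To make the sorting and arithmetic claims concrete, here is a minimal Python sketch (the timestamp values are arbitrary examples):

```python
# Timestamps sort correctly with a plain numeric sort, no date parsing needed.
events = [1738252800, 1700000000, 1750000000]  # arbitrary example values
events.sort()  # now in chronological order

# Date arithmetic is plain integer arithmetic.
one_day = 24 * 60 * 60             # 86,400 seconds
tomorrow = events[-1] + one_day    # same moment, one day later
elapsed = events[-1] - events[0]   # seconds between first and last event
```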

Unix timestamps count seconds from the epoch. JavaScript's Date.now() returns milliseconds, so JavaScript timestamps are 1000× larger than standard Unix timestamps. Always check which unit a system uses before comparing timestamps from different sources.
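One common defensive pattern, sketched here in Python, is to guess the unit from the timestamp's magnitude (the threshold is a heuristic, not a standard: second-precision timestamps stay below 100 billion until roughly the year 5138, so anything larger is almost certainly milliseconds):

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp to seconds, guessing the unit from its magnitude.

    Heuristic: second-precision timestamps stay below 1e11 until ~year 5138,
    so any larger value is assumed to be milliseconds.
    """
    if ts >= 100_000_000_000:  # looks like milliseconds
        return ts // 1000
    return ts

print(to_seconds(1738252800000))  # milliseconds in -> 1738252800
print(to_seconds(1738252800))     # already seconds -> unchanged
```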

Converting timestamps in different languages

// JavaScript — get current timestamp (milliseconds)
const msTimestamp = Date.now(); // e.g. 1738252800000
const secTimestamp = Math.floor(msTimestamp / 1000);

// Convert timestamp to Date object
const date = new Date(1738252800 * 1000);
console.log(date.toISOString()); // "2025-01-30T16:00:00.000Z"
# Python
import time
from datetime import datetime, timezone

# Get current Unix timestamp
timestamp = int(time.time())

# Convert timestamp to datetime
dt = datetime.fromtimestamp(1738252800, tz=timezone.utc)
print(dt.isoformat())  # 2025-01-30T16:00:00+00:00
-- SQL (MySQL)
-- Get current timestamp
SELECT UNIX_TIMESTAMP() AS ts;

-- Convert timestamp to readable date
SELECT FROM_UNIXTIME(1738252800) AS readable_date;

Reference: notable Unix timestamps

Unix timestamp    Date (UTC)
0                 1 January 1970 00:00:00
1,000,000,000     9 September 2001 01:46:40
1,500,000,000     14 July 2017 02:40:00
1,700,000,000     14 November 2023 22:13:20
2,000,000,000     18 May 2033 03:33:20
2,147,483,647     19 January 2038 03:14:07 (32-bit overflow)
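Each entry above can be checked directly with Python's datetime module:

```python
from datetime import datetime, timezone

# Landmark timestamps convert to the UTC dates shown in the table.
print(datetime.fromtimestamp(0, tz=timezone.utc))              # 1970-01-01 00:00:00+00:00
print(datetime.fromtimestamp(1_000_000_000, tz=timezone.utc))  # 2001-09-09 01:46:40+00:00
print(datetime.fromtimestamp(2_147_483_647, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00
```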
The Year 2038 problem: Systems that store Unix timestamps as a 32-bit signed integer will overflow on 19 January 2038 at 03:14:07 UTC. This is the timestamp equivalent of Y2K. Modern systems use 64-bit integers, which extend the range to roughly 292 billion years.
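The overflow can be simulated in Python with a helper that mimics 32-bit signed wraparound (a sketch; real failures happen in systems whose time_t is a 32-bit C integer):

```python
from datetime import datetime, timezone

def as_int32(n: int) -> int:
    """Wrap an integer the way a 32-bit signed int would."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

# One second past the 32-bit maximum wraps to a large negative number...
wrapped = as_int32(2_147_483_647 + 1)
print(wrapped)  # -2147483648

# ...which an affected system would interpret as a date in 1901.
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```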

When to use timestamps vs formatted date strings