Unix Timestamp Converter
Convert between Unix epoch timestamps and human-readable dates
Current Unix Timestamp
Live -- updates every second.
Timestamp → Date
Enter a Unix timestamp to convert it into a human-readable date.
Date → Timestamp
Pick any date and time to get its Unix timestamp.
How to Convert Unix Timestamps
Converting between Unix timestamps and human-readable dates with this free online tool takes just a few seconds. The page is split into two converters that work in real time, so you never have to click a "Convert" button -- results update the moment you type.
- Convert a timestamp into a date -- paste or type a Unix timestamp into the "Timestamp → Date" field. Use the toggle to indicate whether your value is in seconds (the most common Unix format) or milliseconds (the format JavaScript and many JSON APIs return). The tool instantly shows the local date and time, the UTC equivalent, the canonical ISO 8601 string, and a friendly "relative time" such as "3 days ago" or "in 2 hours".
- Convert a date into a timestamp -- pick a calendar date and time using the "Date → Timestamp" picker, or click "Now" to grab the exact current time from your computer's clock. The tool returns the Unix timestamp in both seconds and milliseconds, plus the ISO 8601 representation, so you can copy whichever format your application expects.
- Copy with one click -- every output value has a copy button next to it. Click it and the result is in your clipboard, ready to paste into a database query, a script, a log line, a Slack message, or wherever else you need it.
- Watch the live clock -- the box at the top of the page displays the current Unix timestamp updating every second. It is handy when you just need to grab "now" for a request, an SQL insert, or a unit test.
Everything runs locally in your browser using the native JavaScript Date object. No data is sent to any server, so the tool works offline once the page has loaded.
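If you are curious what that looks like in code, the sketch below mirrors the logic described above using nothing but the built-in Date object. It is a minimal illustration, not the tool's actual source, and the function names are made up for the example:

```js
// Timestamp -> Date: the Date constructor expects milliseconds since
// the epoch, so second-precision values must be multiplied by 1000.
function timestampToDate(value, unit = "seconds") {
  const d = new Date(unit === "seconds" ? value * 1000 : value);
  return {
    local: d.toString(),   // rendered in the browser's time zone
    utc: d.toUTCString(),  // the same instant rendered in UTC
    iso: d.toISOString(),  // canonical ISO 8601, always UTC ("Z")
  };
}

// Date -> Timestamp: getTime() returns milliseconds; truncate a
// division by 1000 to get the conventional seconds form.
function dateToTimestamp(date) {
  const ms = date.getTime();
  return { seconds: Math.floor(ms / 1000), milliseconds: ms };
}

// And a live clock like the one at the top of the page:
setInterval(() => console.log(Math.floor(Date.now() / 1000)), 1000);
```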
What is a Unix Timestamp (Epoch Time)?
A Unix timestamp -- also called Unix time, POSIX time, or epoch time -- is a single integer that represents a specific moment in time as the number of seconds elapsed since 00:00:00 UTC on January 1, 1970. That reference moment is known as the "Unix epoch", and the format was introduced with the early Unix operating systems in the 1970s. Because the count is anchored to UTC, a Unix timestamp is unambiguous worldwide: the same integer refers to the same instant whether you are in Tokyo, London, or São Paulo.
Unix timestamps quickly became the de facto standard for representing time in computing because they are compact, sortable, easy to compare, and trivial to do arithmetic with. Subtracting two timestamps yields the number of seconds between them. Adding 86,400 advances a date by one day. There are no time zones, daylight saving rules, leap years, or month-length quirks to worry about at the storage layer -- those concerns are pushed up to the presentation layer, where they belong.
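A quick illustration of that arithmetic (the variable names are invented for the example):

```js
const launch = 1700000000;                 // 2023-11-14 22:13:20 UTC
const now = Math.floor(Date.now() / 1000); // current time, in seconds

const elapsed = now - launch;   // subtraction gives elapsed seconds
const nextDay = launch + 86400; // adding 86,400 s advances one day

console.log(new Date(nextDay * 1000).toUTCString());
// "Wed, 15 Nov 2023 22:13:20 GMT"
```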
Today Unix time is everywhere: file system metadata, databases, HTTP headers, JSON Web Tokens, log aggregators, build artifacts, blockchain blocks, sensor readings, and most APIs you will ever consume. Whenever you see a giant number that "looks like a timestamp", odds are it is Unix time in either seconds or milliseconds.
Seconds vs Milliseconds: Which Format Should I Use?
Unix timestamps come in two common precisions, and confusing them is one of the most frequent sources of "off by a thousand" bugs in real-world code.
- Seconds -- the original Unix format. Most operating system APIs (POSIX time(), stat()), most databases (PostgreSQL EXTRACT(EPOCH FROM ...), MySQL UNIX_TIMESTAMP()), most HTTP headers, JWT iat/exp claims, and most public web APIs use seconds. As of 2026 a current timestamp in seconds is a 10-digit integer (around 1,770,000,000).
- Milliseconds -- one thousand times finer. JavaScript chose milliseconds for Date.now() and new Date().getTime(), so you will see them everywhere data is produced or consumed by browser code: JSON payloads, analytics events, NoSQL stores like MongoDB, and front-end logs. A current millisecond timestamp is a 13-digit integer.
A simple rule of thumb: if your number has 10 digits it is seconds, and if it has 13 digits it is milliseconds. The toggle on this page lets you switch instantly, so you can paste a value, see "1970" appear in the output, realize you picked the wrong unit, and fix it with one click.
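That rule of thumb is easy to automate. A minimal sketch of the heuristic, with its limits noted in the comments:

```js
// Treat values below 1e11 (at most 11 digits) as seconds, larger
// ones as milliseconds, and normalize everything to milliseconds.
// Heuristic only: millisecond values before ~1973 and second values
// after ~5138 would be misclassified.
function normalizeToMs(ts) {
  return ts < 1e11 ? ts * 1000 : ts;
}

console.log(new Date(normalizeToMs(1700000000)).toISOString());    // seconds in
console.log(new Date(normalizeToMs(1700000000000)).toISOString()); // milliseconds in
// Both print "2023-11-14T22:13:20.000Z"
```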
ISO 8601 vs Unix Timestamp: When to Use Each
ISO 8601 is the international standard for writing dates and times as strings. The format looks like 2026-05-03T08:30:00.000Z, where the trailing Z means "UTC". Unlike Unix time, ISO 8601 is human-readable -- you can glance at the string and immediately know the year, month, day, hour, minute, and second.
Use ISO 8601 when humans (and human-friendly logs) will read the output, when you need to preserve a specific time zone offset, or when you are working with formats that mandate it (Atom feeds, RFC 3339 APIs, structured logs, GraphQL DateTime scalars). Use Unix time when you need the smallest, fastest, easiest-to-sort representation -- inside databases, for cache keys, for arithmetic, or for binary protocols. The two formats are interchangeable: every tool, including this one, can convert between them losslessly. A common pattern in modern systems is to store times as Unix timestamps and display them as ISO 8601 (or as locale-formatted strings) at the very edges of the application.
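The round trip is a one-liner in each direction; for example:

```js
const ts = 1700000000; // Unix seconds

// Unix -> ISO 8601 (toISOString always renders UTC, hence the "Z")
const iso = new Date(ts * 1000).toISOString(); // "2023-11-14T22:13:20.000Z"

// ISO 8601 -> Unix (Date.parse returns milliseconds)
const back = Math.floor(Date.parse(iso) / 1000);

console.log(back === ts); // true -- nothing is lost either way
```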
The Year 2038 Problem
Many older systems store Unix timestamps inside a signed 32-bit integer. The largest value such an integer can hold is 2,147,483,647, which corresponds to 03:14:07 UTC on Tuesday, January 19, 2038. One second later, the counter overflows into a negative number and the system suddenly believes it is December 13, 1901. This is the "Year 2038 Problem", sometimes called Y2K38 or the Epochalypse, and it is a genuine concern for embedded systems, legacy databases, file formats, and network protocols that have not yet migrated to 64-bit time.
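JavaScript's numbers are 64-bit floats, so they do not overflow at this boundary, but you can simulate the wraparound with the | 0 coercion, which truncates to a signed 32-bit integer much like a 32-bit time_t does in C:

```js
const MAX_INT32 = 2147483647;

// The last instant representable in signed 32-bit Unix time:
console.log(new Date(MAX_INT32 * 1000).toUTCString());
// "Tue, 19 Jan 2038 03:14:07 GMT"

// One second later, a 32-bit counter wraps to its minimum value:
const wrapped = (MAX_INT32 + 1) | 0; // -2147483648
console.log(new Date(wrapped * 1000).toUTCString());
// "Fri, 13 Dec 1901 20:45:52 GMT"
```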
Modern operating systems, programming languages, and databases have already moved to 64-bit timestamps, which push the overflow date roughly 292 billion years into the future -- comfortably past the expected lifetime of the sun. But code is long-lived, and plenty of devices in the field (industrial controllers, network appliances, satellite firmware, automotive ECUs) will still be running 32-bit Unix time in 2038. If you maintain such a system, the time to plan a migration is now, not on January 18, 2038.
Common Unix Timestamps
A quick reference for memorable moments expressed as Unix time:
- 0 -- the Unix epoch itself: 1970-01-01 00:00:00 UTC.
- 1000000000 -- "one billion seconds": 2001-09-09 01:46:40 UTC. Programmers around the world held parties for it.
- 1234567890 -- a numeric novelty: 2009-02-13 23:31:30 UTC.
- 1500000000 -- 2017-07-14 02:40:00 UTC. Useful as an "after this app launched" sentinel.
- 1700000000 -- 2023-11-14 22:13:20 UTC.
- 2000000000 -- 2033-05-18 03:33:20 UTC. The next billion-second milestone.
- 2147483647 -- 2038-01-19 03:14:07 UTC. The Year 2038 cutoff for signed 32-bit time.
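Every entry above can be checked in a browser console in one line:

```js
[0, 1000000000, 1234567890, 1500000000, 1700000000, 2000000000, 2147483647]
  .forEach((ts) => console.log(ts, "->", new Date(ts * 1000).toISOString()));
```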
Frequently Asked Questions
What is epoch time?
Epoch time is another name for Unix time: the number of seconds that have elapsed since 00:00:00 UTC on January 1, 1970, ignoring leap seconds. The "epoch" is simply the chosen zero point. Some systems use different epochs (Windows file times count 100-nanosecond intervals since 1601, for example), but when people say "epoch time" without qualification they almost always mean Unix epoch.
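The Windows case makes a nice illustration of how epochs differ. Here is a hedged sketch of the conversion -- the constant 11644473600 is the number of seconds between the 1601 and 1970 epochs, and BigInt is needed because FILETIME values exceed Number.MAX_SAFE_INTEGER:

```js
// Seconds separating 1601-01-01 from 1970-01-01 UTC.
const EPOCH_GAP_SECONDS = 11644473600n;

// FILETIME counts 100-nanosecond ticks, i.e. 10,000,000 per second.
function filetimeToUnixSeconds(filetime) {
  return filetime / 10_000_000n - EPOCH_GAP_SECONDS;
}

// The Unix epoch itself, expressed as a FILETIME, maps back to 0:
console.log(filetimeToUnixSeconds(116444736000000000n)); // 0n
```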
Why is Unix time used so widely?
Unix time is a single integer, which makes it tiny to store, fast to compare, easy to sort, and trivial to do math with. It avoids every nasty issue that comes with calendar arithmetic: time zones, daylight saving transitions, locale-specific formatting, leap days, and varying month lengths. Those concerns still exist, but they only matter when displaying time to a human, not when storing or transmitting it. That separation of concerns is a major reason Unix time has outlasted dozens of competing formats.
How do I get the current Unix timestamp in JavaScript?
In JavaScript use Math.floor(Date.now() / 1000) for seconds, or simply Date.now() for milliseconds. Date.now() is a static method that always returns the current time in milliseconds since the epoch, with no need to allocate a Date object.
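In full, with both variants side by side:

```js
const seconds = Math.floor(Date.now() / 1000); // e.g. 1700000000
const milliseconds = Date.now();               // e.g. 1700000000000

// Equivalent to Date.now(), but allocates a Date object first:
const alsoMilliseconds = new Date().getTime();
```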
How do I get the current Unix timestamp in Python?
In Python use int(time.time()) for seconds (after import time), or time.time_ns() // 1_000_000 for milliseconds without floating-point rounding. With the more modern datetime module you can also write int(datetime.datetime.now(datetime.timezone.utc).timestamp()).
How do I get the current Unix timestamp in SQL?
It depends on the database. PostgreSQL: SELECT EXTRACT(EPOCH FROM NOW())::bigint;. MySQL and MariaDB: SELECT UNIX_TIMESTAMP();. SQLite: SELECT strftime('%s', 'now');. SQL Server: SELECT DATEDIFF(SECOND, '1970-01-01', GETUTCDATE());. All of these return seconds; multiply by 1000 if you need milliseconds.
What time zone is a Unix timestamp in?
Unix timestamps are always anchored to UTC. They have no time zone of their own -- the integer simply counts seconds since a fixed UTC moment. Time zones come into play only when you convert a timestamp into a calendar representation (for example "May 3, 2026 at 10:30 AM"), at which point you choose which zone to display it in. That is why this tool shows you the local time and the UTC time side by side: the underlying timestamp is identical, only the formatting differs.
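You can see several renderings of one instant in a couple of lines (the Tokyo zone is just an arbitrary example):

```js
const d = new Date(1700000000 * 1000); // one instant, no zone attached

console.log(d.toISOString());    // "2023-11-14T22:13:20.000Z" -- UTC
console.log(d.toLocaleString()); // the same instant in your local zone
console.log(d.toLocaleString("en-US", { timeZone: "Asia/Tokyo" }));
// the same instant again, rendered for Tokyo
```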
Is my data sent anywhere?
No. The entire converter runs in your browser using vanilla JavaScript and the built-in Date object. Nothing you type ever leaves your device. You can verify this by opening your browser's developer tools, switching to the Network tab, and confirming that no requests are made while you use the tool.