What is a Unix Timestamp?
A Unix timestamp (also called epoch time) is the number of seconds elapsed since January 1, 1970 at 00:00:00 UTC (the Unix epoch), not counting leap seconds. It is used throughout programming, databases, and APIs to represent moments in time in a timezone-independent way.
Unix Timestamp = (Date - Jan 1, 1970) in seconds
JavaScript: Math.floor(Date.now() / 1000)
Python: import time; int(time.time())
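The one-liners above can be expanded into a round trip: get the current timestamp, then convert a timestamp back to a human-readable UTC date. A minimal sketch using only the Python standard library:

```python
import time
from datetime import datetime, timezone

# Current Unix timestamp, truncated to whole seconds.
now = int(time.time())

# Convert a timestamp back to a human-readable UTC datetime.
dt = datetime.fromtimestamp(now, tz=timezone.utc)
print(dt.isoformat())

# Sanity check: the epoch itself corresponds to timestamp 0.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
assert int(epoch.timestamp()) == 0
```

Passing `tz=timezone.utc` matters: without it, `fromtimestamp` interprets the value in the machine's local timezone, which reintroduces the timezone dependence the timestamp was meant to avoid.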
Frequently Asked Questions
What is the Year 2038 problem?
Unix timestamps are stored as 32-bit signed integers in many older systems, which can only represent values up to 03:14:07 UTC on January 19, 2038. At that moment the counter overflows and wraps to a negative value. Modern systems use 64-bit integers, which can represent dates billions of years into the future.
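The exact rollover moment follows directly from the size of a 32-bit signed integer, as this short check shows:

```python
from datetime import datetime, timezone

# Largest value a 32-bit signed integer can hold.
INT32_MAX = 2**31 - 1  # 2147483647

# The last second a 32-bit time_t can represent.
rollover = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```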
Why do some timestamps have 13 digits?
Many systems (like JavaScript's Date.now()) return milliseconds since the epoch rather than seconds. Divide by 1000 to get seconds. For contemporary dates, a millisecond timestamp is 13 digits and a second timestamp is 10 digits.
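The digit-count rule above can be turned into a simple normalizer. The helper name `normalize_to_seconds` and its threshold are illustrative, not a library function; the heuristic assumes timestamps from the current era:

```python
def normalize_to_seconds(ts: int) -> int:
    """Convert a possibly-millisecond timestamp to seconds.

    Heuristic: values of 13+ digits (>= 10**12) are assumed to be
    milliseconds; 10-digit values are already seconds.
    """
    if ts >= 10**12:  # 13 or more digits -> milliseconds
        return ts // 1000
    return ts

# A 13-digit millisecond value is divided down; a 10-digit value passes through.
assert normalize_to_seconds(1700000000000) == 1700000000
assert normalize_to_seconds(1700000000) == 1700000000
```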