Pandas: Using Unix epoch timestamp as Datetime index


My application involves dealing with data (contained in a CSV) which is of the following form:

Epoch (number of seconds since Jan 1, 1970), Value

Currently I read the CSV using NumPy's loadtxt method (I could just as easily use read_csv from Pandas). For my series, I am converting the timestamp field as follows:

timestamp_date = [datetime.datetime.fromtimestamp(ts) for ts in timestamp_column]

I follow this by setting timestamp_date as the Datetime index of my DataFrame. I searched in several places to see whether there is a quicker (built-in) way of using these Unix epoch timestamps, but could not find one. A lot of applications make use of timestamps in this form.

  1. Is there an inbuilt method for handling such timestamp formats?
  2. If not, what is the recommended way of handling these formats?
5/13/2013 7:56:59 AM

Accepted Answer

Convert them to datetime64[s]:

np.array([1368431149, 1368431150]).astype('datetime64[s]')
# array(['2013-05-13T07:45:49', '2013-05-13T07:45:50'], dtype='datetime64[s]')
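The resulting datetime64 array can be passed straight to a DatetimeIndex, which avoids the per-row Python loop in the question. A minimal sketch (the column name and values are illustrative, not from the original CSV):

```python
import numpy as np
import pandas as pd

# Illustrative epoch-seconds data and values
epochs = np.array([1368431149, 1368431150])
values = [10.0, 12.5]

# Vectorized conversion: the whole array at once, no list comprehension
dt_index = pd.DatetimeIndex(epochs.astype('datetime64[s]'))

df = pd.DataFrame({'value': values}, index=dt_index)
print(df.index[0])  # 2013-05-13 07:45:49
```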
5/13/2013 7:58:26 AM

You can also use pandas to_datetime:

df['datetime'] = pd.to_datetime(df["timestamp"], unit='s')

This method requires Pandas 0.18 or later.
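Putting it together with the CSV-reading step from the question, one possible end-to-end sketch (the column names `timestamp` and `value` are assumptions; the in-memory CSV stands in for the real file):

```python
import io
import pandas as pd

# Stand-in for the CSV file described in the question
csv_data = io.StringIO("timestamp,value\n1368431149,10.0\n1368431150,12.5\n")

df = pd.read_csv(csv_data)
# unit='s' tells to_datetime the integers are seconds since the Unix epoch
df['datetime'] = pd.to_datetime(df['timestamp'], unit='s')
df = df.set_index('datetime')
```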

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow