pyspark.sql.functions.from_unixtime

pyspark.sql.functions.from_unixtime(timestamp, format='yyyy-MM-dd HH:mm:ss')
Converts the number of seconds from unix epoch (1970-01-01 00:00:00 UTC) to a string representing the timestamp of that moment in the current system time zone, in the given format.

New in version 1.5.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
timestamp : Column or column name
    column of unix time values.
format : literal string, optional
    format to use to convert to (default: 'yyyy-MM-dd HH:mm:ss').
 
Returns
Column
    formatted timestamp as string.
 
Examples

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(1428476400,)], ['unix_time'])
>>> df.select('*', sf.from_unixtime('unix_time')).show()
+----------+---------------------------------------------+
| unix_time|from_unixtime(unix_time, yyyy-MM-dd HH:mm:ss)|
+----------+---------------------------------------------+
|1428476400|                          2015-04-08 00:00:00|
+----------+---------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
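For intuition, the per-row conversion can be sketched in plain Python. The helper below (`from_unixtime_py`, a hypothetical name, not part of PySpark) reproduces the example above with the standard-library `zoneinfo` module. Note one deliberate difference: Spark accepts JVM datetime patterns such as 'yyyy-MM-dd HH:mm:ss', while Python's `strftime` uses '%Y-%m-%d %H:%M:%S' for the same layout.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def from_unixtime_py(seconds: int,
                     tz: str = "America/Los_Angeles",
                     fmt: str = "%Y-%m-%d %H:%M:%S") -> str:
    """Render seconds-since-epoch as a formatted string in the given
    time zone, mirroring what from_unixtime does for each row."""
    # Interpret the integer as a UTC instant, then localize it.
    return datetime.fromtimestamp(seconds, tz=ZoneInfo(tz)).strftime(fmt)

print(from_unixtime_py(1428476400))  # 2015-04-08 00:00:00
```

The same instant, 1428476400 seconds, is 2015-04-08 07:00:00 UTC; the session time zone is what shifts it to midnight Pacific time, which is why the doctest pins spark.sql.session.timeZone before calling show().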