pyspark.sql.functions.unix_micros

pyspark.sql.functions.unix_micros(col)
Returns the number of microseconds since 1970-01-01 00:00:00 UTC.
New in version 3.5.0.
- Parameters
  - col : Column or column name
    input column of values to convert.
- Returns
  - Column
    the number of microseconds since 1970-01-01 00:00:00 UTC.
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('2015-07-22 10:00:00',), ('2022-10-09 11:12:13',)], ['ts'])
>>> df.select('*', sf.unix_micros(sf.to_timestamp('ts'))).show()
+-------------------+-----------------------------+
|                 ts|unix_micros(to_timestamp(ts))|
+-------------------+-----------------------------+
|2015-07-22 10:00:00|             1437584400000000|
|2022-10-09 11:12:13|             1665339133000000|
+-------------------+-----------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
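For reference, the first epoch value in the example above can be reproduced in plain Python, which makes the semantics concrete: the input is interpreted in the session time zone (`America/Los_Angeles` in the example), converted to UTC, and the elapsed time since `1970-01-01 00:00:00 UTC` is counted in microseconds. This is a standard-library sketch, not Spark's implementation:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Interpret the example timestamp in the America/Los_Angeles time zone,
# mirroring the spark.sql.session.timeZone setting used above.
ts = datetime(2015, 7, 22, 10, 0, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Count whole microseconds elapsed since the Unix epoch (UTC).
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
micros = int((ts - epoch).total_seconds() * 1_000_000)
print(micros)  # 1437584400000000, matching the unix_micros output above
```

Note that a naive `datetime` (no `tzinfo`) would be ambiguous here; `unix_micros` resolves that ambiguity via the session time zone, which is why the example sets and later unsets `spark.sql.session.timeZone`.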