pyspark.sql.functions.date_sub
- pyspark.sql.functions.date_sub(start, days)
Returns the date that is days days before start. If days is a negative value, that number of days is added to start instead.
New in version 1.5.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
start : Column or str
the date column to work on.
days : Column or str or int
how many days to subtract from start. Accepts a negative value to add days instead.
- Returns
Column
a date the given number of days before (or, for negative days, after) start.
See also
pyspark.sql.functions.date_add
Examples
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('2015-04-08', 2,)], 'struct<dt:string,a:int>')
>>> df.select('*', sf.date_sub(df.dt, 1)).show()
+----------+---+---------------+
|        dt|  a|date_sub(dt, 1)|
+----------+---+---------------+
|2015-04-08|  2|     2015-04-07|
+----------+---+---------------+
>>> df.select('*', sf.date_sub('dt', 'a')).show()
+----------+---+---------------+
|        dt|  a|date_sub(dt, a)|
+----------+---+---------------+
|2015-04-08|  2|     2015-04-06|
+----------+---+---------------+
>>> df.select('*', sf.date_sub('dt', sf.lit(-1))).show()
+----------+---+----------------+
|        dt|  a|date_sub(dt, -1)|
+----------+---+----------------+
|2015-04-08|  2|      2015-04-09|
+----------+---+----------------+
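As an additional sketch (not part of the original examples): because a negative days value moves the date forward, date_sub with a positive value and date_add with the corresponding negative value produce the same result. The DataFrame df2 and the column aliases below are illustrative names; the snippet assumes the same active SparkSession bound to spark as above, and the output shown is the expected result under those assumptions.

>>> import pyspark.sql.functions as sf
>>> df2 = spark.createDataFrame([('2015-04-08',)], 'struct<dt:string>')
>>> df2.select(
...     sf.date_sub('dt', 3).alias('three_before'),
...     sf.date_add('dt', -3).alias('via_date_add')
... ).show()
+------------+------------+
|three_before|via_date_add|
+------------+------------+
|  2015-04-05|  2015-04-05|
+------------+------------+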