pyspark.sql.functions.next_day
pyspark.sql.functions.next_day(date, dayOfWeek)
Returns the first date that is later than the value of the date column and falls on the day of the week given by the second argument.
New in version 1.5.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
  - date : Column or column name
    target column to compute on.
  - dayOfWeek : literal string
    day of the week, case-insensitive, accepts:
    “Mon”, “Tue”, “Wed”, “Thu”, “Fri”, “Sat”, “Sun”
- Returns
  - Column
    the column of computed results.
Examples
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-07-27',)], ['dt'])
>>> df.select('*', sf.next_day(df.dt, 'Sun')).show()
+----------+-----------------+
|        dt|next_day(dt, Sun)|
+----------+-----------------+
|2015-07-27|       2015-08-02|
+----------+-----------------+
>>> df.select('*', sf.next_day('dt', 'Sat')).show()
+----------+-----------------+
|        dt|next_day(dt, Sat)|
+----------+-----------------+
|2015-07-27|       2015-08-01|
+----------+-----------------+
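Because the dayOfWeek string is matched case-insensitively, a lowercase abbreviation yields the same result. The sketch below is not part of the official examples; it reuses the spark session and df from above, and the next_sat alias is purely illustrative, added so the output column name is predictable.

>>> # 'sat' (lowercase) works because dayOfWeek is case-insensitive;
>>> # the 'next_sat' alias is only for display.
>>> df.select(sf.next_day('dt', 'sat').alias('next_sat')).show()
+----------+
|  next_sat|
+----------+
|2015-08-01|
+----------+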