pyspark.sql.functions.weekofyear
- pyspark.sql.functions.weekofyear(col)
Extract the week number of a given date as an integer. A week is considered to start on a Monday, and week 1 is the first week with more than 3 days, as defined by ISO 8601.
New in version 1.5.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
- col : Column or column name
    target timestamp column to work on.
- Returns
Column
week of the year for the given date, as an integer.
Examples
Example 1: Extract the week of the year from a string column representing dates
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08',), ('2024-10-31',)], ['dt'])
>>> df.select("*", sf.typeof('dt'), sf.weekofyear('dt')).show()
+----------+----------+--------------+
|        dt|typeof(dt)|weekofyear(dt)|
+----------+----------+--------------+
|2015-04-08|    string|            15|
|2024-10-31|    string|            44|
+----------+----------+--------------+
Example 2: Extract the week of the year from a string column representing timestamps
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.weekofyear('ts')).show()
+-------------------+----------+--------------+
|                 ts|typeof(ts)|weekofyear(ts)|
+-------------------+----------+--------------+
|2015-04-08 13:08:15|    string|            15|
|2024-10-31 10:09:16|    string|            44|
+-------------------+----------+--------------+
Example 3: Extract the week of the year from a date column
>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.date(2015, 4, 8),),
...     (datetime.date(2024, 10, 31),)], ['dt'])
>>> df.select("*", sf.typeof('dt'), sf.weekofyear('dt')).show()
+----------+----------+--------------+
|        dt|typeof(dt)|weekofyear(dt)|
+----------+----------+--------------+
|2015-04-08|      date|            15|
|2024-10-31|      date|            44|
+----------+----------+--------------+
Example 4: Extract the week of the year from a timestamp column
>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.datetime(2015, 4, 8, 13, 8, 15),),
...     (datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.weekofyear('ts')).show()
+-------------------+----------+--------------+
|                 ts|typeof(ts)|weekofyear(ts)|
+-------------------+----------+--------------+
|2015-04-08 13:08:15| timestamp|            15|
|2024-10-31 10:09:16| timestamp|            44|
+-------------------+----------+--------------+
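Example 5: Week numbers near year boundaries (an illustrative sketch, not from the upstream examples). Per ISO 8601, 2016-01-01, a Friday, falls in week 53 of 2015, and 2018-12-31, a Monday, falls in week 1 of 2019; the expected output below assumes weekofyear follows those ISO semantics, as the description above states.

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2016-01-01',), ('2018-12-31',)], ['dt'])
>>> df.select("*", sf.weekofyear('dt')).show()
+----------+--------------+
|        dt|weekofyear(dt)|
+----------+--------------+
|2016-01-01|            53|
|2018-12-31|             1|
+----------+--------------+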