pyspark.sql.functions.try_make_interval

pyspark.sql.functions.try_make_interval(years=None, months=None, weeks=None, days=None, hours=None, mins=None, secs=None)

This is a special version of make_interval that performs the same operation, but returns NULL instead of raising an error if the interval cannot be created.

New in version 4.0.0.

Parameters
years : Column or column name, optional

The number of years, positive or negative.

months : Column or column name, optional

The number of months, positive or negative.

weeks : Column or column name, optional

The number of weeks, positive or negative.

days : Column or column name, optional

The number of days, positive or negative.

hours : Column or column name, optional

The number of hours, positive or negative.

mins : Column or column name, optional

The number of minutes, positive or negative.

secs : Column or column name, optional

The number of seconds with the fractional part in microsecond precision.

Returns
Column

A new column that contains an interval, or NULL if the interval cannot be created.

Examples

Example 1: Try make interval from years, months, weeks, days, hours, mins and secs.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.try_make_interval(df.year, df.month, 'week', df.day, 'hour', df.min, df.sec)
... ).show(truncate=False)
+---------------------------------------------------------------+
|try_make_interval(year, month, week, day, hour, min, sec)      |
+---------------------------------------------------------------+
|100 years 11 months 8 days 12 hours 30 minutes 1.001001 seconds|
+---------------------------------------------------------------+

Example 2: Try make interval from years, months, weeks, days, hours and mins.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.try_make_interval(df.year, df.month, 'week', df.day, df.hour, df.min)
... ).show(truncate=False)
+-------------------------------------------------------+
|try_make_interval(year, month, week, day, hour, min, 0)|
+-------------------------------------------------------+
|100 years 11 months 8 days 12 hours 30 minutes         |
+-------------------------------------------------------+

Example 3: Try make interval from years, months, weeks, days and hours.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.try_make_interval(df.year, df.month, 'week', df.day, df.hour)
... ).show(truncate=False)
+-----------------------------------------------------+
|try_make_interval(year, month, week, day, hour, 0, 0)|
+-----------------------------------------------------+
|100 years 11 months 8 days 12 hours                  |
+-----------------------------------------------------+

Example 4: Try make interval from years, months, weeks and days.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.try_make_interval(df.year, 'month', df.week, df.day)).show(truncate=False)
+--------------------------------------------------+
|try_make_interval(year, month, week, day, 0, 0, 0)|
+--------------------------------------------------+
|100 years 11 months 8 days                        |
+--------------------------------------------------+

Example 5: Try make interval from years, months and weeks.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.try_make_interval(df.year, 'month', df.week)).show(truncate=False)
+------------------------------------------------+
|try_make_interval(year, month, week, 0, 0, 0, 0)|
+------------------------------------------------+
|100 years 11 months 7 days                      |
+------------------------------------------------+

Example 6: Try make interval from years and months.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.try_make_interval(df.year, 'month')).show(truncate=False)
+---------------------------------------------+
|try_make_interval(year, month, 0, 0, 0, 0, 0)|
+---------------------------------------------+
|100 years 11 months                          |
+---------------------------------------------+

Example 7: Try make interval from years.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 1.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.try_make_interval(df.year)).show(truncate=False)
+-----------------------------------------+
|try_make_interval(year, 0, 0, 0, 0, 0, 0)|
+-----------------------------------------+
|100 years                                |
+-----------------------------------------+

Example 8: Try make empty interval.

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.try_make_interval()).show(truncate=False)
+--------------------------------------+
|try_make_interval(0, 0, 0, 0, 0, 0, 0)|
+--------------------------------------+
|0 seconds                             |
+--------------------------------------+

Example 9: Try make interval from years with overflow.

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.try_make_interval(sf.lit(2147483647))).show(truncate=False)
+-----------------------------------------------+
|try_make_interval(2147483647, 0, 0, 0, 0, 0, 0)|
+-----------------------------------------------+
|NULL                                           |
+-----------------------------------------------+
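The NULL result in Example 9 comes from the years component overflowing the 32-bit months field of Spark's interval representation. The sketch below models that behavior in plain Python so it can be followed without a running SparkSession; `try_total_months` is a hypothetical helper for illustration, not PySpark's actual implementation.

```python
# Hypothetical model of try_make_interval's NULL-on-overflow behavior
# (illustration only, not PySpark's implementation). Spark keeps the
# year-month part of an interval as a signed 32-bit count of months,
# so years * 12 + months must fit in int32; try_make_interval returns
# NULL (modeled here as None) instead of raising when it does not.

INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def try_total_months(years=0, months=0):
    """Return years * 12 + months, or None if the result overflows int32."""
    total = years * 12 + months
    if total < INT32_MIN or total > INT32_MAX:
        return None
    return total

# The values from Example 1 fit comfortably:
print(try_total_months(100, 11))    # prints 1211
# The overflow case from Example 9: 2147483647 * 12 exceeds int32,
# so the result is None (NULL in Spark):
print(try_total_months(2147483647)) # prints None
```

Under this model, `make_interval` would raise on the second call where `try_make_interval` quietly yields NULL, which is the only difference between the two functions.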