pyspark.sql.functions.variance
- pyspark.sql.functions.variance(col)
Aggregate function: alias for var_samp.
New in version 1.6.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
  - col : Column or column name
    target column to compute on.
- Returns
  - Column
    variance of the given column.
See also
pyspark.sql.functions.var_samp
Examples
>>> from pyspark.sql import functions as sf
>>> df = spark.range(6)
>>> df.select(sf.variance(df.id)).show()
+------------+
|variance(id)|
+------------+
|         3.5|
+------------+
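Since variance is an alias for var_samp, a quick additional sketch (not part of the official examples; it reuses the spark session and the same range DataFrame, and the exact output formatting may differ slightly) can check that the two agree and contrast them with the population variance var_pop, which divides by N instead of N - 1:

>>> from pyspark.sql import functions as sf
>>> df = spark.range(6)
>>> df.agg(
...     sf.variance("id").alias("variance"),   # alias for var_samp
...     sf.var_samp("id").alias("var_samp"),   # divides by N - 1
...     sf.var_pop("id").alias("var_pop"),     # divides by N
... ).show()
+--------+--------+------------------+
|variance|var_samp|           var_pop|
+--------+--------+------------------+
|     3.5|     3.5|2.9166666666666665|
+--------+--------+------------------+

For the six values 0 through 5, the sum of squared deviations from the mean (2.5) is 17.5, so the sample variance is 17.5 / 5 = 3.5, while the population variance is 17.5 / 6 ≈ 2.917.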