pyspark.sql.functions.std
pyspark.sql.functions.std(col)
 Aggregate function: alias for stddev_samp.
New in version 3.5.0.
- Parameters
    col : Column or column name
        target column to compute on.
- Returns
    Column
        standard deviation of the values in the given column.
See also
pyspark.sql.functions.stddev_samp
Examples
>>> import pyspark.sql.functions as sf
>>> spark.range(6).select(sf.std("id")).show()
+------------------+
|           std(id)|
+------------------+
|1.8708286933869...|
+------------------+
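Since std is documented above as an alias for stddev_samp, the two should agree on the same input. The snippet below is a minimal sketch of that check inside a groupBy aggregation, assuming an active SparkSession bound to spark; the sample data and the rendered output are illustrative.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame(
...     [("a", 1.0), ("a", 2.0), ("a", 3.0)], ["key", "value"])
>>> df.groupBy("key").agg(sf.std("value"), sf.stddev_samp("value")).show()
+---+----------+------------------+
|key|std(value)|stddev_samp(value)|
+---+----------+------------------+
|  a|       1.0|               1.0|
+---+----------+------------------+

Both columns report the same sample standard deviation, consistent with the alias relationship.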