pyspark.sql.functions.try_to_date

pyspark.sql.functions.try_to_date(col, format=None)
This is a special version of to_date that performs the same operation, but returns a NULL value instead of raising an error if the date cannot be created.
New in version 4.0.0.
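As a quick illustration of the NULL-instead-of-error behavior, here is a minimal sketch (not part of the official examples). It assumes an active SparkSession bound to the name spark, as in the Examples section below, and uses a made-up column mixing a parsable and an unparsable value:

import pyspark.sql.functions as sf

# Hypothetical input: one parsable ISO date string and one unparsable string.
df = spark.createDataFrame([('1997-02-28',), ('not-a-date',)], ['ts'])

# try_to_date converts the first value and yields NULL for the second,
# instead of raising an error.
df.select('ts', sf.try_to_date('ts').alias('d')).show()
# Expected: '1997-02-28' -> 1997-02-28, 'not-a-date' -> NULL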
- Parameters
  - col: Column or column name
    input column of values to convert.
  - format: literal string, optional
    format to use to convert date values.
- Returns
  Column
    date value as pyspark.sql.types.DateType type.
See also
pyspark.sql.functions.to_date
Examples
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('1997-02-28',)], ['ts'])
>>> df.select('*', sf.try_to_date(df.ts)).show()
+----------+---------------+
|        ts|try_to_date(ts)|
+----------+---------------+
|1997-02-28|     1997-02-28|
+----------+---------------+

>>> df.select('*', sf.try_to_date('ts', 'yyyy-MM-dd')).show()
+----------+---------------------------+
|        ts|try_to_date(ts, yyyy-MM-dd)|
+----------+---------------------------+
|1997-02-28|                 1997-02-28|
+----------+---------------------------+

>>> df = spark.createDataFrame([('foo',)], ['ts'])
>>> df.select(sf.try_to_date(df.ts)).show()
+---------------+
|try_to_date(ts)|
+---------------+
|           NULL|
+---------------+
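A further sketch, not taken from the official reference, of how the optional format argument interacts with the NULL-on-failure behavior. It assumes the same active SparkSession named spark and a made-up two-row DataFrame:

import pyspark.sql.functions as sf

# Hypothetical values: the first matches the 'dd/MM/yyyy' pattern, the second does not.
df = spark.createDataFrame([('28/02/1997',), ('1997-02-28',)], ['ts'])

# With an explicit format, only values matching that pattern are converted;
# anything else becomes NULL rather than an error.
df.select('ts', sf.try_to_date('ts', 'dd/MM/yyyy').alias('d')).show()
# Expected: '28/02/1997' -> 1997-02-28, '1997-02-28' -> NULL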