pyspark.sql.functions.to_date

pyspark.sql.functions.to_date(col, format=None)

Converts a Column into pyspark.sql.types.DateType using the optionally specified format. Specify formats according to datetime pattern. By default, it follows casting rules to pyspark.sql.types.DateType if the format is omitted. Equivalent to col.cast("date").

New in version 2.2.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
- col : Column or column name
  input column of values to convert.
- format : literal string, optional
  format to use to convert date values.
 
Returns
- Column
  date value as pyspark.sql.types.DateType type.
 
See also

Examples

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['ts'])
>>> df.select('*', sf.to_date(df.ts)).show()
+-------------------+-----------+
|                 ts|to_date(ts)|
+-------------------+-----------+
|1997-02-28 10:30:00| 1997-02-28|
+-------------------+-----------+

>>> df.select('*', sf.to_date('ts', 'yyyy-MM-dd HH:mm:ss')).show()
+-------------------+--------------------------------+
|                 ts|to_date(ts, yyyy-MM-dd HH:mm:ss)|
+-------------------+--------------------------------+
|1997-02-28 10:30:00|                      1997-02-28|
+-------------------+--------------------------------+
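
The description above notes that, when format is omitted, to_date follows the same casting rules as col.cast("date"). The doctest-style sketch below is not part of the official reference; it assumes an active SparkSession bound to spark (as in the examples above) and simply illustrates that equivalence by computing the date both ways:

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('1997-02-28 10:30:00',)], ['ts'])
>>> # 'd1' uses to_date with no format; 'd2' casts the same column to date.
>>> df.select(sf.to_date('ts').alias('d1'), df.ts.cast('date').alias('d2')).show()
+----------+----------+
|        d1|        d2|
+----------+----------+
|1997-02-28|1997-02-28|
+----------+----------+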