You can work around it by leveraging expr, e.g., expr("unix_micros(col)"), for now; see the sketch below. FWIW, we should add the Scala binding first before we add the Python one.
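A minimal sketch of that workaround (the DataFrame df and the column name ts here are just placeholders for illustration):

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import expr

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical example data with a timestamp column "ts".
    df = spark.createDataFrame(
        [("2022-10-15 06:19:00",)], ["ts_str"]
    ).selectExpr("CAST(ts_str AS TIMESTAMP) AS ts")

    # unix_micros/unix_millis are not in pyspark.sql.functions yet,
    # but the SQL built-ins are reachable through expr():
    df.select(
        expr("unix_micros(ts)").alias("micros"),
        expr("unix_millis(ts)").alias("millis"),
    ).show()

This relies on the functions existing in Spark SQL (they do since 3.1), so expr() can parse them even though there is no Python wrapper.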
On Sat, 15 Oct 2022 at 06:19, Martin <bos-t...@gmx.de> wrote:

> Hi everyone,
>
> In *Spark SQL* there are several timestamp related functions:
>
> - unix_micros(timestamp)
>   Returns the number of microseconds since 1970-01-01 00:00:00 UTC.
> - unix_millis(timestamp)
>   Returns the number of milliseconds since 1970-01-01 00:00:00 UTC.
>   Truncates higher levels of precision.
>
> See https://spark.apache.org/docs/latest/sql-ref-functions-builtin.html
>
> Currently these are *"missing" in pyspark.sql.functions*:
>
> https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/functions.html#datetime-functions
>
> I'd appreciate it if these were also available in PySpark.
>
> Cheers,
> Martin