So it appears that the equivalent operators for PySpark are completely missing from the docs, right? That’s surprising. And if there are Column function equivalents for |, &, and ~, I can’t find those for PySpark either. Indeed, I don’t think such a thing is even possible in PySpark (e.g. (col('age') > 0).and(...)), since `and`, `or`, and `not` are reserved keywords in Python and can’t be used as method names.
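For illustration, here’s a minimal plain-Python sketch (mimicking, not reproducing, PySpark’s actual Column class) of why PySpark exposes &, |, and ~ through Python’s operator-overloading hooks instead of .and()-style methods:

```python
# Sketch only: a stand-in Column class, not PySpark's implementation.
# Because `and`, `or`, and `not` are Python keywords, they can't be
# method names; Python instead lets a class overload the bitwise
# operators &, |, and ~ via __and__, __or__, and __invert__.
class Column:
    def __init__(self, expr):
        self.expr = expr

    def __and__(self, other):
        # invoked by: (col1) & (col2)
        return Column(f"({self.expr} AND {other.expr})")

    def __or__(self, other):
        # invoked by: (col1) | (col2)
        return Column(f"({self.expr} OR {other.expr})")

    def __invert__(self):
        # invoked by: ~col1
        return Column(f"(NOT {self.expr})")


adult = Column("age > 0")
young = Column("age < 100")
print((adult & young).expr)  # (age > 0 AND age < 100)
print((~adult).expr)         # (NOT age > 0)
```

This is the general mechanism by which PySpark columns support `(col('age') > 0) & (col('age') < 100)` even though `(col('age') > 0) and (...)` cannot work.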
I can file a ticket about this, but I’m just making sure I’m not missing something obvious.

On Tue, Oct 23, 2018 at 2:50 PM Sean Owen <sro...@gmail.com> wrote:

> Those should all be Column functions, really, and I see them at
> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.Column
>
> On Tue, Oct 23, 2018, 12:27 PM Nicholas Chammas <nicholas.cham...@gmail.com> wrote:
>
>> I can’t seem to find any documentation of the &, |, and ~ operators for
>> PySpark DataFrame columns. I assume that should be in our docs somewhere.
>>
>> Was it always missing? Am I just missing something obvious?
>>
>> Nick