I am trying to understand the best way to handle the scenario where a null/empty
array ("[]") is passed in. Can somebody suggest a way to filter out such
records? I've tried numerous things, including dataframe.head().isEmpty, but
PySpark doesn't recognize isEmpty even though I see it in the API docs.

pyspark.sql.utils.AnalysisException: u"cannot resolve '`timestamp`' given input columns: []; line 1 pos 0;
'Filter isnotnull('timestamp)
+- LogicalRDD
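
For context, here is a minimal sketch of the kind of guard I'm trying to write
(the file path and the 'timestamp' column name are placeholders; I'm assuming
the empty "[]" payload produces a DataFrame with no rows and no columns, which
is what the error above suggests):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Placeholder path; an empty JSON payload ("[]") yields a DataFrame
    # with no rows and no columns, so filtering on 'timestamp' fails.
    df = spark.read.json("events.json")

    # isEmpty is on the Scala Dataset API; on a PySpark DataFrame two
    # workable checks are df.rdd.isEmpty() or len(df.head(1)) == 0.
    if df.rdd.isEmpty() or "timestamp" not in df.columns:
        pass  # nothing to process for this batch
    else:
        df = df.filter(df["timestamp"].isNotNull())

Is df.rdd.isEmpty() a reasonable way to do this, or is there a cheaper check
that avoids going through the RDD?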
