Hello All,

I have a column in a DataFrame which is of struct type, and I want to find the
size of that column in bytes, because the load into Snowflake is failing.
I can see that a size function is available to get the length, but how do I
calculate the size in bytes of a column in a PySpark DataFrame?

pyspark.sql.functions.size(col)
Collection function: returns the length of the array or map stored in the
column.
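
For context, here is a sketch of the kind of per-row byte count I am after,
assuming it is acceptable to serialize the struct to JSON with to_json and
count bytes with octet_length (the DataFrame and the column name "payload"
below are made up for illustration):

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()

# Made-up example: a DataFrame with a struct column named "payload".
df = spark.createDataFrame(
    [(1, ("alice", 30)), (2, ("bob", 25))],
    "id INT, payload STRUCT<name: STRING, age: INT>",
)

# Serialize the struct to a JSON string, then count its bytes.
# octet_length counts bytes (UTF-8), whereas length counts characters.
sized = df.withColumn("payload_bytes",
                      F.expr("octet_length(to_json(payload))"))
sized.show()

# Largest serialized row, e.g. to compare against a Snowflake size limit.
sized.agg(F.max("payload_bytes").alias("max_bytes")).show()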

Please help me with this case.

Thanks


