SOLVED!
Thanks for the help. I found the issue: it was the pyarrow version
(0.15.1), which apparently isn't stable yet. Downgrading it solved the
issue for me.
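For anyone who hits this later: a minimal sketch of an alternative to downgrading, assuming Spark 2.x, where pyarrow 0.15 changed the Arrow IPC stream format and broke pandas UDFs. The Spark docs describe setting the ARROW_PRE_0_15_IPC_FORMAT=1 environment variable so pyarrow keeps emitting the old format; the helper name `configure_arrow` below is hypothetical.

```python
# Sketch: decide whether the ARROW_PRE_0_15_IPC_FORMAT workaround is needed,
# based on the installed pyarrow version. Assumes Spark 2.3.x/2.4.x, which
# only understand the pre-0.15 Arrow stream format.

def configure_arrow(pyarrow_version: str) -> dict:
    """Return env vars to set on the driver and executors before Spark starts.

    `configure_arrow` is a hypothetical helper, not a Spark/pyarrow API.
    """
    major, minor = (int(p) for p in pyarrow_version.split(".")[:2])
    env = {}
    if (major, minor) >= (0, 15):
        # Tell pyarrow >= 0.15 to keep using the legacy IPC format
        # that Spark 2.x expects (documented in the Spark 2.4 Arrow guide).
        env["ARROW_PRE_0_15_IPC_FORMAT"] = "1"
    return env

print(configure_arrow("0.15.1"))  # {'ARROW_PRE_0_15_IPC_FORMAT': '1'}
print(configure_arrow("0.14.1"))  # {}
```

These variables would need to be visible to both the driver and the executor Python processes (e.g. via spark-env.sh or the cluster's environment config) for the workaround to take effect.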
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
Hi,
Thanks for your reply.
I tried what you suggested and I'm still getting the same error.
It's also worth mentioning that when I simply write the dataframe to S3,
without applying the function, it works.