Hi Kumaresh,
This is most likely an issue with your Spark cluster not being large
enough for the task at hand. Typical hints for this type of situation
are stack traces mentioning that a size limit was exceeded, or that a
node was lost.
However, this is also a great
Hello Spark Community!
I am new to Spark. I tried to read a multiline JSON file (around 2M
records; the gzip is about 2 GB) and hit an exception. It works if I
convert the same file to JSONL before reading it with Spark.
Unfortunately the file is private and I cannot share it. Is
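One likely reason the JSONL route works: Spark can split line-delimited text across many tasks, whereas a multiline JSON file must be parsed as a whole (and a gzip stream is not splittable either), so the entire file can land on a single executor. For reference, here is a minimal stand-alone sketch of the conversion step described above, using only the Python standard library; the filenames are hypothetical, and it assumes the file holds a single top-level JSON array:

```python
import gzip
import json

def json_array_to_jsonl(src_path: str, dst_path: str) -> int:
    """Convert a gzipped file holding one JSON array into gzipped JSONL.

    Note: json.load still parses the whole array in memory here; for a
    ~2 GB gzip a streaming parser (e.g. ijson) would be preferable.
    Returns the number of records written.
    """
    with gzip.open(src_path, "rt", encoding="utf-8") as src:
        records = json.load(src)  # assumes a top-level JSON array
    with gzip.open(dst_path, "wt", encoding="utf-8") as dst:
        for rec in records:
            # one JSON object per line, the JSONL convention
            dst.write(json.dumps(rec) + "\n")
    return len(records)
```

The resulting JSONL file can then be read with Spark's default line-per-record JSON reader, which parallelizes across lines instead of forcing a whole-file parse.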