I'm using Spark 1.4.2 with Hadoop 2.7. I tried increasing
spark.shuffle.io.maxRetries to 10, but it didn't help.
Any ideas on what could be causing this?
This is the exception that I am getting:
[MySparkApplication] WARN : Failed to execute SQL statement select *
from TableS s join TableC c on s.property = c.property from X YZ
org.apache.spark.SparkException: Job
(Permission denied)
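For reference, the retry behaviour can be tuned either in spark-defaults.conf or per job on the submit command line. This is an illustrative sketch; the values, jar name, and the use of spark.shuffle.io.retryWait alongside maxRetries are examples, not settings from the thread:

```shell
# Illustrative only: raise the number of shuffle fetch retries and the
# wait between retries. Either set them in conf/spark-defaults.conf:
#
#   spark.shuffle.io.maxRetries   10
#   spark.shuffle.io.retryWait    15s
#
# or pass them per application (your-app.jar is a placeholder):
spark-submit \
  --conf spark.shuffle.io.maxRetries=10 \
  --conf spark.shuffle.io.retryWait=15s \
  your-app.jar
```

Note that retries only help with transient fetch failures; a "Permission denied" on the local shuffle directory will fail on every attempt regardless of these settings.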
Have you checked the permissions on /mnt/md0/var/lib/spark/... ?
Cheers
On Thu, Nov 26, 2015 at 3:03 AM, Sahil Sareen wrote:
> I'm using Spark 1.4.2 with Hadoop 2.7. I tried increasing
> spark.shuffle.io.maxRetries to 10, but it didn't help.
>
> Any
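To act on the suggestion above, one way to check is to inspect the ownership and mode of the Spark local directory on each worker. A minimal sketch, assuming a Linux host with GNU stat; the path is taken from the thread and the spark:spark user/group in the comments is a placeholder for whatever user actually runs the executors:

```shell
# Hypothetical default taken from the path mentioned in the thread;
# substitute your actual spark.local.dir / SPARK_LOCAL_DIRS value.
SPARK_LOCAL_DIR=${SPARK_LOCAL_DIR:-/mnt/md0/var/lib/spark}

if [ -d "$SPARK_LOCAL_DIR" ]; then
  # The user running the executors must be able to write here.
  stat -c 'owner=%U group=%G mode=%a' "$SPARK_LOCAL_DIR"
else
  echo "directory $SPARK_LOCAL_DIR not found on this host"
fi

# If the executor user cannot write there, fix ownership/permissions
# (user and group are placeholders -- adjust to your deployment):
#   sudo chown -R spark:spark "$SPARK_LOCAL_DIR"
#   sudo chmod -R u+rwX "$SPARK_LOCAL_DIR"
```

Run this on every worker node, since the shuffle files are written locally by each executor and a single misconfigured host is enough to produce intermittent failures.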