> Hemanth
>
> *From: *Femi Anthony
> *Date: *Thursday, 10 August 2017 at 11.24
> *To: *Hemanth Gudela
> *Cc: *"user@spark.apache.org"
> *Subject: *Re: spark.write.csv is not able write files to specified path,
> but is writing to unintended subfolder _temporary/0/task_xxx folder on
> worker nodes
Also, why are you trying to write results locally if you're not using a
distributed file system?
Is your filePath prefaced with file:/// and the full path, or is it relative?
You might also try calling close() on the Spark context or session.
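One way to rule out relative-path ambiguity is to build a fully qualified file:// URI before handing it to the writer. The sketch below uses plain Python's pathlib; the output path and the commented-out df.write call are hypothetical, shown only to illustrate the shape of the URI:

```python
from pathlib import Path

# Hypothetical local output directory; resolve() makes the path absolute
# so as_uri() can produce a fully qualified file:// URI.
out_dir = Path("/tmp/spark_out/results.csv")
file_uri = out_dir.resolve().as_uri()
print(file_uri)  # e.g. file:///tmp/spark_out/results.csv

# In PySpark the URI (not a bare relative path) would then be passed to
# the writer, e.g.: df.write.mode("overwrite").csv(file_uri)
```

Passing the URI form avoids each executor resolving the path against its own working directory.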
> From: Femi Anthony
> Date: Thursday, 10 August 2017 at 10.38
> To: Hemanth Gudela
> Cc: "user@spark.apache.org"
> Subject: Re: spark.write.csv is not able write files to specified path, but
> is writing to unintended subfolder _temporary/0/task_xxx folder on worker
> nodes
Normally the _temporary directory gets deleted as part of the cleanup
when the write is complete and a _SUCCESS file is created. I suspect that
the writes are not properly completed. How are you specifying the write?
Any error messages in the logs?
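That marker can be checked programmatically. The sketch below is plain Python, with the directory layout assumed from the Hadoop FileOutputCommitter convention: a write counts as complete only when _SUCCESS exists and no _temporary/ residue is left behind.

```python
import tempfile
from pathlib import Path

def write_succeeded(output_dir: str) -> bool:
    """A completed Hadoop-style write leaves a _SUCCESS marker next to
    the part-* files; leftover _temporary/ means the commit never ran."""
    out = Path(output_dir)
    return (out / "_SUCCESS").exists() and not (out / "_temporary").exists()

# Simulate a completed write in a scratch directory:
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "part-r-00000").touch()
    (Path(d) / "_SUCCESS").touch()
    print(write_succeeded(d))  # True
```

Running a check like this on each worker would quickly show whether the tasks committed at all or died mid-write.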
On Thu, Aug 10, 2017 at 3:17 AM, Hemanth Gudela wrote:
Hi,
I’m running Spark in cluster mode on 4 nodes, and trying to write CSV
files to the nodes’ local paths (not HDFS).
I’m using spark.write.csv to write the CSV files.
On the master node:
spark.write.csv creates a folder named after the CSV file and writes many
files with a part-r-000n suffix. This is okay for me.
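Since each partition lands as its own part-r-* piece inside that folder, the pieces can be collapsed into a single local CSV afterwards. The helper below is an illustration (the names are mine, not from the thread) and assumes the parts were written without a header row, which is Spark's default for csv output:

```python
import tempfile
from pathlib import Path

def merge_part_files(folder: str, dest: str) -> int:
    """Concatenate Spark's per-partition part-r-* pieces into one file,
    in sorted order; returns how many pieces were merged."""
    parts = sorted(Path(folder).glob("part-r-*"))
    with open(dest, "wb") as out:
        for p in parts:
            out.write(p.read_bytes())
    return len(parts)

# Demo with two fake partition files:
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "part-r-00000").write_text("a,1\n")
    (Path(d) / "part-r-00001").write_text("b,2\n")
    merged = str(Path(d) / "merged.csv")
    print(merge_part_files(d, merged))  # 2
```

If the parts did carry header rows, the loop would need to skip the first line of every piece after the first.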