Do you mind pastebinning the stack trace with the error, so that we know
which part of the code is under discussion?
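
In the meantime, to make sure we're describing the same failure mode (a retried task colliding with files a previous, partially failed attempt already wrote), here is a minimal Python sketch. It is illustrative only, not Spark or Hadoop code; the `write_part` helper and the `overwrite` flag are hypothetical stand-ins for Hadoop-style output checking:

```python
import os
import tempfile

def write_part(path, data, overwrite=False):
    # Mimics Hadoop-style output validation: refuse to clobber an
    # existing output file unless overwriting is explicitly allowed.
    if os.path.exists(path) and not overwrite:
        raise FileExistsError(f"Output file {path} already exists")
    with open(path, "w") as f:
        f.write(data)

out_dir = tempfile.mkdtemp()
part = os.path.join(out_dir, "part-00000")

# First attempt writes its file, then (hypothetically) fails mid-task.
write_part(part, "rows 0-99")

# The retried attempt hits the leftover file and fails the same way
# a FileAlreadyExistsException would surface.
try:
    write_part(part, "rows 0-99")
except FileExistsError as e:
    print("retry failed:", e)

# If an overwrite flag could be passed through, the retry would succeed.
write_part(part, "rows 0-99", overwrite=True)
```

This is only meant to pin down which check is firing; the real answer depends on the stack trace.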

Thanks

On Tue, Mar 1, 2016 at 7:48 AM, Peter Halliday <pjh...@cornell.edu> wrote:

> I have a Spark application in which a task appears to fail, but it has
> actually written out some of the files assigned to it.  Spark then assigns
> the task to another executor, which gets a FileAlreadyExistsException.  The
> Hadoop code seems to allow files to be overwritten, but I see the 1.5.1
> version of this code doesn't allow that flag to be passed in.  Is that
> correct?
>
> Peter Halliday
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
