Hi Marcelo,

Thanks for the quick reply. Could you suggest how to increase the memory
limits, or how else to tackle this problem? I am a novice. If you want, I
can post my code here.
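
In the meantime, would something along these lines be the right way to
raise the limit? This is just a sketch I put together from the docs,
assuming PySpark on YARN; the "2g" value and the app name are placeholders,
not values I know to be correct for my job:

    from pyspark import SparkConf, SparkContext

    # Request more executor memory before the SparkContext is created.
    # "2g" is a placeholder; the right size depends on the workload.
    conf = (SparkConf()
            .setAppName("my-app")  # placeholder app name
            .set("spark.executor.memory", "2g"))

    sc = SparkContext(conf=conf)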


Thanks


On Wed, Jul 9, 2014 at 12:50 AM, Marcelo Vanzin <van...@cloudera.com> wrote:

> This is generally a side effect of your executor being killed. For
> example, YARN will do that if you're going over the requested memory
> limits.
>
> On Tue, Jul 8, 2014 at 12:17 PM, Rahul Bhojwani
> <rahulbhojwani2...@gmail.com> wrote:
> > Hi,
> >
> > I am getting this error. Can anyone help explain why it is occurring?
> >
> > ########
> >
> > Exception in thread "delete Spark temp dir
> > C:\Users\shawn\AppData\Local\Temp\spark-27f60467-36d4-4081-aaf5-d0ad42dda560"
> > java.io.IOException: Failed to delete:
> > C:\Users\shawn\AppData\Local\Temp\spark-27f60467-36d4-4081-aaf5-d0ad42dda560\tmpcmenlp
> >         at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:483)
> >         at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:479)
> >         at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:478)
> >         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> >         at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
> >         at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:478)
> >         at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:212)
> > PS>
> > ############
> >
> >
> >
> >
> > Thanks in advance
> > --
> > Rahul K Bhojwani
> > 3rd Year B.Tech
> > Computer Science and Engineering
> > National Institute of Technology, Karnataka
>
>
>
> --
> Marcelo
>



-- 
Rahul K Bhojwani
3rd Year B.Tech
Computer Science and Engineering
National Institute of Technology, Karnataka
