Spark Version 1.3
Command:

./bin/spark-submit -v --master yarn-cluster \
  --driver-class-path /apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-company-2.jar:/apache/hadoop/lib/hadoop-lzo-0.6.0.jar:/apache/hadoop-2.4.1-2.1.3.0-2-company/share/hadoop/yarn/lib/guava-11.0.2.jar:/apache/hadoop-2.4.1-2.1.3.0-2-company/share/hadoop/hdfs/hadoop-hdfs-2.4.1-company-2.jar \
  --num-executors 100 --driver-memory 4g \
  --driver-java-options "-XX:MaxPermSize=4G" \
  --executor-memory 8g --executor-cores 1 \
  --queue hdmi-express \
  --class com.company.ep.poc.spark.reporting.SparkApp \
  /home/dvasthimal/spark1.3/spark_reporting-1.0-SNAPSHOT.jar \
  startDate=2015-04-6 endDate=2015-04-7 \
  input=/user/dvasthimal/epdatasets_small/exptsession subcommand=viewItem \
  output=/user/dvasthimal/epdatasets/viewItem
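The trailing key=value arguments are passed straight through to the application jar rather than to spark-submit. As a hypothetical sketch (the thread does not show SparkApp's internals, and the class and method names below are my own), such arguments are typically split on the first `=` into a map in the driver's main method:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of parsing key=value program arguments like
// "startDate=2015-04-6 subcommand=viewItem"; not SparkApp's actual code.
public class ArgParser {
    public static Map<String, String> parse(String[] args) {
        Map<String, String> opts = new HashMap<>();
        for (String arg : args) {
            int eq = arg.indexOf('=');
            // Split on the first '=' only, so values may contain '=' themselves.
            if (eq > 0) {
                opts.put(arg.substring(0, eq), arg.substring(eq + 1));
            }
        }
        return opts;
    }

    public static void main(String[] args) {
        Map<String, String> opts = parse(new String[] {
            "startDate=2015-04-6", "endDate=2015-04-7", "subcommand=viewItem"
        });
        System.out.println(opts.get("subcommand")); // prints "viewItem"
    }
}
```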

On Wed, Apr 8, 2015 at 2:30 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> I have a Spark stage with 8 tasks. 7/8 have completed, but 1 task is
> failing with "CANNOT FIND ADDRESS".
>
>
> Aggregated Metrics by Executor:
>
> Executor ID | Address             | Task Time | Total Tasks | Failed Tasks | Succeeded Tasks | Shuffle Read Size / Records | Shuffle Write Size / Records | Shuffle Spill (Memory) | Shuffle Spill (Disk)
> 19          | CANNOT FIND ADDRESS | 24 min    | 1           | 1            | 0               | 1248.9 MB / 56194006        | 0.0 B / 0                    | 0.0 B                  | 0.0 B
> 47          | CANNOT FIND ADDRESS | 2.3 h     | 1           | 1            | 0               | 1295.3 MB / 56202037        | 0.0 B / 0                    | 0.0 B                  | 0.0 B
> Any suggestions ?
> --
> Deepak
>
>


-- 
Deepak
