[ https://issues.apache.org/jira/browse/SPARK-2022?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Matthew Farrellee closed SPARK-2022.
------------------------------------
    Resolution: Fixed

> Spark 1.0.0 is failing if mesos.coarse set to true
> --------------------------------------------------
>
>                 Key: SPARK-2022
>                 URL: https://issues.apache.org/jira/browse/SPARK-2022
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.0.0
>            Reporter: Marek Wiewiorka
>            Assignee: Tim Chen
>            Priority: Critical
>
> more stderr
> -------------------
> WARNING: Logging before InitGoogleLogging() is written to STDERR
> I0603 16:07:53.721132 61192 exec.cpp:131] Version: 0.18.2
> I0603 16:07:53.725230 61200 exec.cpp:205] Executor registered on slave 201405220917-134217738-5050-27119-0
> Exception in thread "main" java.lang.NumberFormatException: For input string: "sparkseq003.cloudapp.net"
>         at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>         at java.lang.Integer.parseInt(Integer.java:492)
>         at java.lang.Integer.parseInt(Integer.java:527)
>         at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:229)
>         at scala.collection.immutable.StringOps.toInt(StringOps.scala:31)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:135)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
>
> more stdout
> -----------------------
> Registered executor on sparkseq003.cloudapp.net
> Starting task 5
> Forked command at 61202
> sh -c '"/home/mesos/spark-1.0.0/bin/spark-class" org.apache.spark.executor.CoarseGrainedExecutorBackend -Dspark.mesos.coarse=true akka.tcp://sp...@sparkseq001.cloudapp.net:40312/user/CoarseGrainedScheduler 201405220917-134217738-5050-27119-0 sparkseq003.cloudapp.net 4'
> Command exited with status 1 (pid: 61202)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
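[Editor's note on the stack trace above: the NumberFormatException shows the executor backend calling `.toInt` on the hostname "sparkseq003.cloudapp.net", which suggests a positional argument (an integer such as the core count) was read from the wrong slot of the launch command. The sketch below is a hypothetical Java reproduction of that failure mode, not Spark's actual code; the argument layout mirrors the `sh -c` command in the stdout log.]

```java
// Hypothetical sketch: parsing the executor's positional launch arguments.
// Argument order is taken from the logged command:
//   <driverUrl> <executorId> <hostname> <cores>
public class ArgParseSketch {
    public static void main(String[] args) {
        String[] launchArgs = {
            "akka.tcp://spark@sparkseq001.cloudapp.net:40312/user/CoarseGrainedScheduler",
            "201405220917-134217738-5050-27119-0",  // executor/slave id
            "sparkseq003.cloudapp.net",             // hostname
            "4"                                     // cores
        };

        // Reading the cores count from the wrong slot (index 2, the hostname)
        // raises exactly the exception seen in the stderr log:
        try {
            int cores = Integer.parseInt(launchArgs[2]);
            System.out.println("cores = " + cores);
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException: " + e.getMessage());
        }

        // Reading the correct slot (index 3) succeeds:
        System.out.println("cores = " + Integer.parseInt(launchArgs[3]));
    }
}
```

Running this prints the same `For input string: "sparkseq003.cloudapp.net"` message as the reported failure, illustrating why an off-by-one in the argument list crashes the executor before it can register work.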