[ 
https://issues.apache.org/jira/browse/SPARK-1638?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kalpit Shah updated SPARK-1638:
-------------------------------

    Component/s:     (was: Deploy)
                     (was: EC2)
                 Spark Core

> Executors fail to come up if "spark.executor.extraJavaOptions" is set 
> ----------------------------------------------------------------------
>
>                 Key: SPARK-1638
>                 URL: https://issues.apache.org/jira/browse/SPARK-1638
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>         Environment: Bring up a cluster in EC2 using spark-ec2 scripts
>            Reporter: Kalpit Shah
>             Fix For: 1.0.0
>
>
> If you try to launch a PySpark shell with "spark.executor.extraJavaOptions" 
> set to "-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc 
> -XX:+PrintGCDetails -XX:+PrintGCTimeStamps", the executors never come up on 
> any of the workers.
> I see the following error in the log file:
> Spark Executor Command: "/usr/lib/jvm/java/bin/java" "-cp" 
> "/root/c3/lib/*::/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar:"
>  "-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc 
> -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" "-Xms13312M" "-Xmx13312M" 
> "org.apache.spark.executor.CoarseGrainedExecutorBackend" 
> "akka.tcp://spark@HOSTNAME:45429/user/CoarseGrainedScheduler" "7" "HOSTNAME" 
> "4" "akka.tcp://sparkWorker@HOSTNAME:39727/user/Worker" 
> "app-20140423224526-0000"
> ========================================
> Unrecognized VM option 'UseCompressedOops -XX:+UseCompressedStrings 
> -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps'
> Error: Could not create the Java Virtual Machine.
> Error: A fatal exception has occurred. Program will exit.
>  
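[Editor's note: the log above shows the whole "spark.executor.extraJavaOptions" value being passed to the JVM as a single quoted argument, so only the first token is parsed as a flag and the rest is treated as part of it. A minimal sketch of the needed behavior, using Python's shlex purely as an illustration (not Spark's actual code), splits the string into separate argv elements while respecting quoting:]

```python
import shlex

def split_java_options(opts: str) -> list[str]:
    # Tokenize the option string the way a POSIX shell would, so each
    # "-XX:+Flag" becomes its own argv element instead of one quoted blob
    # like "-XX:+UseCompressedOops -verbose:gc ...".
    return shlex.split(opts)

opts = ("-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc "
        "-XX:+PrintGCDetails -XX:+PrintGCTimeStamps")
print(split_java_options(opts))
# Each flag is now a separate element, so the JVM would see five
# distinct options rather than one unrecognized compound option.
```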



--
This message was sent by Atlassian JIRA
(v6.2#6252)