[ https://issues.apache.org/jira/browse/SPARK-1792?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Joseph E. Gonzalez updated SPARK-1792:
--------------------------------------
Description:
The `conf/spark-env.sh.template` does not have configuration options for the
Spark shell. For example, to enable Kryo for GraphX when using the Spark shell
in standalone mode, it appears you must add:
{code}
SPARK_SUBMIT_OPTS="-Dspark.serializer=org.apache.spark.serializer.KryoSerializer "
SPARK_SUBMIT_OPTS+="-Dspark.kryo.registrator=org.apache.spark.graphx.GraphKryoRegistrator "
{code}
However, SPARK_SUBMIT_OPTS is not documented anywhere. Perhaps spark-shell
should have its own options (e.g., SPARK_SHELL_OPTS).
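For comparison, a standalone application can already set these two properties
on its SparkConf before creating the SparkContext; spark-shell constructs the
SparkContext automatically, so there is no equivalent hook, which is why the
environment-variable workaround above is needed. A minimal Scala sketch (the
app name and local master below are illustrative only):
{code}
import org.apache.spark.{SparkConf, SparkContext}

// A standalone application can enable Kryo for GraphX directly on its SparkConf;
// spark-shell creates the SparkContext for you, so these calls are not available
// before the shell starts.
val conf = new SparkConf()
  .setAppName("graphx-kryo-example") // illustrative name
  .setMaster("local[*]")             // for local testing; normally supplied by spark-submit
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "org.apache.spark.graphx.GraphKryoRegistrator")
val sc = new SparkContext(conf)
{code}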
was:
The `conf/spark-env.sh.template` does not have configuration options for the
Spark shell. For example, to enable Kryo for GraphX when using the Spark shell
in standalone mode, it appears you must add:
{code}
SPARK_SUBMIT_OPTS="-Dspark.serializer=org.apache.spark.serializer.KryoSerializer "
SPARK_SUBMIT_OPTS+="-Dspark.kryo.registrator=org.apache.spark.graphx.GraphKryoRegistrator "
{code}
However, SPARK_SUBMIT_OPTS is not documented anywhere. Perhaps spark-shell
should have its own options (e.g., SPARK_SHELL_OPTS).
We might want to resolve this with the 1.0 release, @matei and @pwendell?
> Missing Spark-Shell Configure Options
> -------------------------------------
>
> Key: SPARK-1792
> URL: https://issues.apache.org/jira/browse/SPARK-1792
> Project: Spark
> Issue Type: Bug
> Components: Documentation, Spark Core
> Reporter: Joseph E. Gonzalez
>
> The `conf/spark-env.sh.template` does not have configuration options for the
> Spark shell. For example, to enable Kryo for GraphX when using the Spark
> shell in standalone mode, it appears you must add:
> {code}
> SPARK_SUBMIT_OPTS="-Dspark.serializer=org.apache.spark.serializer.KryoSerializer "
> SPARK_SUBMIT_OPTS+="-Dspark.kryo.registrator=org.apache.spark.graphx.GraphKryoRegistrator "
> {code}
> However, SPARK_SUBMIT_OPTS is not documented anywhere. Perhaps spark-shell
> should have its own options (e.g., SPARK_SHELL_OPTS).