Sean Owen resolved SPARK-19624.
    Resolution: Not A Problem

Just as with spark-submit, you can and should use --name to set the shell app
name; the explicit --name flag takes precedence over spark.app.name set via
--conf. It works for me.
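For example, a minimal sketch of the recommended invocation (assuming a local Spark installation with the launcher scripts on the PATH; the application name "test" is just a placeholder):

```shell
# The shell launchers (bin/spark-shell, bin/pyspark) pass an explicit
# --name ("Spark shell" / "PySparkShell") to spark-submit, and an explicit
# --name takes precedence over spark.app.name supplied via --conf.
# So to rename the shell app, override --name directly:
spark-shell --name test
pyspark --name test

# This form is NOT honored by the shells, because the launcher's
# --name wins over the conf entry:
spark-shell --conf spark.app.name=test
```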

> --conf spark.app.name=test is not working with spark-shell/pyspark
> ------------------------------------------------------------------
>                 Key: SPARK-19624
>                 URL: https://issues.apache.org/jira/browse/SPARK-19624
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, Spark Shell, Spark Submit
>    Affects Versions: 1.6.0, 2.0.0, 2.1.0
>            Reporter: Sachin Aggarwal
>            Priority: Minor
> On starting a spark-shell or pyspark shell, passing --conf 
> spark.app.name=test does not work, as --name "Spark shell" takes 
> precedence over --conf.
> Line reference for spark-shell:
> https://github.com/apache/spark/blob/master/bin/spark-shell#L53
> Similarly, line reference for pyspark:
> https://github.com/apache/spark/blob/master/bin/pyspark#L77

This message was sent by Atlassian JIRA
