[jira] [Comment Edited] (SPARK-15564) App name is the main class name in Spark streaming jobs

2016-05-26 Thread Saisai Shao (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-15564?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15303442#comment-15303442
 ] 

Saisai Shao edited comment on SPARK-15564 at 5/27/16 4:04 AM:
--

According to your description, I guess you're running the streaming application 
in yarn-cluster mode?

If so, you need to set the application name through {{--name}}, or set 
{{spark.app.name}} in the conf file or via {{--conf}}. In yarn-cluster mode the 
yarn client starts before the driver, and it sets the app name in the yarn 
{{ApplicationSubmissionContext}}; at that point the app name from the driver's 
{{SparkConf}} is not yet available, so it picks the class name instead.

So from my understanding this is by design.
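
For example (a sketch with a hypothetical jar and class name, assuming a yarn-cluster submission), the name can be passed at submission time so the yarn client sees it before the driver starts:

{code:title=submit.sh (sketch)}
# Set the app name at submission time, before the driver runs
spark-submit \
  --master yarn --deploy-mode cluster \
  --name "NDS Transform" \
  --class com.example.MyStreamingJob \
  myjob.jar

# Equivalently, via configuration:
spark-submit \
  --master yarn --deploy-mode cluster \
  --conf spark.app.name="NDS Transform" \
  --class com.example.MyStreamingJob \
  myjob.jar
{code}

Calling {{conf.setAppName(...)}} inside the driver only takes effect after the yarn application has already been submitted, which is why it doesn't show up as the yarn application name.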



> App name is the main class name in Spark streaming jobs
> ---
>
> Key: SPARK-15564
> URL: https://issues.apache.org/jira/browse/SPARK-15564
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 1.6.1
>Reporter: Steven Lowenthal
>Priority: Minor
>
> I've tried everything to set the app name to something other than the class 
> name of the job, but Spark reports the application name as the class name.  This 
> adversely affects our ability to monitor jobs; we can't have dots in the 
> reported app name. 
> {code:title=job.scala}
>   val defaultAppName = "NDS Transform"
>   conf.setAppName(defaultAppName)
>   println(s"App Name: ${conf.get("spark.app.name")}")
>   ...
>   val ssc = new StreamingContext(conf, streamingBatchWindow)
> {code}
> {code:title=output}
> App Name: NDS Transform
> {code}
> Application ID                    Name
> app-20160526161230-0017 (kill)    com.gracenote.ongo.spark.NDSStreamAvro



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Comment Edited] (SPARK-15564) App name is the main class name in Spark streaming jobs

2016-05-26 Thread Steven Lowenthal (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-15564?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15302358#comment-15302358
 ] 

Steven Lowenthal edited comment on SPARK-15564 at 5/26/16 4:34 PM:
---

It looks like it works when using the alternate constructor of {{StreamingContext}}:

{code:title=job.scala}
  val sc = new SparkContext(conf)
  val ssc = new StreamingContext(sc, streamingBatchWindow)
{code}

I suspect the issue is here: {{createNewSparkContext}} is called with a null app 
name. It should be something like {{createNewSparkContext(..., 
conf.get("spark.app.name"), ...)}}.

{code:title=StreamingContext.scala}
  def this(conf: SparkConf, batchDuration: Duration) = {
this(StreamingContext.createNewSparkContext(conf), null, batchDuration)
  }
{code}
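
For illustration only (this is a sketch of the suggestion above, not the actual Spark source, and the overload shown is hypothetical), the proposed change might look something like:

{code:title=StreamingContext.scala (suggested, sketch)}
  def this(conf: SparkConf, batchDuration: Duration) = {
    // hypothetical: forward the name already set on the SparkConf
    // instead of letting the app name fall back to null
    this(
      StreamingContext.createNewSparkContext(conf, conf.get("spark.app.name")),
      null,
      batchDuration)
  }
{code}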





