Re: How to access application name in the spark framework code.

2014-11-24 Thread Kartheek.R
Hi Deng,

Thank you. That works perfectly:)

Regards
Karthik.




-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to access application name in the spark framework code.

2014-11-24 Thread Deng Ching-Mallete
Hi,

I think it should be accessible via the SparkConf in the SparkContext.
Something like sc.getConf().get("spark.app.name")?
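For example, a minimal self-contained sketch (the app name, master URL and
object name below are just placeholders, not taken from your submit command):

import org.apache.spark.{SparkConf, SparkContext}

// Build a context with an explicit application name; when you go through
// spark-submit, spark.app.name is filled in for you instead.
object AppNameExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SparkKMeans")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Both of these return the value of spark.app.name:
    println(sc.getConf.get("spark.app.name"))
    println(sc.appName)

    sc.stop()
  }
}

Inside scheduler code the same lookup (sc.appName or sc.getConf) should work,
assuming that code already has a handle on the SparkContext.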

Thanks,
Deng

On Tue, Nov 25, 2014 at 12:40 PM, rapelly kartheek wrote:

> Hi,
>
> When I submit a Spark application like this:
>
> ./bin/spark-submit --class org.apache.spark.examples.SparkKMeans
> --deploy-mode client --master spark://karthik:7077
> $SPARK_HOME/examples/*/scala-*/spark-examples-*.jar /k-means 4 0.001
>
> Which part of the Spark framework code deals with the name of the
> application? Basically, I want to access the name of the application in
> the Spark scheduler code.
>
> Can someone please tell me where I should look for the code that deals
> with the name of the currently executing application (say, SparkKMeans)?
>
> Thank you.
>



-- 
Maria Odea "Deng" Ching-Mallete | och...@apache.org |
http://www.linkedin.com/in/oching