To add to this: conceptually, it makes no sense to launch something in
yarn-cluster mode by creating a SparkContext on the client. The whole
point of yarn-cluster mode is that the SparkContext runs on the cluster,
not on the client.
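[A sketch, not from the thread: if the goal is to trigger a yarn-cluster job from a Django view, one common workaround is to shell out to `spark-submit` from the web process instead of building a SparkContext in-process. The script path and helper names below are hypothetical.]

```python
# Hypothetical sketch: launch a PySpark job in yarn-cluster mode from a
# Python web app by invoking the spark-submit script as a subprocess.
import subprocess

def build_submit_command(script_path, app_name="DataFrameTest"):
    """Build the spark-submit invocation for a yarn-cluster PySpark job."""
    return [
        "spark-submit",
        "--master", "yarn-cluster",  # driver runs on the cluster, not here
        "--name", app_name,
        script_path,
    ]

def submit(script_path):
    # Run spark-submit and capture its output for logging in the web app.
    return subprocess.run(build_submit_command(script_path),
                          capture_output=True, text=True)
```

This keeps the web process itself out of the Spark deployment path, which is exactly what the error message is asking for.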

On Thu, Jul 9, 2015 at 2:35 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> You cannot run Spark in cluster mode by instantiating a SparkContext like
> that.
>
> You have to launch it with the "spark-submit" command line script.
>
> On Thu, Jul 9, 2015 at 2:23 PM, jegordon <jgordo...@gmail.com> wrote:
>
>> Hi to all,
>>
>> Is there any way to run pyspark scripts in yarn-cluster mode without using
>> the spark-submit script? I need it this way because I will integrate this
>> code into a Django web app.
>>
>> When I try to run any script in yarn-cluster mode I get the following
>> error:
>>
>> org.apache.spark.SparkException: Detected yarn-cluster mode, but isn't
>> running on a cluster. Deployment to YARN is not supported directly by
>> SparkContext. Please use spark-submit.
>>
>>
>> I'm creating the SparkContext in the following way:
>>
>>         conf = (SparkConf()
>>             .setMaster("yarn-cluster")
>>             .setAppName("DataFrameTest"))
>>
>>         sc = SparkContext(conf=conf)
>>
>>         #Dataframe code ....
>>
>> Thanks
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Pyspark-not-working-on-yarn-cluster-mode-tp23755.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
>
> --
> Marcelo
>
