I'm not sure about Python; I'm not an expert in that area. Based on PR
https://github.com/apache/spark/pull/8318, I believe you are correct that
Spark currently needs to be installed locally for you to be able to leverage
the pyspark package.
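
For what it's worth, one workaround is to point Python at an existing Spark
unpack rather than pip-installing anything. A minimal sketch, assuming Spark
has been extracted to /opt/spark (a hypothetical path) and that the small
findspark helper from PyPI is installed:

  import findspark
  findspark.init("/opt/spark")  # hypothetical unpack location; adds pyspark to sys.path

  import pyspark  # now importable without a pip-installed package

The same effect can be had by exporting SPARK_HOME and extending PYTHONPATH
by hand; findspark just automates that.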

On Sun, Feb 28, 2016 at 1:38 PM, moshir mikael <moshir.mik...@gmail.com>
wrote:

> OK,
> but what do I need for the program to run?
> In Python, sparkContext = SparkContext(conf) only works when you have
> Spark installed locally.
> AFAIK there is no *pyspark* package for Python that you can install by
> doing pip install pyspark.
> You actually need to install Spark to get it running (e.g.:
> https://github.com/KristianHolsheimer/pyspark-setup-guide).
>
> Does that mean you need to install Spark on the box your application runs
> on to benefit from pyspark, and that this is required just to connect to
> another remote Spark cluster?
> Am I missing something obvious?
>
>
> On Sun, Feb 28, 2016 at 7:01 PM, Todd Nist <tsind...@gmail.com> wrote:
>
>> Define your SparkConf to set the master:
>>
>>   val conf = new SparkConf().setAppName(AppName)
>>     .setMaster(SparkMaster)
>>     .set(....)
>>
>> where SparkMaster = "spark://SparkServerHost:7077". So if your Spark
>> master's hostname is "RADTech", then it would be "spark://RADTech:7077".
>>
>> Then, when you create the SparkContext, pass the SparkConf to it:
>>
>>     val sparkContext = new SparkContext(conf)
>>
>> Then use the sparkContext to interact with the Spark master / cluster.
>> Your program basically becomes the driver.
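>>
>> The same pattern from Python would look roughly like this (a sketch, not
>> tested here; it assumes the pyspark package is importable on the client
>> box, per the install discussion above):
>>
>>   from pyspark import SparkConf, SparkContext
>>
>>   # Point the conf at the standalone master, then create the context.
>>   conf = (SparkConf()
>>           .setAppName("MyApp")                 # any application name
>>           .setMaster("spark://RADTech:7077"))  # master host:port as above
>>   sc = SparkContext(conf=conf)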
>>
>> HTH.
>>
>> -Todd
>>
>> On Sun, Feb 28, 2016 at 9:25 AM, mms <moshir.mik...@gmail.com> wrote:
>>
>>> Hi, I cannot find a simple example showing how a typical application can
>>> 'connect' to a remote Spark cluster and interact with it. Let's say I have
>>> a Python web application hosted somewhere *outside* a Spark cluster, with
>>> just Python installed on it. How can I talk to Spark without using a
>>> notebook, or without using ssh to connect to a cluster master node? I know
>>> of spark-submit and spark-shell, but forking a process on a remote host to
>>> execute a shell script seems like a lot of effort. What are the
>>> recommended ways to connect to and query Spark from a remote client?
>>> Thanks!
>>>
>>
>>
