Define your SparkConf to set the master:

  import org.apache.spark.SparkConf

  val conf = new SparkConf()
    .setAppName(AppName)
    .setMaster(SparkMaster)
    .set(....)  // any additional settings you need

Where SparkMaster = "spark://SparkServerHost:7077". So if your Spark
server's hostname is "RADTech", it would be "spark://RADTech:7077".
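
Defined as a value, that looks like:

  val SparkMaster = "spark://RADTech:7077"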

Then when you create the SparkContext, pass the SparkConf to it:

    val sparkContext = new SparkContext(conf)

Then use the sparkContext to interact with the Spark master / cluster.
Your program basically becomes the driver.
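
Putting the pieces together, here's a minimal end-to-end sketch (the
object name, hostname, and the parallelize/count job are illustrative,
not from your setup):

  import org.apache.spark.{SparkConf, SparkContext}

  // Minimal standalone driver, assuming a standalone master reachable
  // at spark://RADTech:7077 (hostname is illustrative).
  object RemoteCountExample {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf()
        .setAppName("RemoteCountExample")
        .setMaster("spark://RADTech:7077")

      val sparkContext = new SparkContext(conf)
      try {
        // Distribute a small local collection across the cluster
        // and count it, just to prove the connection works.
        val count = sparkContext.parallelize(1 to 1000).count()
        println(s"Counted $count elements on the cluster")
      } finally {
        sparkContext.stop()  // release cluster resources when done
      }
    }
  }

Note that the driver machine needs the Spark libraries on its classpath
and must be network-reachable from the cluster nodes, since the
executors connect back to the driver.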

HTH.

-Todd

On Sun, Feb 28, 2016 at 9:25 AM, mms <moshir.mik...@gmail.com> wrote:

> Hi, I cannot find a simple example showing how a typical application can
> 'connect' to a remote Spark cluster and interact with it. Let's say I have
> a Python web application hosted somewhere *outside* a Spark cluster, with
> just Python installed on it. How can I talk to Spark without using a
> notebook, or using ssh to connect to a cluster master node? I know of
> spark-submit and spark-shell; however, forking a process on a remote host
> to execute a shell script seems like a lot of effort. What are the
> recommended ways to connect to and query Spark from a remote client?
> Thanks!
