Thanks, Gaurav and Corey,

Perhaps I didn’t make myself clear. I am looking for a Spark best practice 
similar to Shiny for R, where analysis/visualization results can easily be 
published to a web server and viewed in a web browser. Or is there any dashboard for Spark?

Best regards,

Cui Lin

From: gtinside <gtins...@gmail.com>
Date: Friday, January 9, 2015 at 7:45 PM
To: Corey Nolet <cjno...@gmail.com>
Cc: Cui Lin <cui....@hds.com>, "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Web Service + Spark

You can also look at Spark Job Server 
https://github.com/spark-jobserver/spark-jobserver
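
If it helps, a job in spark-jobserver is just a class that implements its SparkJob trait. Here is a rough, untested sketch based on its README; the job name and config key are placeholders:

    import com.typesafe.config.Config
    import org.apache.spark.SparkContext
    import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

    // Placeholder job: counts words passed in through the job's config.
    object WordCountJob extends SparkJob {
      // Called by the job server before running; reject bad input here.
      override def validate(sc: SparkContext, config: Config): SparkJobValidation =
        SparkJobValid

      // The job server hands every job a shared SparkContext plus the request's config.
      override def runJob(sc: SparkContext, config: Config): Any = {
        val input = config.getString("input.string")
        sc.parallelize(input.split(" ").toSeq).countByValue()
      }
    }

You package that as a jar, upload it over the job server's REST API, and kick off runs with HTTP POSTs (as described in its README), so a web front end or dashboard only has to talk HTTP rather than embed Spark itself.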

- Gaurav

On Jan 9, 2015, at 10:25 PM, Corey Nolet <cjno...@gmail.com> wrote:

Cui Lin,

The solution largely depends on how you want your services deployed (Java web 
container, Spray framework, etc.) and whether you are using a cluster manager like 
Yarn or Mesos vs. just firing up your own executors and master.

I recently worked on an example for deploying Spark services inside of Jetty 
using Yarn as the cluster manager. It forced me to learn how Spark wires up the 
dependencies/classpaths. If it helps, the example that resulted from my 
tinkering is located at [1].


[1] https://github.com/calrissian/spark-jetty-server
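
For a rough idea of the shape of this, here is a minimal, untested sketch of the pattern: one long-lived SparkContext created in yarn-client mode and shared by an embedded Jetty handler. The names, port, and toy computation are placeholders, and the Yarn/Hadoop classpath wiring that the project above deals with is left out:

    import javax.servlet.http.{HttpServletRequest, HttpServletResponse}
    import org.apache.spark.{SparkConf, SparkContext}
    import org.eclipse.jetty.server.{Request, Server}
    import org.eclipse.jetty.server.handler.AbstractHandler

    object SparkJettyService {
      def main(args: Array[String]): Unit = {
        // One long-lived SparkContext shared by every HTTP request.
        // "yarn-client" assumes the Hadoop/Yarn configs are on the classpath.
        val conf = new SparkConf().setAppName("spark-jetty-service").setMaster("yarn-client")
        val sc = new SparkContext(conf)

        val server = new Server(8080)
        server.setHandler(new AbstractHandler {
          override def handle(target: String, baseRequest: Request,
                              request: HttpServletRequest, response: HttpServletResponse): Unit = {
            // Placeholder computation; a real service would run its analysis here
            // and return JSON for whatever front end or dashboard sits on top.
            val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
            response.setContentType("text/plain")
            response.getWriter.println(s"even count = $evens")
            baseRequest.setHandled(true)
          }
        })
        server.start()
        server.join()
      }
    }

The main design point is creating the SparkContext once at startup rather than per request: context startup is expensive, and only one active context per JVM is supported.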

On Fri, Jan 9, 2015 at 9:33 PM, Cui Lin <cui....@hds.com> wrote:
Hello, All,

What’s the best practice for deploying/publishing Spark-based scientific 
applications as a web service? Something similar to Shiny for R.
Thanks!

Best regards,

Cui Lin
