Re: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread kevinkim
Hi,
I guess you might be interested in Apache Zeppelin.
It's a Spark-backed notebook & visualization tool.

http://zeppelin.incubator.apache.org
Please take a look at it!

Regards,
Kevin






--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Creating-Apache-Spark-powered-As-Service-applications-tp21193p21201.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

RE: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread Oleg Shirokikh
Thanks a lot, Robert – I’ll definitely investigate this and probably would come 
back with questions.

P.S. I’m new to this Spark forum. I’m getting responses through email, but they
are not appearing as “replies” in the thread, which is a bit inconvenient. Is
there something I should tweak?

Thanks,
Oleg

From: Robert C Senkbeil [mailto:rcsen...@us.ibm.com]
Sent: Friday, January 16, 2015 12:21 PM
To: Oleg Shirokikh
Cc: user@spark.apache.org
Subject: Re: Creating Apache Spark-powered “As Service” applications



Re: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread Robert C Senkbeil
Hi,

You can take a look at the Spark Kernel project:
https://github.com/ibm-et/spark-kernel

The Spark Kernel's goal is to serve as the foundation for interactive
applications. The project provides a client library in Scala that abstracts
connecting to the kernel (which contains a Spark Context) and can be
embedded into a web application. We demonstrated this at StrataConf, where we
embedded the Spark Kernel client into a Play application to provide an
interactive web application that communicates with Spark via the Spark Kernel
(which hosts a Spark Context).

A getting started section can be found here:
https://github.com/ibm-et/spark-kernel/wiki/Getting-Started-with-the-Spark-Kernel

If you have any other questions, feel free to email me or communicate over
our mailing list:

spark-ker...@googlegroups.com

https://groups.google.com/forum/#!forum/spark-kernel

Signed,
Chip Senkbeil
IBM Emerging Technology Software Engineer
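The Spark Kernel implements the IPython message protocol, so a client that cannot use the Scala client library can still talk to it by constructing protocol messages directly. Below is a minimal sketch of an `execute_request` body in Python; the field names follow the IPython wire protocol, and the HMAC signing and ZeroMQ framing that actually carry the message to the kernel are omitted, so treat this as an illustration rather than a complete client.

```python
import uuid

def execute_request(code, session_id, username="client"):
    """Build an IPython-protocol execute_request message as a plain dict.

    Sketch only: HMAC signing and the ZeroMQ socket framing that deliver
    this message to the kernel are omitted.
    """
    return {
        "header": {
            "msg_id": str(uuid.uuid4()),
            "msg_type": "execute_request",
            "session": session_id,
            "username": username,
            "version": "5.0",
        },
        "parent_header": {},
        "metadata": {},
        "content": {
            "code": code,  # Scala source for the kernel's interpreter
            "silent": False,
            "store_history": True,
            "user_expressions": {},
            "allow_stdin": False,
        },
    }

msg = execute_request('sc.textFile("README.md").count()', session_id="demo")
print(msg["header"]["msg_type"])  # → execute_request
```

The `code` field carries Scala for the kernel's interpreter, which is what lets a thin web client drive a remote Spark Context without linking against Spark itself.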




-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread Corey Nolet
There's also an example of running a SparkContext in a Java servlet
container, from Calrissian: https://github.com/calrissian/spark-jetty-server
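spark-jetty-server embeds one long-lived SparkContext in the servlet container and exposes it over HTTP. The essential pattern can be sketched in Python with a hypothetical stand-in for the real SparkContext (the class and function names here are illustrative, not from the project); the point is that the context is created once at start-up and shared by every request, since creating one per request would be far too expensive.

```python
import json

class FakeSparkContext:
    """Hypothetical stand-in for a real SparkContext, used only to
    illustrate the one-shared-context pattern."""
    def __init__(self, data):
        self._data = data  # in place of sc.textFile(...)

    def line_count(self):
        return len(self._data)

    def word_count(self):
        return sum(len(line.split()) for line in self._data)

# Created once when the server process starts, then shared by all handlers.
CONTEXT = FakeSparkContext(["hello spark", "hello servlet"])

def handle_stats_request(ctx):
    """What a servlet's doGet would do: run jobs on the shared context
    and serialize the result for the HTTP response body."""
    return json.dumps({"lines": ctx.line_count(), "words": ctx.word_count()})

print(handle_stats_request(CONTEXT))  # → {"lines": 2, "words": 4}
```

In the real project the same role is played by Jetty handlers wrapping an actual SparkContext; the serialization boundary (JSON over HTTP) is what lets any desktop or web client consume the results.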



Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread olegshirokikh
The question is about ways to create a Windows desktop-based and/or
web-based client application that can connect to and talk to a server
running a Spark application (either locally or in an on-premise cloud
distribution) at run time.

Any language/architecture may work. So far I've seen two things that may
help, but I'm not yet sure whether they are the best alternative or how
they work:

1) Spark Job Server - https://github.com/spark-jobserver/spark-jobserver -
defines a REST API for Spark
2) Hue -
http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/
- uses item 1)
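For item 1), the Job Server is driven entirely over plain HTTP, so a client in any language only needs to build URLs and POST to them. A sketch of building the job-submission URL in Python follows; the endpoint shape and parameter names (`appName`, `classPath`, `context`, `sync`) follow the spark-jobserver README, but verify them against the version you deploy.

```python
from urllib.parse import urlencode

def submit_job_url(base, app_name, class_path, context=None, sync=False):
    """Build the Spark Job Server job-submission URL.

    POSTing to this URL starts the job; the jar itself is uploaded
    beforehand with POST /jars/<appName>, and results are polled with
    GET /jobs/<jobId> (per the spark-jobserver README).
    """
    params = {"appName": app_name, "classPath": class_path}
    if context:
        params["context"] = context
    if sync:
        params["sync"] = "true"
    return "%s/jobs?%s" % (base.rstrip("/"), urlencode(params))

# Job Server listens on port 8090 by default.
url = submit_job_url("http://localhost:8090", "test",
                     "spark.jobserver.WordCountExample", sync=True)
print(url)
```

A desktop or web client would pair this with any HTTP library; nothing Spark-specific runs on the client side, which is the main attraction of this architecture.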

Any advice would be appreciated. An ideal answer would be a simple toy
example program (or steps) showing, e.g., how to build such a client that
creates a Spark Context on a local machine, reads a text file, and returns
basic stats.
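A sketch of such a toy example: the "basic stats" logic is plain Python, and the guarded block at the bottom shows how the same numbers would be computed from a local Spark Context via pyspark (assumed installed; that part is a sketch and not exercised here).

```python
def basic_stats(lines):
    """Line/word/character counts for a sequence of text lines."""
    words = [w for line in lines for w in line.split()]
    return {
        "lines": len(lines),
        "words": len(words),
        "chars": sum(len(line) for line in lines),
    }

if __name__ == "__main__":
    try:
        # Assumes a local Spark installation with pyspark on the path.
        from pyspark import SparkContext
        sc = SparkContext("local", "basic-stats")
        rdd = sc.textFile("input.txt")
        stats = {
            "lines": rdd.count(),
            "words": rdd.flatMap(lambda line: line.split()).count(),
            "chars": rdd.map(len).sum(),
        }
        print(stats)
        sc.stop()
    except ImportError:
        # No Spark available: same stats, computed locally.
        print(basic_stats(["hello spark", "hello world"]))
```

Wrapping the Spark branch behind an HTTP endpoint (or submitting it through the Job Server above) is what turns this into the "as a service" shape the question asks about.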




-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org