zeppelin + helium activation

2016-10-18 Thread Vikash Kumar
Hi all, Congratulations on the latest release. I went through the Zeppelin Helium proposal, which looks great and more advanced, and it will put Zeppelin far ahead of other notebooks. While exploring on the internet I came across a presentation on Helium [1] where slide 24 shows a UI to plug-i

Re: zeppelin + helium activation

2016-10-18 Thread moon soo Lee
Hi, Thanks for your interest in the Helium framework. It still has a long way to go. If you want to try it, you can build the master branch with the '-Pexamples' flag. You'll see the Helium launcher icon when you create either a java.util.Date object or a table result in a paragraph. I.e., try '%spark new java.util.date
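The build step described above can be sketched as follows. Only the `-Pexamples` profile comes from the email; the clone URL and the `-DskipTests` flag are typical Maven usage added here as assumptions, not confirmed by the thread.

```shell
# Build Zeppelin's master branch with the Helium examples profile enabled.
# -DskipTests speeds up the build (a common Maven convention; assumption, not from the email).
git clone https://github.com/apache/zeppelin.git
cd zeppelin
mvn clean package -Pexamples -DskipTests
```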

Re: Regarding Zeppelin upgrade

2016-10-18 Thread Ahyoung Ryu
Hi Dipesh, It would be better if you could provide more information to us. Which version of Zeppelin did you install on your cluster? How did you install Zeppelin? From source, or from a prebuilt binary package? Thanks, Ahyoung On Sat, Oct 15, 2016 at 8:43 PM, Dipesh Vora wrote: > Hi All, > > > > I hav

JDBC Connections

2016-10-18 Thread Benjamin Kim
We are using Zeppelin 0.6.0 as a self-service for our clients to query our PostgreSQL databases. We have noticed that the connections are not closed after each client is done. What is the normal operating procedure to have these connections close when idle? Our scope for the JDBC interpre

Re: JDBC Connections

2016-10-18 Thread Hyung Sung Shim
Hello. AFAIK the connections are not closed until the JDBC interpreter is restarted, so https://github.com/apache/zeppelin/pull/1396 uses a connection pool to control sessions. 2016-10-19 2:43 GMT+09:00 Benjamin Kim : > We are using Zeppelin 0.6.0 as a self-service for our clients to query our > PostgreSQL
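The connection-pool idea mentioned above can be illustrated with a toy sketch. This is not Zeppelin's actual code from the linked PR; `ToyPool` and its string "connections" are hypothetical stand-ins showing why pooling avoids piling up open connections: each query borrows a connection and returns it, so subsequent queries reuse it instead of opening a new one.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy illustration of the pooling idea: borrow a connection, return it when done,
// so the number of open connections stays bounded instead of growing per query.
class ToyPool {
    private final Deque<String> idle = new ArrayDeque<>();
    private int created = 0;

    synchronized String borrow() {
        // Reuse an idle connection if one exists; otherwise "open" a new one.
        return idle.isEmpty() ? "conn-" + (++created) : idle.pop();
    }

    synchronized void release(String conn) {
        idle.push(conn); // return to the pool instead of leaving it open and orphaned
    }

    synchronized int openCount() {
        return created;
    }
}

public class PoolDemo {
    public static void main(String[] args) {
        ToyPool pool = new ToyPool();
        String first = pool.borrow();   // "opens" conn-1
        pool.release(first);            // returned to the pool
        String second = pool.borrow();  // reuses conn-1, no new connection opened
        System.out.println(first.equals(second) && pool.openCount() == 1);
    }
}
```

Running `PoolDemo` prints `true`: two queries were served with a single underlying connection.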

Netty error with spark interpreter

2016-10-18 Thread Vikash Kumar
Hi all, I am trying Zeppelin with Spark, which throws the following error related to Netty jar conflicts. I checked my classpath carefully. There are only single versions of the netty-3.8.0 and netty-all-4.0.29-Final jars. Other information: Spark 2.0.0
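When debugging conflicts like this, one quick check (independent of this thread; a generic diagnostic, not a fix from the list) is to ask the JVM where it actually loaded a Netty class from. If the printed jar path is not the one you expect, a second copy is shadowing it somewhere on the classpath. `io.netty.buffer.ByteBuf` lives in netty-all 4.x; netty 3.x classes are under `org.jboss.netty` instead.

```java
// Print the jar a Netty 4.x class was loaded from, to spot duplicate/shadowing jars.
public class WhichNetty {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("io.netty.buffer.ByteBuf");
            // The code source location is the jar (or directory) the class came from.
            System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("netty-all (io.netty) not on classpath");
        }
    }
}
```

The same check with `org.jboss.netty.channel.Channel` locates the Netty 3.x jar.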