Change interpreter execution root path

2018-10-20 Thread Jhon Anderson Cardenas Diaz
Hi! Do you know how I can change the folder path where the interpreters are executed? The reason I want to change that default location (which is $ZEPPELIN_HOME) is that we are getting very large core dump files in that location when the interpreter process dies. As we are in a k8s
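
Background for the question: core files land in the interpreter process's working directory (under the default kernel core_pattern), and that directory is fixed by the parent at launch time. A minimal, hypothetical Java illustration of the mechanism (not Zeppelin's actual launcher code; the script and target path are placeholders):

    import java.io.File;
    import java.io.IOException;

    public class LaunchInterpreter {
        public static void main(String[] args) throws IOException {
            ProcessBuilder pb = new ProcessBuilder("bin/interpreter.sh");
            // Point the child's working directory away from $ZEPPELIN_HOME,
            // e.g. to an emptyDir volume in a k8s pod (placeholder path).
            pb.directory(new File("/tmp/zeppelin-interpreter-work"));
            pb.inheritIO();
            pb.start();
        }
    }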

Re: ConfigStorage - Credentials persistence

2018-10-04 Thread Jhon Anderson Cardenas Diaz
ion files and notebook files will be stored in the same storage layer. Jhon Anderson Cardenas Diaz wrote on Fri, Oct 5, 2018 at 6:21 AM: Also maybe the *helium configuration* should be included in the ConfigStorage component, in order to be persisted like other

Re: ConfigStorage - Credentials persistence

2018-10-04 Thread Jhon Anderson Cardenas Diaz
Also maybe the *helium configuration* should be included in the ConfigStorage component, in order to be persisted like other configurations. On Thu, Oct 4, 2018 at 16:44, Jhon Anderson Cardenas Diaz (<jhonderson2...@gmail.com>) wrote: Hi! Currently in the ConfigSt

ConfigStorage - Credentials persistence

2018-10-04 Thread Jhon Anderson Cardenas Diaz
Hi! Currently in the ConfigStorage component there are methods to persist and retrieve the credentials of the Zeppelin users: public abstract String loadCredentials() throws IOException; public abstract void saveCredentials(String credentials) throws IOException; But those methods are not being
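
For reference, a minimal sketch of what an implementation of those two methods could look like, assuming only the abstract signatures quoted above; the file-based storage and its path are illustrative, not Zeppelin's own implementation:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class FileCredentialsStorage /* extends ConfigStorage */ {
        // Illustrative location; a real storage layer could be S3, git, etc.
        private final Path location = Paths.get("conf", "credentials.json");

        public String loadCredentials() throws IOException {
            if (!Files.exists(location)) {
                return "";
            }
            return new String(Files.readAllBytes(location), StandardCharsets.UTF_8);
        }

        public void saveCredentials(String credentials) throws IOException {
            Files.createDirectories(location.getParent());
            Files.write(location, credentials.getBytes(StandardCharsets.UTF_8));
        }
    }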

Re: Paragraph outputs from other notebooks/paragraphs

2018-08-02 Thread Jhon Anderson Cardenas Diaz
at 20:07, Jeff Zhang () wrote: This is the first time I see a user reporting this issue; what interpreter do you use? Is it easy to reproduce? Jhon Anderson Cardenas Diaz wrote on Fri, Aug 3, 2018 at 12:34 AM: Hi! Has someone else experi

Paragraph outputs from other notebooks/paragraphs

2018-08-02 Thread Jhon Anderson Cardenas Diaz
Hi! Has someone else experienced this problem? Sometimes *when a paragraph is executed it shows random output from another notebook* (from other users as well). We are using Zeppelin 0.7.3, and Spark and all other interpreters are configured in "Per User - Scoped" mode. Regards.

New Spark Interpreter - Spark UI Url

2018-07-18 Thread Jhon Anderson Cardenas Diaz
Hi! Is there any way to configure a custom Spark UI URL in the new Spark interpreter implementation? That feature was introduced in https://issues.apache.org/jira/browse/ZEPPELIN-2949, and it works in the old Spark interpreter but is not working in the new one. Regards.

Zeppelin starting time - tied to notebook loading

2018-07-04 Thread Jhon Anderson Cardenas Diaz
Hi! Right now the Zeppelin starting time depends directly on the time it takes to load the notebooks from the repository. If the user has a lot of notebooks (e.g. more than 1000), the starting time becomes too long. Is there some plan to reimplement this notebook loading so that it is done

Re: All PySpark jobs are canceled when one user cancels his PySpark paragraph (job)

2018-06-12 Thread Jhon Anderson Cardenas Diaz
1. Use per-user scoped mode so that each user owns his own Python process. 2. Use the IPySparkInterpreter of Zeppelin 0.8, which is better for integrating Python with Zeppelin. Jhon Anderson Cardenas Diaz wrote on Wed, Jun 13, 2018 at 6:15 AM: Hi!

Re: All PySpark jobs are canceled when one user cancels his PySpark paragraph (job)

2018-06-12 Thread Jhon Anderson Cardenas Diaz
be cancelled. context.py: # create a signal handler which would be invoked on receiving SIGINT def signal_handler(signal, frame): *self.cancelAllJobs()* raise KeyboardInterrupt() Is this a Zeppelin bug? Thank you. 2018-06-12 17:12 GMT-05:00 Jhon Anderson Cardenas Diaz <jhonderso

Re: All PySpark jobs are canceled when one user cancels his PySpark paragraph (job)

2018-06-12 Thread Jhon Anderson Cardenas Diaz
g SIGINT def signal_handler(signal, frame): self.cancelAllJobs() raise KeyboardInterrupt() 2018-06-12 9:26 GMT-05:00 Jhon Anderson Cardenas Diaz <jhonderson2...@gmail.com>: Hi! I have the 0.8.0 version, from September 2017. 2018-06-12 4:48 GMT-05:00 Jianfeng (Jeff)

Re: All PySpark jobs are canceled when one user cancels his PySpark paragraph (job)

2018-06-12 Thread Jhon Anderson Cardenas Diaz
Hi! I have the 0.8.0 version, from September 2017. 2018-06-12 4:48 GMT-05:00 Jianfeng (Jeff) Zhang: Which version do you use? Best Regards, Jeff Zhang. From: Jhon Anderson Cardenas Diaz <jhonderson2...@gmail.com> Reply-To:

All PySpark jobs are canceled when one user cancels his PySpark paragraph (job)

2018-06-08 Thread Jhon Anderson Cardenas Diaz
Dear community, we are currently having problems with multiple users running paragraphs associated with PySpark jobs. The problem is that if a user aborts/cancels his PySpark paragraph (job), the active PySpark jobs of the other users are canceled too. Going into detail, I've seen that when
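
The usual Spark-side way to scope cancellation to one user is job groups rather than SparkContext.cancelAllJobs(). A minimal Java sketch using the standard Spark API; the user/group ids are illustrative:

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class PerUserCancellation {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("demo").setMaster("local[2]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Tag every job submitted on this thread with the user's group id.
            sc.setJobGroup("user-alice", "alice's paragraph", true);
            sc.parallelize(Arrays.asList(1, 2, 3)).count();

            // Cancels only jobs tagged "user-alice"; other users' groups keep running,
            // unlike cancelAllJobs(), which kills everything on the shared context.
            sc.cancelJobGroup("user-alice");
            sc.close();
        }
    }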

Re: Zeppelin code can access FileSystem

2018-05-10 Thread Jhon Anderson Cardenas Diaz
would think that is another security issue of this approach. What do you think about it? 2018-05-09 12:53 GMT-05:00 Jhon Anderson Cardenas Diaz <jhonderson2...@gmail.com>: -- Forwarded message - From: Sam Nicholson <sam...@ogt11.com> Date: Wed., May

Zeppelin code can access FileSystem

2018-05-08 Thread Jhon Anderson Cardenas Diaz
Dear Zeppelin Community, currently when a Zeppelin paragraph is executed, the code in it can read sensitive config files and change them, including web app pages, etc. Like in this example: %python f = open("/usr/zeppelin/conf/credentials.json", "r") f.read() Do you know if there is a way to

Filter for Zeppelin Notebook Server (Websocket)

2018-04-25 Thread Jhon Anderson Cardenas Diaz
Hi! I am trying to implement a filter inside Zeppelin in order to intercept the requests and collect metrics about Zeppelin performance. I registered the javax servlet filter in zeppelin-web/src/WEB-INF/web.xml, and the filter works well for REST requests, but it does not intercept the
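
For reference, a minimal javax.servlet.Filter sketch for timing requests (the metric sink here is just stdout). A plausible explanation for the behavior above is that WebSocket connections are upgraded off the normal servlet pipeline, so a filter like this sees REST traffic but not the Notebook Server messages:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;

    public class MetricsFilter implements Filter {
        @Override
        public void init(FilterConfig config) {}

        @Override
        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            long start = System.nanoTime();
            try {
                chain.doFilter(req, res);  // continue with the normal pipeline
            } finally {
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                System.out.println("request took " + elapsedMs + " ms");
            }
        }

        @Override
        public void destroy() {}
    }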

[jira] [Created] (ZEPPELIN-3419) Potential dependency conflict when the version of a dependency is changed on zeppelin interpreters

2018-04-20 Thread Jhon Anderson Cardenas Diaz (JIRA)
Jhon Anderson Cardenas Diaz created ZEPPELIN-3419: Summary: Potential dependency conflict when the version of a dependency is changed on zeppelin interpreters Key: ZEPPELIN-3419 URL: https

Re: Zeppelin - Spark Driver location

2018-03-13 Thread Jhon Anderson Cardenas Diaz
owse/ZEPPELIN-2898 was merged end of September, so not sure if you have that. Check out https://medium.com/@zjffdu/zeppelin-0-8-0-new-features-ea53e8810235 for how to set this up. -- Ruslan Dautkhanov. On Tue, Mar 13, 2018 at 5:24 PM, Jhon Anderson Card

Zeppelin - Spark Driver location

2018-03-13 Thread Jhon Anderson Cardenas Diaz
Hi Zeppelin users! I am working with Zeppelin pointing to a Spark standalone cluster. I am trying to figure out a way to make Zeppelin run the Spark driver outside of the client process that submits the application. According to the documentation (
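
For a standalone master, driver placement is governed by the deploy mode: in "cluster" mode the master starts the driver on a worker, outside the submitting client process. A minimal sketch with Spark's launcher API; the master URL, jar path, and main class are placeholders:

    import org.apache.spark.launcher.SparkLauncher;

    public class ClusterModeSubmit {
        public static void main(String[] args) throws Exception {
            Process spark = new SparkLauncher()
                .setMaster("spark://master-host:7077")  // placeholder URL
                .setDeployMode("cluster")               // driver runs on a worker
                .setAppResource("/path/to/app.jar")     // placeholder path
                .setMainClass("com.example.Main")       // placeholder class
                .launch();
            spark.waitFor();
        }
    }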

Unmodifiable interpreter properties

2018-03-02 Thread Jhon Anderson Cardenas Diaz
Hi fellow Zeppelin users. I would like to know if there is a way in Zeppelin to set interpreter properties that cannot be changed by the user from the graphical interface. An example use case in which this can be useful: we want Zeppelin users not to be able to kill jobs from the Spark UI; for
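
Conceptually this amounts to a property store that rejects edits to locked keys. A hypothetical Java sketch of the idea (Zeppelin is not shown here to support this; spark.ui.killEnabled is the Spark property that controls the kill links in the UI):

    import java.util.Collections;
    import java.util.Properties;
    import java.util.Set;

    public class LockedProperties extends Properties {
        // Keys that user edits may not override (illustrative choice).
        private final Set<String> locked = Collections.singleton("spark.ui.killEnabled");

        @Override
        public synchronized Object setProperty(String key, String value) {
            // Allow the initial admin-provided value; reject later changes.
            if (locked.contains(key) && containsKey(key)) {
                throw new UnsupportedOperationException(key + " is locked");
            }
            return super.setProperty(key, value);
        }
    }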

Re: Extending SparkInterpreter functionality

2018-02-02 Thread Jhon Anderson Cardenas Diaz
o cater to multiple clients. So, multiple Zeppelin instances, multiple Spark clusters, multiple Spark UIs, and on top of that maintaining security and privacy in a shared multi-tenant env will need all the flexibility we can get! Thank

Extending SparkInterpreter functionality

2018-02-01 Thread Jhon Anderson Cardenas Diaz
Hello! I'm a software developer, and as part of a project I need to extend the functionality of SparkInterpreter without modifying it. Instead, I need to create a new interpreter that extends it or wraps its functionality. I also need the Spark sub-interpreters to use my new custom interpreter, but
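
A minimal sketch of the wrapper approach; the Zeppelin interpreter API is assumed to be on the classpath, and the signatures are approximated from the 0.8 line, so check them against your Zeppelin version:

    import java.util.Properties;
    import org.apache.zeppelin.interpreter.InterpreterContext;
    import org.apache.zeppelin.interpreter.InterpreterException;
    import org.apache.zeppelin.interpreter.InterpreterResult;
    import org.apache.zeppelin.spark.SparkInterpreter;

    public class CustomSparkInterpreter extends SparkInterpreter {

        public CustomSparkInterpreter(Properties properties) {
            super(properties);
        }

        @Override
        public InterpreterResult interpret(String code, InterpreterContext context)
                throws InterpreterException {
            // Add custom behavior here (auditing, code rewriting, metrics),
            // then delegate to the stock Spark interpreter.
            return super.interpret(code, context);
        }
    }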