Re: Setting Zeppelin to work with multiple Hadoop clusters when running Spark.

2017-03-26 Thread Serega Sheypak
le HADOOP_CONF_DIR under one jvm classpath. Only one > default configuration will be used. > > > Best Regard, > Jeff Zhang > > > From: Serega Sheypak <serega.shey...@gmail.com> > Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org> >

Re: Setting Zeppelin to work with multiple Hadoop clusters when running Spark.

2017-03-26 Thread Serega Sheypak
I know it, thanks, but it's not a reliable solution. 2017-03-26 5:23 GMT+02:00 Jianfeng (Jeff) Zhang <jzh...@hortonworks.com>: > > You can try to specify the namenode address for the hdfs file, e.g. > > spark.read.csv(“hdfs://localhost:9009/file”) > > Best Regard, > Je
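A minimal sketch of the workaround Jeff describes, for a %spark paragraph (assuming Spark 2.x, where the spark session variable is available; the namenode addresses below are hypothetical placeholders, and with NN HA you would normally use the nameservice name from hdfs-site.xml instead of host:port):

    // Read from cluster A and write to cluster B by fully qualifying the paths.
    val df = spark.read.csv("hdfs://nn-cluster-a:8020/data/input/file.csv")
    df.write.parquet("hdfs://nn-cluster-b:8020/data/output/")

As the thread notes, this only works cleanly when a single namenode address is stable, which is why it is a poor fit for HA setups.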

What does zeppelin try to do during web application startup?

2017-03-26 Thread Serega Sheypak
Hi, I'm trying to run Zeppelin 0.8.0-SNAPSHOT in Docker. Startup takes forever. It starts in seconds when launched on the host, but not in a Docker container. I suspect the Docker container has a poorly configured network and some part of Zeppelin tries to reach a remote resource. SLF4J: See

Setting Zeppelin to work with multiple Hadoop clusters when running Spark.

2017-03-25 Thread Serega Sheypak
Hi, I have three Hadoop clusters. Each cluster has its own NN HA configured and YARN. I want to allow a user to read from any cluster and write to any cluster. The user should also be able to choose where to run his Spark job. What is the right way to configure this in Zeppelin?

Preconfigure Spark interpreter

2017-04-22 Thread Serega Sheypak
Hi, I need to pre-configure spark interpreter with my own artifacts and internal repositories. How can I do it?

Re: Preconfigure Spark interpreter

2017-04-22 Thread Serega Sheypak
eter.json of the Zeppelin > installation will be changed. > > On Sat, Apr 22, 2017, 11:35 Serega Sheypak <serega.shey...@gmail.com> > wrote: > >> Hi, I need to pre-configure spark interpreter with my own artifacts and >> internal repositories. How can I do it? >> >

Custom spark for zeppelin and interpreter-list

2017-04-22 Thread Serega Sheypak
Hi, I have a few concerns I can't resolve right now. I could certainly go through the source code and find the solution, but I would like to understand the idea behind it. I'm building Zeppelin from sources using 0.8.0-SNAPSHOT. I build it with a custom Cloudera CDH Spark 2.0-something. I can't

Re: Is there any possibility to get link to Spark YARN application master from notebook

2017-07-10 Thread Serega Sheypak
Never mind, I forgot that it's in the interpreter settings https://cloud.githubusercontent.com/assets/5082742/20110797/c6852202-a60b-11e6-8264-93437a58f752.gif 2017-07-10 10:46 GMT+02:00 Serega Sheypak <serega.shey...@gmail.com>: > Super stupid question, sorry. > I can't find button / l

Re: Interpreter %sq not found Zeppelin swallows last "l" for some reason...?

2017-07-10 Thread Serega Sheypak
all my problems. 2017-07-10 20:37 GMT+02:00 Jongyoul Lee <jongy...@gmail.com>: > Thanks for telling me that. I'll also test it with Chrome. Do you use > it on Windows? I've never heard about it, so I'm just asking to find > a clue. > > On Mon, 10 Jul 2017 at

Re: Zeppelin without internet, speedup startup

2017-06-29 Thread Serega Sheypak
by end-users like data analysts. 2017-06-29 21:11 GMT+02:00 Иван Шаповалов <shapovalov.iva...@gmail.com>: > if you use helium - it will install npm at start time. See > HeliumVisualizationFactory.java > > 2017-06-29 17:09 GMT+03:00 Serega Sheypak <serega.shey...@gma

Re: java.lang.ClassNotFoundException: org.apache.zeppelin.spark.SparkInterpreter with 0.7.2 binary

2017-06-29 Thread Serega Sheypak
solved. I misunderstood how update works. 2017-06-29 21:14 GMT+02:00 Иван Шаповалов <shapovalov.iva...@gmail.com>: > looks like you create an interpreter setting via rest api and it is > configured well enough > > 2017-06-29 18:32 GMT+03:00 Serega Sheypak <serega.shey...@gmail.co

Re: Hitting strange NPE

2017-06-29 Thread Serega Sheypak
Hi, resolved. Root cause: I had recompiled Zeppelin for Scala 2.11 and used Spark 2.0 compiled for Scala 2.11, but the external artifacts were compiled for Scala 2.10. I provided the correct external artifacts and Zeppelin started to work. 2017-06-26 22:49 GMT+02:00 Serega Sheypak <serega.shey...@gmail.

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
instance. What am I missing? Thanks! 2017-06-30 16:43 GMT+02:00 Jeff Zhang <zjf...@gmail.com>: > > Right, create three spark interpreters for your 3 yarn clusters. > > > > Serega Sheypak <serega.shey...@gmail.com> wrote on Fri, Jun 30, 2017 at 10:33 PM: > >> Hi, thanks for your reply

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
16:21 GMT+02:00 Jeff Zhang <zjf...@gmail.com>: > > Try setting HADOOP_CONF_DIR for each yarn conf in the interpreter setting. > > Serega Sheypak <serega.shey...@gmail.com> wrote on Fri, Jun 30, 2017 at 10:11 PM: > >> Hi, I have several different hadoop clusters, each of them has it
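A quick way to check which HADOOP_CONF_DIR a given spark interpreter actually picked up is to print the resolved Hadoop properties from a paragraph bound to that interpreter (a sketch; fs.defaultFS and yarn.resourcemanager.address are standard Hadoop/YARN keys):

    // %spark paragraph: shows which cluster this interpreter's SparkContext points at.
    println(sc.hadoopConfiguration.get("fs.defaultFS"))
    println(sc.hadoopConfiguration.get("yarn.resourcemanager.address"))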

Re: Zeppelin REST api for interpreters

2017-06-28 Thread Serega Sheypak
Ah, it's there, thanks! 2017-06-28 12:44 GMT+02:00 Иван Шаповалов <shapovalov.iva...@gmail.com>: > for 3.2 https://zeppelin.apache.org/docs/0.7.2/rest-api/rest- > interpreter.html should work > > 2017-06-28 12:14 GMT+03:00 Serega Sheypak <serega.shey...@gmail.com>: >

Zeppelin REST api for interpreters

2017-06-28 Thread Serega Sheypak
Hi, I'm reading https://zeppelin.apache.org/docs/0.7.2/rest-api/rest-notebook.html It has a great REST API for notebooks and paragraphs. I'm looking for the interpreter configuration. I want to automate Zeppelin deployment and I need to: 1. put the zeppelin war on a node (done) 2. start the war and connect to

Re: Is Zeppelin spark-version agnostic?

2017-06-27 Thread Serega Sheypak
Hi Jeff! Am I right that I don't have to recompile Zeppelin for Scala 2.11 to make it work with Spark 2.0 compiled for Scala 2.11? That Zeppelin doesn't really care about the Spark Scala version or the Spark version overall (1.6 ... 2.0)? Thanks! 2017-06-27 18:08 GMT+02:00 Serega Sheypak <serega.s

Zeppelin without internet, speedup startup

2017-06-29 Thread Serega Sheypak
Hi, I'm starting Zeppelin without internet access. It looks like it tries to access some external resources. Is that true? Can I stop it somehow? It takes 2 minutes to start. I failed to find it in the source code. Thanks!

Customizing sparkconfig before starting spark app

2017-04-26 Thread Serega Sheypak
Hi, I have a few questions about Spark application customization: 1. Is it possible to set the Spark app name from the notebook, not from the Zeppelin conf? 2. Is it possible to register custom Kryo serializers? 3. Is it possible to configure the user name? Right now I'm running Zeppelin as root and all jobs are
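For questions 1 and 2, a hedged sketch of what the settings amount to, expressed as plain SparkConf calls; in Zeppelin these would normally be set as spark interpreter properties rather than in the notebook, because the SparkContext already exists when a paragraph runs. com.example.MyKryoRegistrator is a hypothetical class name:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("my-zeppelin-notebook") // question 1: the application name
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "com.example.MyKryoRegistrator") // question 2: custom Kryo serializers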

How to debug spark.dep job?

2017-04-27 Thread Serega Sheypak
Hi, it seems I was able to start Zeppelin. I have an in-house Artifactory and I want Zeppelin to download my artifacts from Artifactory and use the classes in a Spark job afterwards. Notebook submission hangs on %spark.dep and never finishes. Zeppelin logs that the DepInterpreter job has been
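For reference, a typical %spark.dep paragraph against an internal repository looks roughly like the sketch below; z.reset, z.addRepo and z.load are the documented dep-interpreter calls, while the repository name, URL and artifact coordinates are placeholders. If the default remote repositories are unreachable from the host, resolution can block for a long time and look like a hang:

    %spark.dep
    // Must run before the spark interpreter starts in this note.
    z.reset()
    z.addRepo("inhouse").url("https://artifactory.example.com/libs-release")
    z.load("com.example:my-artifact:1.0.0")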

Re: Custom spark for zeppelin and interpreter-list

2017-04-24 Thread Serega Sheypak
> > Hope this helps. > > Best, > moon > > > On Sat, Apr 22, 2017 at 1:04 PM Serega Sheypak <serega.shey...@gmail.com> > wrote: > >> Hi, I have few concerns I can't resolve right now. I definitely can go >> though the source code and find the solution,

Interpreter %sq not found Zeppelin swallows last "l" for some reason...?

2017-06-26 Thread Serega Sheypak
Hi, I get super weird exception: ERROR [2017-06-26 07:44:17,523] ({qtp2016336095-99} NotebookServer.java[persistAndExecuteSingleParagraph]:1749) - Exception from run org.apache.zeppelin.interpreter.InterpreterException: paragraph_1498480084440_1578830546's Interpreter %sq not found I have three

Hitting strange NPE

2017-06-26 Thread Serega Sheypak
Hi, I'm getting a strange NPE without any obvious reason. My notebook contains two paragraphs: res0: org.apache.zeppelin.dep.Dependency = org.apache.zeppelin.dep.Dependency@6ce5acd %spark.dep z.load("some-local-jar.jar") and import com.SuperClass // bla-bla val features =

Re: NPE in SparkInterpreter.java

2017-06-26 Thread Serega Sheypak
Hi, I have more or less the same symptom if (Utils.isScala2_10()) { binder = (Map) getValue("_binder"); } else { binder = (Map) getLastObject(); } binder.put("sc", sc); // EXCEPTION HERE java.lang.NullPointerException at

Re: Hitting strange NPE

2017-06-26 Thread Serega Sheypak
Serega Sheypak <serega.shey...@gmail.com>: > Ok, seems like something is wrong when you try to use deps. I was able to run > a simple spark job w/o third-party dependencies. > Zeppelin always throws an NPE when you try to use local files using %spark.dep > or the spark interpreter conf (there i

Re: Hitting strange NPE

2017-06-26 Thread Serega Sheypak
:31 GMT+02:00 Serega Sheypak <serega.shey...@gmail.com>: > Hi, I'm getting strange NPE w/o any obvious reason. > > My notebook contains two paragraphs: > > > res0: org.apache.zeppelin.dep.Dependency = org.apache.zeppelin.dep. > Dependency@6ce5acd > > %spa

Re: Is Zeppelin spark-version agnostic?

2017-06-27 Thread Serega Sheypak
spark is installed. > > > > Serega Sheypak <serega.shey...@gmail.com> wrote on Tue, Jun 27, 2017 at 6:14 PM: > >> Hi, can the zeppelin spark interpreter support Spark 1.6 / 2.0 / 2.1? >> I didn't find which Spark versions are supported... >> >

Re: NPE in SparkInterpreter.java

2017-06-27 Thread Serega Sheypak
It was my fault, I'm so sorry. I had recompiled Zeppelin for Scala 2.11 to make it run with Cloudera Spark 2.0 and used Scala 2.10 third-party libs. I replaced them with 2.11 versions and it started to work. Tue, 27 Jun 2017 at 9:52, Serega Sheypak <serega.shey...@gmail.com>: > Hi,

Re: NPE in SparkInterpreter.java

2017-06-27 Thread Serega Sheypak
how to prevent it? > > I am trying to make a strong case for my company to switch from another > notebook application to Zeppelin; Zeppelin looks good and only this issue > concerns me. > > I'm looking forward to any insights, thanks. > On Monday, June 26, 2017, 11:56:45 AM

Can't run simple example with scala and spark SQL. Some non obvious syntax error in SQL

2017-05-02 Thread Serega Sheypak
Here is my sample notebook: %spark val linesText = sc.textFile("hdfs://cluster/user/me/lines.txt") case class Line(id:Long, firstField:String, secondField:String) val lines = linesText.map{ line => val splitted = line.split(" ") println("splitted => " + splitted)
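A hedged reconstruction of the usual pattern that makes such data visible to a following %sql paragraph, assuming Spark 2.x (on 1.6 it would be registerTempTable instead of createOrReplaceTempView):

    %spark
    import spark.implicits._
    val linesText = sc.textFile("hdfs://cluster/user/me/lines.txt")
    case class Line(id: Long, firstField: String, secondField: String)
    val lines = linesText.map { line =>
      val splitted = line.split(" ")
      Line(splitted(0).toLong, splitted(1), splitted(2))
    }
    // Expose the RDD as a temp view so the %sql interpreter can see it.
    lines.toDF().createOrReplaceTempView("lines")

A following %sql paragraph can then run: select * from lines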

Re: Can't run simple example with scala and spark SQL. Some non obvious syntax error in SQL

2017-05-02 Thread Serega Sheypak
plitted) Line(splitted(0).toLong, splitted(1), splitted(2)) } lines.collect().foreach(println) prints the file contents to the UI. I have some trouble with sql... 2017-05-02 13:57 GMT+02:00 Serega Sheypak <serega.shey...@gmail.com>: > Here is my sample notebook: > %spark > val linesT

Is Zeppelin spark-version agnostic?

2017-06-27 Thread Serega Sheypak
Hi, can the zeppelin spark interpreter support Spark 1.6 / 2.0 / 2.1? I didn't find which Spark versions are supported...

sql paragraph doesn't see my 3rd party jars

2017-10-07 Thread Serega Sheypak
Hi, I'm trying to use spark and sql paragraphs with 3rd-party jars added to the spark interpreter configuration. My spark code works fine. My sql paragraph fails with a class not found exception: %sql create external table MY_TABLE row format serde 'com.my.MyAvroSerde' with serdeproperties
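One hedged workaround, in addition to listing the serde jar among the spark interpreter's dependencies, is to register the jar at the SQL session level from a spark paragraph; ADD JAR is a standard Spark SQL (Hive) command, and the path below is a placeholder:

    %spark
    // Make the custom serde class visible to the Hive-backed %sql paragraphs.
    spark.sql("ADD JAR hdfs:///user/me/libs/my-avro-serde.jar")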

Can't use newly created interpreter, Zeppelin 0.7.2 paragraph_XXX's Interpreter spark_my not found

2017-10-23 Thread Serega Sheypak
Hi, I've created a spark interpreter with the name spark_my. I was able to restart it and Zeppelin shows a "green" marker near it. spark_my is a copy-paste of the default spark interpreter with a few changes. I try to use spark_my in a notebook: %spark_my import java.nio.ByteBuffer more code Zeppelin shows

How are user jar conflicts resolved in the spark interpreter?

2017-11-15 Thread Serega Sheypak
Hi zeppelin users! I have a question about the dependencies users use while running notebooks with the spark interpreter. Imagine I have a configured spark interpreter. Two users write their spark notebooks. The first user does z.load("com:best-it-company:0.1"), the second user adds to his

Re: Can't use newly created interpreter, Zeppelin 0.7.2 paragraph_XXX's Interpreter spark_my not found

2017-10-24 Thread Serega Sheypak
il.com> wrote on Mon, Oct 23, 2017 at 7:08 PM: > >> >> Please bind this interpreter to your note first. >> >> Serega Sheypak <serega.shey...@gmail.com> wrote on Mon, Oct 23, 2017 at 6:14 PM: >> >>> Hi, I've created a spark interpreter with the name spark_my >>> I was able to restart