sql paragraph doesn't see my 3rd party jars

2017-10-07 Thread Serega Sheypak
Hi, I'm trying to use %spark and %sql paragraphs with 3rd party jars added to
the Spark interpreter configuration.
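(For context, I add the jars as artifacts under the Spark interpreter's Dependencies section on the interpreter settings page; the entries below are placeholders, not my real values.)

/path/to/my-serde-and-proto.jar
com.my:container-proto:1.0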

My spark code works fine.


My %sql paragraph fails with a ClassNotFoundException:
%sql
create external table MY_TABLE
row format serde 'com.my.MyAvroSerde'
with serdeproperties ('serialization.class' = 'com.my.ContainerProto')
stored as inputformat 'com.my.ProtoAvroFileFormat'
LOCATION 'hdfs://my/data'

Exception:
MetaException(message:org.apache.hadoop.hive.serde2.SerDeException
java.lang.ClassNotFoundException: Class com.my.ContainerProto not found)


It's confusing, since the %spark paragraph works fine with code like this:

%spark
import com.my.ContainerProto
// bla-bla
rdd.map { bytes => ContainerProto.fromBytes(bytes) }

The code executes and produces a result. Why doesn't the %sql paragraph see my
3rd party jars?
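In case it matters, I have not yet tried registering the jar with the SQL session itself, i.e. something like this (the path is just a placeholder for my actual jar):

%sql
ADD JAR /path/to/my-serde-and-proto.jar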


Re: Trying to get 0.7.3 running with Spark

2017-10-07 Thread Jianfeng (Jeff) Zhang

Could you check the log again? There should be another exception above the one 
you pasted. Most likely the SparkContext failed to be created.
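The full trace should be in the Spark interpreter log on the Zeppelin host, typically a file along the lines of the one below (the exact name depends on your user, host and ZEPPELIN_LOG_DIR):

$ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-<user>-<host>.log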



Best Regards,
Jeff Zhang


From: Terry Healy
Reply-To: "users@zeppelin.apache.org"
Date: Friday, October 6, 2017 at 10:35 PM
To: "users@zeppelin.apache.org"
Subject: Trying to get 0.7.3 running with Spark

Using Zeppelin 0.7.3, Spark 2.1.0-mapr-1703 / Scala 2.11.8

I had previously run the demo and successfully set up the MongoDB and JDBC 
interpreters for Impala under v0.7.2. Since upgrading to 0.7.3, everything 
broke. I am down to a complete re-install (several, in fact) and get a response 
like the one below for almost everything I try. (Focusing just on %spark for 
now.) I apparently have something very basic wrong, but I'll be damned if I can 
find it. The same example works fine in spark-shell.

Any suggestions for a new guy very much appreciated.

I found [ZEPPELIN-2475] and [ZEPPELIN-1560], which seem to be the same or 
similar, but I did not understand what to change where.

This is from "Zeppelin Tutorial/Basic Features (Spark)".

java.lang.NullPointerException
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:398)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:387)
at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:843)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)