Ah yes, I forgot that you're using the 0.6.0 build. The guava jar was
missing in the 0.5.5 release.
On Wed, Apr 13, 2016 at 2:03 PM, Sanne de Roever
wrote:
> Rocking! Vincent's suggestion worked.
>
> I tried a %dep in the notebook first, but this did not work.
>
Easy work-around: in the $ZEPPELIN_HOME/interpreter/cassandra/lib folder, add
the guava-16.0.1.jar file and it's done.
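A sketch of that workaround as shell commands (the ZEPPELIN_HOME value and the jar's location are assumptions, not the poster's exact layout):

```shell
# Illustrative: stage guava-16.0.1.jar into the Cassandra interpreter's
# lib folder. Adjust ZEPPELIN_HOME and the jar path for your install.
ZEPPELIN_HOME="${ZEPPELIN_HOME:-$HOME/zeppelin}"
LIB_DIR="$ZEPPELIN_HOME/interpreter/cassandra/lib"
mkdir -p "$LIB_DIR"
# copy the jar if it is present in the current directory
if [ -f guava-16.0.1.jar ]; then
  cp guava-16.0.1.jar "$LIB_DIR/"
fi
echo "guava jar goes in: $LIB_DIR"
# afterwards, restart the interpreter so the jar is picked up, e.g.:
#   "$ZEPPELIN_HOME/bin/zeppelin-daemon.sh" restart
```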
On Wed, Apr 13, 2016 at 1:37 PM, vincent gromakowski <
vincent.gromakow...@gmail.com> wrote:
> It's not a configuration error but a well known conflict between guava 12
> in Spark
Rocking! Vincent's suggestion worked.
I tried a %dep in the notebook first, but this did not work.
The $ZEPPELIN_HOME/interpreter/cassandra directory does not have a lib folder,
but is itself filled with jars, among others guava-16.0.1.jar. No changes
necessary there, it seems.
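For reference, a %dep attempt like the one mentioned above would typically look like this (a sketch; the Guava coordinates are the ones from this thread, and note that %dep must run before the Spark interpreter has started, which may be why it had no effect):

```
%dep
z.reset()
z.load("com.google.guava:guava:16.0.1")
```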
On Wed, Apr 13, 2016 at 1:37 PM, vincent
Hi,
When I ran R notebook example, I got these errors in the logs:
- Caused by: org.apache.zeppelin.interpreter.InterpreterException:
sparkr is not responding
- Caused by: org.apache.thrift.transport.TTransportException
I did not configure SPARK_HOME so far, and intended to use the embedded
Is this a specific Docker decision or a Zeppelin-on-Docker decision? I am
curious about the amount of network traffic Zeppelin actually generates. I
could be wrong, but I made the assumption that most of the network traffic
with Zeppelin results from the various endpoints (Spark, JDBC, Elastic
It's a global decision on our SMACK stack platform, but maybe we will go
for applications only on Docker for devops (clients of Spark). For Zeppelin
I don't see the need (no devops).
On Apr 13, 2016 at 4:05 PM, "John Omernik" wrote:
> Is this a specific Docker decision or a
Can you post the full stacktrace you have (look also at the log file)?
Did you install R on your machine?
SPARK_HOME is optional.
On 13/04/16 15:39, Patcharee Thongtra wrote:
Hi,
When I ran R notebook example, I got these errors in the logs:
- Caused by:
Hi Scott,
Vendor-repo would be the way to go. It is possible that in this case CDH Spark
1.6 has some incompatible API changes, though I couldn't find them yet. Do you
have more from the logs on that NoSuchMethodException?
From: Scott Zelenka
Sent:
Hi,
I'm trying to build/install Zeppelin 0.6.0 (version 0.5.6 also has the
same symptoms) on a new CDH cluster running Hadoop 2.6.0-cdh5.7.0 and
Spark 1.6.0, but I'm getting this error when I use SPARK_HOME to point
to the /opt/cloudera/parcels/CDH/lib/spark directory in zeppelin-env.sh:
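The SPARK_HOME setting described here would typically be an export in conf/zeppelin-env.sh. A sketch (only the Cloudera parcel path is from this message; the MASTER line is an assumption for a YARN-based CDH cluster):

```shell
# conf/zeppelin-env.sh -- point Zeppelin at the CDH-provided Spark
export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
# run against YARN in client mode (an assumption, not from the thread)
export MASTER=yarn-client
```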