I think results are stored in notebooks.
On May 6, 2016 09:01, "venkata srinivasarao kolla" wrote:
> Hi Team,
>
> At present in Zeppelin, whenever we run a query the results are
> stored only in memory.
> I did not find any Zeppelin API to persist these results to disk.
> Can some on
You are in the uncovered use case: there is no binding yet for the ES
interpreter, and the Spark connector doesn't use aggregation.
On Apr 18, 2016 12:36 PM, "Chirag Sharma" wrote:
> Hi there
>
> We want to do some manipulation over the data retrieved by elastic search
> before plotting graphs for it. E.g. there i
+1 for the optimization of the UI with the update after action. We have
some remote users over a laggy enterprise network...
2016-04-16 7:14 GMT+02:00 moon soo Lee :
> Hi John,
>
> Thanks for sharing your feedback.
> Could you share how big your 4 notes are?
>
> Laggy response might come from tw
heavy lifting type activities.
>
>
> John
> On Apr 12, 2016 5:03 PM, "vincent gromakowski" <
> vincent.gromakow...@gmail.com> wrote:
>
>> We decided not to use Docker for network performance in production
>> flows, not for deployment. Virtualisation of the
It's not a configuration error but a well-known conflict between Guava 12
in Spark and Guava 16 in the Spark Cassandra driver. You can find some
workarounds on the Spark Cassandra mailing list.
My workaround in Zeppelin is to load, in the Zeppelin dependency loader
(Spark interpreter config web page), the Guava
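A sketch of that dependency-loader approach from a notebook paragraph, using Zeppelin's `%dep` interpreter (it must run before the Spark interpreter starts; the exact Guava coordinates below are an assumption):

```
%dep
z.reset()
// Hypothetical coordinates: pick the Guava version the Cassandra driver needs
z.load("com.google.guava:guava:16.0.1")
```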
eed enterprise threshold, we can safely shutdown the
> instance of Zeppelin returning resources to the cluster.
>
> Would love discussion here...
>
> On Tue, Apr 12, 2016 at 1:57 AM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> 1. I am using ans
>>>> instance per user, that helps organize who has access to what (still
>>>> hashing out the details on that). Marathon on Mesos is how we ensure that
>>>> Zeppelin is actually available, and then when it comes to spark, we are
>>>> just submitting to Me
x broken things, so hope its stability isn't
> something to be concerned about.
>
> Regards,
> Ashish
>
> On Fri, Apr 8, 2016 at 12:20 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> Use the FUSE interface. The Gluster volume is directly ac
> implementation of storage interface? Or zeppelin can work with it, out of
> the box?
>
> Regards,
> Ashish
>
> On Wed, Apr 6, 2016 at 12:53 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> For 1, Marathon on Mesos restarts the Zeppelin daemon in case
For 1, Marathon on Mesos restarts the Zeppelin daemon in case of failure.
For 2, a GlusterFS FUSE mount allows sharing notebooks across all Mesos nodes.
For 3, it is not available right now in our design, but a manual restart in
the Zeppelin config page is acceptable for us.
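For context, a minimal Marathon app definition of the kind described for point 1 (the install path, resources, and port are assumptions; Marathon relaunches the task if it dies, and the health check lets it detect a hung daemon):

```json
{
  "id": "/zeppelin",
  "cmd": "/opt/zeppelin/bin/zeppelin.sh",
  "cpus": 2,
  "mem": 8192,
  "instances": 1,
  "ports": [8080],
  "healthChecks": [
    { "protocol": "HTTP", "path": "/", "portIndex": 0 }
  ]
}
```

Note `bin/zeppelin.sh` (foreground) rather than `zeppelin-daemon.sh`, so Marathon supervises the process directly.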
On Apr 6, 2016 8:18 AM, "Eran Witkon" wrote
I think Spark 2.0 will use Scala 2.11 by default
On Mar 29, 2016 9:02 PM, "vincent gromakowski" <
vincent.gromakow...@gmail.com> wrote:
> Spark can work with 2.11 but needs to be built from source
> On Mar 29, 2016 8:27 PM, "Peter DeNicola" <
> pete
Spark can work with 2.11 but needs to be built from source.
On Mar 29, 2016 8:27 PM, "Peter DeNicola" wrote:
> From what I understand, Spark isn't configured to work with Scala 2.11,
> only 2.10.x. At least, that was the trouble my team was running into when
> attempting to use spark wit
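The source build alluded to above (Spark 1.6 against Scala 2.11) usually looks like this; the script and flag names are as I recall them from the Spark build docs, so treat the exact invocation as an assumption:

```
# From the Spark source root: switch the POMs to Scala 2.11, then build.
./dev/change-scala-version.sh 2.11
build/mvn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package
```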
Hi
Clearly the dependency management should be clarified, because it's not
clear which method overrides which, especially when there are conflicts;
the order of libs in the classpath is important...
On Mar 9, 2016 03:04, "mina lee" wrote:
> Hi Chris,
>
> there are several ways to load dependenc
In my case I just need to edit the settings in the UI then save them. I agree
the behavior is strange.
On Mar 4, 2016 8:20 AM, "Zhong Wang" wrote:
> Seems I need to remove the dependency in UI, then save the configuration,
> then add it back again.
>
> Is there any more convenient way to do that?
>
> it's probably a problem with your classpath."?
>
>
> On Mon, Feb 29, 2016 at 5:19 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> Don't forget to do the import. If done it's probably a problem with your
>> classpath...
Don't forget to do the import. If done, it's probably a problem with your
classpath...
2016-02-29 15:03 GMT+01:00 Aleksandr Modestov :
> Hello all,
> There is a problem when I start to initialize H2OContext.
> Does anybody know the answer?
>
> java.lang.NoClassDefFoundError: water/api/HandlerFacto
>
>
> On Mon, Feb 29, 2016 at 3:43 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> Try to use the dependency loader in Spark interpreter configuration page.
>> I have encountered strange behaviors with spark.jars options...
>>
>> 2016
Try to use the dependency loader in Spark interpreter configuration page. I
have encountered strange behaviors with spark.jars options...
2016-02-29 13:35 GMT+01:00 Aleksandr Modestov :
> Hello!
> Excuse me, but it doesn't work...
> I open an interpreter window and create several additional lines
Sorry guys, I made a mistake in the Zeppelin configuration that removed
zeppelin-spark.jar from the classpath...
2016-02-29 12:07 GMT+01:00 vincent gromakowski <
vincent.gromakow...@gmail.com>:
> Hi all,
> I am getting this strange error
>
> java.lang.NoClassDefFoundError: Lo
Hi all,
I am getting this strange error
java.lang.NoClassDefFoundError: Lorg/apache/zeppelin/spark/ZeppelinContext;
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
at java.lang.Class.getDeclaredField(Class
:
> http://zeppelin.incubator.apache.org/docs/0.6.0-incubating-SNAPSHOT/rest-api/rest-notebook.html
>
>
>
> On Wed, Feb 24, 2016 at 4:22 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> Is there any way to run a paragraph and retrieve results in the
Is there any way to run a paragraph and retrieve the results in the response?
I have done some tests and cannot achieve it. It always sends back "status :
OK" even if it fails.
you go to
> interpreter menu, can you see the dependency you added in the GUI?
>
> Thanks,
> moon
>
> On Tue, Feb 23, 2016 at 11:41 AM vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> 1. Stop zeppelin
>> 2. Add a dependency in in
supposed to be loaded on launch.
> Could you double-check that interpreter.json is not read at Zeppelin launch?
> Or if it keeps happening, could you let me know how to reproduce?
>
> Thanks,
> moon
>
>
> On Tue, Feb 23, 2016 at 8:22 AM vincent gromakowski <
> vince
ttings web UI, edit the Spark interpreter and restart it...
2016-02-23 15:15 GMT+01:00 vincent gromakowski <
vincent.gromakow...@gmail.com>:
> Hi,
> I am trying to automatically add jars to the Spark interpreter with several
> methods but I cannot achieve it.
> I am currently generat
> `mvn clean package -DskipTests`
>
> Regards
>
> On Tue, Feb 23, 2016 at 2:30 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> Hi
>> Zeppelin 0.5.6 gives me an error on the Cassandra interpreter. Class not
>> found on Cassandra drive
Hi,
I am trying to automatically add jars to the Spark interpreter with several
methods but I cannot achieve it.
I am currently generating an interpreter.json file from Ansible templates
before launching Zeppelin in Marathon.
1. spark.jars
2. spark.driver.extraClassPath
3. groupArtifactVersion (depe
Hi
Zeppelin 0.5.6 gives me an error on the Cassandra interpreter. Class not
found on the Cassandra driver core: 2.1.7 is required but I cannot see it in
the classpath, both with binaries and built from source.
I would like to modify the shortcuts, especially "Alt Gr", because I cannot
use it anymore to write {, }, [, ] on a French keyboard; the shortcut
"Alt Gr" + 5 reduces the width of the paragraph...
vincent gromakowski <
vincent.gromakow...@gmail.com>:
> I have found the error comes from the spark-cassandra assembly jar that I
> have built from snapshot. I am going crazy with this connector...
>
> 2016-02-17 22:44 GMT+01:00 Felix Cheung :
>
>> It looks like the ja
ial 0.5.6 release?
>
>
>
>
>
> On Wed, Feb 17, 2016 at 10:53 AM -0800, "vincent gromakowski" <
> vincent.gromakow...@gmail.com> wrote:
>
> Hi all,
> I have built Zeppelin from source (master branch) but I cannot get it to
> work with Spark.
> Here is
Hi all,
I have built Zeppelin from source (master branch) but I cannot get it to work
with Spark.
Here is my build command :
mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6
And I get the following error with Spark 1.6 (pre-built 1.6 distribution
with Hadoop, from the Spark site):
java.lang.Securit
e you are on the same host, the websocket is probably
bypassing the proxy...
2015-12-19 11:27 GMT+01:00 vincent gromakowski <
vincent.gromakow...@gmail.com>:
> I am using Apache 2.4 on a different server than Zeppelin. I get a message
> in my Firefox saying it cannot connect to ws:/
I am using Apache 2.4 on a different server than Zeppelin. I get a message
in my Firefox saying it cannot connect to ws://host/ws.
Maybe an issue with the headers and origins?
On Dec 19, 2015 6:54 AM, "Hyung Sung Shim" wrote:
> Hello vincent gromakowski.
>
> What version
I am trying to make Zeppelin work behind an Apache reverse proxy that would
handle user authentication, but I have issues with the websocket.
Could you please provide some examples of Apache configuration files that
work as a reverse proxy?
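For what it's worth, a sketch of the kind of Apache 2.4 virtual host usually needed for this (module names are real; the server name, backend host, port, and paths are assumptions). The key point is that mod_proxy_wstunnel must be loaded so /ws is proxied as a websocket rather than plain HTTP:

```apache
# Requires: mod_proxy, mod_proxy_http, mod_proxy_wstunnel
<VirtualHost *:80>
    ServerName zeppelin.example.com

    # Websocket traffic first, so the catch-all rule below doesn't swallow it
    ProxyPass        /ws ws://zeppelin-host:8080/ws
    ProxyPassReverse /ws ws://zeppelin-host:8080/ws

    ProxyPass        / http://zeppelin-host:8080/
    ProxyPassReverse / http://zeppelin-host:8080/
</VirtualHost>
```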