Hey Rohit,

The non-VB env should be the same as a VB env, only "dirtier": to your
point, things like pre-existing Grunt installations. I think it would be
more useful for me to spin up another VirtualBox, try to recreate the
issues people were having, and then find workarounds. Can you send me
links to the specific threads where people were having issues, please?
I'll try to work on it this weekend.

Also, did anyone ever write a way to load Flink dependencies within
Zeppelin?  I think I saw a JIRA ticket for it. I only ask because that's a
much higher priority for the stuff I'm doing at work.
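
For context, what I'm imagining is something analogous to the %dep loader
that the Spark interpreter already has, where you can pull in artifacts at
runtime from the first paragraph of a note. A rough sketch of what that
might look like for Flink (the artifact coordinates here are just an
illustrative guess on my part, not a working recipe):

    %dep
    z.reset()                                   // clear any previously loaded dependencies
    z.load("org.apache.flink:flink-ml:0.10.0")  // hypothetical Flink artifact to pull from Maven

Something along those lines for the Flink interpreter would cover my use case.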

I should probably jump on the dev mailing list...
tg



Trevor Grant
Data Scientist
https://github.com/rawkintrevo
http://stackexchange.com/users/3002022/rawkintrevo

*"Fortunate is he, who is able to know the causes of things."  -Virgil*


On Tue, Nov 3, 2015 at 4:04 AM, rohit choudhary <rconl...@gmail.com> wrote:

> Hi Trevor,
>
> This could be useful for a lot of people starting off on Zeppelin. Could
> you also include installation in a non-VB env? Also, a lot of users have
> complained about the zeppelin-web folder causing issues because of
> phantomjs; did you encounter those issues? Or any with Grunt? It would be
> useful to see the output of './grunt build', since it seems you were
> able to get it working on Ubuntu without issues.
>
> Thanks,
> Rohit.
>
> On Tue, Nov 3, 2015 at 10:55 AM, Trevor Grant <trevor.d.gr...@gmail.com>
> wrote:
>
>> I wrote a tutorial on setting up Zeppelin with Flink and Spark in cluster
>> mode, which was a hell of a lot more difficult than this post makes it
>> seem.
>>
>> I'm willing to rewrite it into a legit tutorial if there is any interest.
>>
>> http://trevorgrant.org/2015/11/03/apache-casserole-a-delicious-big-data-recipe-for-the-whole-family/
>>
>> Note: I changed the title but couldn't figure out how to change the URL,
>> in case anyone wanted to flame me for it.
>>
>> It runs against Flink 0.10 (Till Rohrmann's branch) and Spark 1.4. It
>> would probably help if Till's support for Flink 0.10 were merged back in;
>> then I could probably also get it working against Spark 1.5 (or 1.6). As
>> it was, those two (Spark 1.5 and 1.6) were what really killed me over the
>> last few days; I finally gave up and just used 1.4, which worked well.
>>
>> Great work all, and thanks.
>> tg
>>
>>
>> Trevor Grant
>> Data Scientist
>> https://github.com/rawkintrevo
>> http://stackexchange.com/users/3002022/rawkintrevo
>>
>> *"Fortunate is he, who is able to know the causes of things."  -Virgil*
>>
>>
>
