Hi,
By "local Spark 2.0 instance", did you mean a standalone cluster on your
local machine? And did you update the "master" and "deploy-mode"
interpreter configuration?
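For reference, pointing Zeppelin's Spark interpreter at a local standalone cluster usually means properties along these lines (the host, port, and values below are illustrative, not taken from this thread):

```properties
# Zeppelin Spark interpreter properties (set in the interpreter settings UI)
master                   spark://localhost:7077   # standalone master URL (illustrative)
spark.submit.deployMode  client                   # Zeppelin runs the driver in client mode
```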
I had this problem as well, but I found it works if you set
"spark.executor.memory" to "512m" and make sure your machine has
sufficient free memory.
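Setting that in spark-defaults.conf (or as a Zeppelin interpreter property of the same name) would look like this; "512m" is just the value suggested above:

```properties
# conf/spark-defaults.conf
spark.executor.memory  512m
```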
Yes, I think the JRE is not able to pull dependencies. That's why I found
that the dependency configuration did not take effect.
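A quick way to check whether a full JDK (rather than only a JRE) is available is to look for javac on the PATH; this is a generic sketch, not a command from the thread:

```shell
#!/bin/sh
# javac ships with the JDK but not with a JRE, so its presence is a
# reasonable proxy for "a JDK is installed and on the PATH".
if command -v javac >/dev/null 2>&1; then
  echo "JDK found"
else
  echo "No javac on PATH: only a JRE may be installed"
fi
```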
On Fri, Oct 14, 2016 at 2:50 PM Jongyoul Lee wrote:
> Hi,
>
> Did you solve it by installing JDK?
>
> On Thu, Oct 13, 2016 at 8:44 PM, Xi Shen
Hi,
Thanks for your kind answers :)
I downloaded the file from the URL you linked and tested it by running
"bin/zeppelin.cmd" in a cmd window, but I could not reproduce your problem.
Do you have any other error logs?
The "logs" folder is under Zeppelin's home directory.
If not, when you
Hi, thanks for the response. I am using the pre-built binary for Windows
with all interpreters; I did not build it myself.
The link I downloaded it from is:
http://www-eu.apache.org/dist/zeppelin/zeppelin-0.6.1/zeppelin-0.6.1-bin-all.tgz
Let me know if there is any additional information that I can provide.
Hi,
Did you solve it by installing JDK?
On Thu, Oct 13, 2016 at 8:44 PM, Xi Shen wrote:
> Turns out I needed the JDK, but I had only installed the JRE...
>
> On Wed, Oct 12, 2016 at 3:25 PM Xi Shen wrote:
>
>> Here's the log