From your attached log:

java.lang.NoSuchMethodError:
akka.actor.ActorContext.dispatcher()Lscala/concurrent/ExecutionContextExecutor;
at
org.apache.spark.storage.BlockManagerMasterActor.preStart(BlockManagerMasterActor.scala:63)

This kind of error usually occurs when dependency library versions don't
match.
Could you share your build command? Also, are you setting SPARK_HOME or
HADOOP_HOME in your conf/zeppelin-env.sh?
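For reference, rebuilding Zeppelin so its bundled Spark/Akka classes match the running cluster usually looks something like this (the profile and version numbers below are placeholders, not your actual values; substitute whatever your cluster runs):

```shell
# Hypothetical example: pin the Spark and Hadoop versions at build time so
# Zeppelin's interpreter links against the same Akka/Spark binaries as the
# cluster. Replace the profile and versions with the ones you actually use.
mvn clean package -Pspark-1.2 -Dspark.version=1.2.1 -Dhadoop.version=2.4.0 -DskipTests
```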

Thanks,
moon

On Mon, Mar 23, 2015 at 10:39 PM Nirav Mehta <[email protected]> wrote:

> Moon,
>
> Thanks for the quick response.
>
> Not sure how to interpret this error:
> ERROR [2015-03-23 13:35:53,149] ({pool-1-thread-3}
> ProcessFunction.java[process]:41) - Internal error processing open
> com.nflabs.zeppelin.interpreter.InterpreterException:
> java.lang.IllegalStateException: cannot create children while terminating
> or terminated
>
>  I've attached the log file for further reference.
>
> Thanks,
> Nirav
>
> On Sun, Mar 22, 2015 at 11:06 PM, moon soo Lee <[email protected]> wrote:
>
>> Hi,
>>
>> There's a log file that starts with 'zeppelin-interpreter-spark-*.log'
>> under the logs directory.
>> Could you check that file and see if any exceptions occurred?
>>
>> Thanks,
>> moon
>>
>> On Mon, Mar 23, 2015 at 10:36 AM Nirav Mehta <[email protected]>
>> wrote:
>>
>>> Hi,
>>>
>>> I'm trying to run Zeppelin over an existing Spark cluster.
>>>
>>> My zeppelin-env.sh has the entry:
>>> export MASTER=spark://spark:7077
>>>
>>> In the first paragraph, I executed bash commands:
>>> %sh
>>> hadoop fs -ls /user/root
>>>
>>> This returned:
>>> drwxr-xr-x - root supergroup 0 2015-01-15 09:05 /user/root/input
>>> -rw-r--r-- 3 root supergroup 29966462 2015-03-23 01:06
>>> /user/root/product.txt
>>>
>>> In the next paragraph, I executed the following:
>>> %spark
>>> val prodRaw = sc.textFile("hdfs://user/root/product.txt")
>>> prodRaw.count
>>>
>>> This doesn't return any result, or any errors on the console. Instead, I
>>> see a new context created every time I execute something:
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> ------ Create new SparkContext spark://spark:7077 -------
>>> ------ Create new SparkContext spark://spark:7077 -------
>>> ------ Create new SparkContext spark://spark:7077 -------
>>>
>>> Is this expected behavior? Seems like Zeppelin should be holding the
>>> context.
>>>
>>> Same issues when executing the sample notebook.
>>>
>>> Appreciate any help!
>>>
>>
>
