Yes, that's right. I hope we can consolidate those settings sensibly.

On Wed, Aug 3, 2016 at 10:58 PM, Jeff Zhang <zjf...@gmail.com> wrote:

> In some cases the Zeppelin-side configuration will override
> spark-defaults.conf, but if spark.master is set to yarn-cluster in
> spark-defaults.conf, then this kind of inconsistency will happen. I will
> fix this in https://issues.apache.org/jira/browse/ZEPPELIN-1263
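>
> To make the conflict concrete, a minimal sketch (the values shown are
> illustrative):
>
>     # $SPARK_HOME/conf/spark-defaults.conf
>     spark.master    yarn-cluster
>
>     # Zeppelin spark interpreter property (Interpreter menu in the UI)
>     master          yarn-client
>
> With a pair of settings like this, whichever side is applied last wins,
> which is why the behavior looks inconsistent.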
>
> On Wed, Aug 3, 2016 at 9:38 PM, Jongyoul Lee <jongy...@gmail.com> wrote:
>
>> In that case, the Zeppelin-side setting will override spark-defaults.conf.
>>
>> On Tue, Aug 2, 2016 at 11:17 AM, Jeff Zhang <zjf...@gmail.com> wrote:
>>
>>> Found the issue: it is caused by an inconsistency between spark.master
>>> in spark-defaults.conf and spark.master on the Zeppelin side.
>>>
>>> On Tue, Aug 2, 2016 at 9:20 AM, Jeff Zhang <zjf...@gmail.com> wrote:
>>>
>>>>
>>>> I followed the instructions in this PR to build Zeppelin and tried to
>>>> run it in yarn-client mode, but hit a very weird issue. It seems
>>>> SPARK_HOME is messed up. Has anyone hit this issue, or has anyone run
>>>> Zeppelin on Spark 2.0 in yarn-client mode successfully?
>>>>
>>>> https://github.com/apache/zeppelin/pull/1195
>>>>
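>>>> For reference, a typical build invocation for that branch looks
>>>> roughly like this (the exact profile names are my assumption here;
>>>> verify against the PR):
>>>>
>>>>     mvn clean package -DskipTests -Pspark-2.0 -Pscala-2.11 \
>>>>         -Phadoop-2.6 -Pyarn -Ppyspark
>>>>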
>>>> ERROR [2016-08-02 09:05:27,771] ({pool-4-thread-5}
>>>> Logging.scala[logError]:91) - Error initializing SparkContext.
>>>> java.lang.IllegalStateException: Library directory
>>>> '/Users/jzhang/Temp/hadoop_tmp/nm-local-dir/usercache/jzhang/appcache/application_1470097474471_0011/container_1470097474471_0011_01_000001/assembly/target/scala-2.11/jars'
>>>> does not exist; make sure Spark is built.
>>>>     at
>>>> org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
>>>>     at
>>>> org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
>>>>     at
>>>> org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
>>>>     at
>>>> org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:500)
>>>>     at
>>>> org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834)
>>>>     at
>>>> org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
>>>>     at
>>>> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>>>>     at
>>>> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
>>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
>>>>     at
>>>> org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
>>>>     at
>>>> org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
>>>>     at
>>>> org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
>>>>     at scala.Option.getOrElse(Option.scala:121)
>>>>     at
>>>> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:497)
>>>>     at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
>>>>     at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
>>>>
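>>>> The "Library directory ... does not exist; make sure Spark is built"
>>>> check fires when the Spark launcher cannot find a jars directory under
>>>> the SPARK_HOME the interpreter process sees; note that the path in the
>>>> trace above is under the YARN container's working directory, not a real
>>>> Spark install. A minimal conf/zeppelin-env.sh sketch (paths are
>>>> placeholders):
>>>>
>>>>     # must point at a built Spark 2.0 distribution containing jars/
>>>>     export SPARK_HOME=/opt/spark-2.0.0-bin-hadoop2.7
>>>>     # needed so yarn-client mode can locate the cluster
>>>>     export HADOOP_CONF_DIR=/etc/hadoop/conf
>>>>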
>>>>
>>>> --
>>>> Best Regards
>>>>
>>>> Jeff Zhang
>>>>
>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>
>>
>>
>> --
>> 이종열, Jongyoul Lee, 李宗烈
>> http://madeng.net
>>
>
>
>
> --
> Best Regards
>
> Jeff Zhang
>



-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
