I don't see spark-libs.jar under $KYLIN_HOME/spark/jars

per this doc: http://kylin.apache.org/docs21/tutorial/cube_spark.html
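
Re-reading the tutorial, it looks like spark-libs.jar is not shipped; the doc
has you build it from the jars directory and upload it to HDFS. A minimal
sketch of the doc's steps (the HDFS path is the doc's example; the namenode
address is a placeholder for your cluster):

jar cv0f spark-libs.jar -C $KYLIN_HOME/spark/jars/ .
hadoop fs -mkdir -p /kylin/spark/
hadoop fs -put spark-libs.jar /kylin/spark/

# then point kylin.properties at the uploaded archive (placeholder host):
kylin.engine.spark-conf.spark.yarn.archive=hdfs://<namenode>:8020/kylin/spark/spark-libs.jar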

On Wed, Feb 28, 2018 at 10:30 AM, Sonny Heer <[email protected]> wrote:

> Hi Billy
> Looks like the current error is this:
>
> Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster
>
> End of LogType:stderr
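>
> My guess is the YARN containers can't see the Spark YARN classes, i.e. the
> spark-libs.jar archive is missing or incomplete. A quick sanity check
> (sketch; the paths assume the cube_spark doc's layout):
>
> jar tf spark-libs.jar | grep deploy/yarn/ApplicationMaster
> hadoop fs -ls /kylin/spark/spark-libs.jar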
>
> Thanks
>
> On Wed, Feb 28, 2018 at 8:04 AM, Billy Liu <[email protected]> wrote:
>
>> Any exception in logs?
>>
>> With Warm regards
>>
>> Billy Liu
>>
>>
>> 2018-02-28 22:53 GMT+08:00 Sonny Heer <[email protected]>:
>> > Anyone know what I need to set in order for spark-submit to use the HDP
>> > version of spark and not the internal one?
>> >
>> > Currently I see:
>> >
>> > export HADOOP_CONF_DIR=/ebs/kylin/hadoop-conf &&
>> > /ebs/kylin/apache-kylin-2.2.0-bin/spark/bin/spark-submit
>> >
>> >
>> > I see this in the kylin.properties file:
>> > ## Spark conf (default is in spark/conf/spark-defaults.conf)
>> >
>> > Although it doesn't say how I can change this to use the HDP spark-submit.
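>> >
>> > Is it just a matter of exporting SPARK_HOME before starting Kylin? My
>> > understanding (untested; the HDP client path is an assumption) is that
>> > Kylin falls back to $KYLIN_HOME/spark only when SPARK_HOME is unset:
>> >
>> > export SPARK_HOME=/usr/hdp/current/spark-client
>> > $KYLIN_HOME/bin/kylin.sh start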
>> >
>> > Also, HDP is on Spark 1.6.1 while Kylin internally ships Spark 2.x. Not
>> > sure if that matters during submit. I can't seem to get more than 2
>> > executors to run without the job failing with other errors. We have
>> > about 44 slots on our cluster.
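>> >
>> > For the executor count, I assume the spark-conf overrides in
>> > kylin.properties are the knob to turn, along the lines of the doc's
>> > example (values below are placeholders, not tuned for our cluster):
>> >
>> > kylin.engine.spark-conf.spark.executor.instances=8
>> > kylin.engine.spark-conf.spark.executor.memory=4G
>> > kylin.engine.spark-conf.spark.executor.cores=2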
>> >
>> > I also uncommented the HDP section:
>> > ## uncomment for HDP
>> >
>> > kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
>> > kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
>> > kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
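>> >
>> > As I understand it, Kylin strips the kylin.engine.spark-conf. prefix and
>> > hands each entry to spark-submit as a --conf flag, so the above should
>> > end up as something like (reconstructed by hand, not an actual log):
>> >
>> > spark-submit --conf spark.driver.extraJavaOptions=-Dhdp.version=current \
>> >   --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current \
>> >   --conf spark.executor.extraJavaOptions=-Dhdp.version=current ...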
>> >
>> > see attached for other properties set.
>>
>
>
