Please use the vendor's forum.
Thanks
-------- Original message --------
From: Sonny Heer <[email protected]>
Date: 2/28/18 2:35 PM (GMT-08:00)
To: [email protected]
Subject: Re: running spark on kylin 2.2
So when I run it with just spark-submit it gets further, but now there is an API difference.  Does Kylin 2.2 work with Spark 1.6.1?  That is the version that comes with HDP 2.4.2.0-258.

ERROR:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.table(Ljava/lang/String;)Lorg/apache/spark/sql/Dataset;
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:167)

It appears it only supports Spark 2.x?  Please advise what we can do to make this work on HDP 2.4...
Thanks
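A quick way to see which Spark each path resolves to (the HDP path below is an assumption based on the usual /usr/hdp/current symlink layout):

```
# Kylin's bundled Spark (Spark 2.x for Kylin 2.2)
$KYLIN_HOME/spark/bin/spark-submit --version

# HDP 2.4's cluster Spark (1.6.1)
/usr/hdp/current/spark-client/bin/spark-submit --version
```

The NoSuchMethodError above is consistent with the two disagreeing: HiveContext.table returns a DataFrame in Spark 1.6, while in 2.x DataFrame is an alias for Dataset[Row], so Kylin bytecode built against 2.x will not link against a 1.6 runtime.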
On Wed, Feb 28, 2018 at 2:07 PM, Sonny Heer <[email protected]> wrote:
I don't see spark-libs.jar under $KYLIN_HOME/spark/jars
per this doc: http://kylin.apache.org/docs21/tutorial/cube_spark.html
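For what it's worth, that tutorial has you build spark-libs.jar by hand rather than shipping it with Kylin; a sketch of that step (the HDFS path is the doc's example and may differ on your cluster):

```
# Package the bundled Spark jars once and put them on HDFS,
# so YARN containers don't re-upload them on every job
jar cv0f spark-libs.jar -C $KYLIN_HOME/spark/jars/ .
hadoop fs -mkdir -p /kylin/spark/
hadoop fs -put spark-libs.jar /kylin/spark/
```

and then in kylin.properties point spark.yarn.archive at the result, e.g. kylin.engine.spark-conf.spark.yarn.archive=hdfs://namenode:8020/kylin/spark/spark-libs.jar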
On Wed, Feb 28, 2018 at 10:30 AM, Sonny Heer <[email protected]> wrote:
Hi Billy,

Looks like the current error is this:

Error: Could not find or load main class 
org.apache.spark.deploy.yarn.ApplicationMaster
End of LogType:stderr
Thanks
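On HDP, this ApplicationMaster error usually means YARN could not expand ${hdp.version} on the container classpath. Passing it explicitly from kylin.properties (the same lines uncommented elsewhere in this thread) is the usual fix; "current" assumes the standard HDP symlinks:

```
kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
```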
On Wed, Feb 28, 2018 at 8:04 AM, Billy Liu <[email protected]> wrote:
Any exception in logs?

With Warm regards

Billy Liu

2018-02-28 22:53 GMT+08:00 Sonny Heer <[email protected]>:

> Anyone know what I need to set in order for spark-submit to use the HDP
> version of Spark and not the internal one?
>
> currently i see:
>
> export HADOOP_CONF_DIR=/ebs/kylin/hadoop-conf &&
> /ebs/kylin/apache-kylin-2.2.0-bin/spark/bin/spark-submit
>
> I see in the kylin.properties file:
> ## Spark conf (default is in spark/conf/spark-defaults.conf)
>
> Although it doesn't show how I can change this to use the HDP spark-submit.
>
> Also HDP is on the 1.6.1 version of Spark and Kylin internally uses 2.x.  Not
> sure if that matters during submit.  I can't seem to get more than 2
> executors to run without it failing with other errors.  We have about 44
> slots on our cluster.
>
> Also uncommented:
> ## uncomment for HDP
> kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
> kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
> kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
>
> see attached for other properties set.
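On the two-executor limit: any spark.* property can be handed to the job with the kylin.engine.spark-conf. prefix, so the executor count is settable directly in kylin.properties. A sketch with illustrative values only, not tuned for this cluster:

```
kylin.engine.spark-conf.spark.executor.instances=40
kylin.engine.spark-conf.spark.executor.memory=4G
kylin.engine.spark-conf.spark.executor.cores=1
```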