This is the code: I created a Java class by extending
org.apache.hadoop.hive.ql.udf.generic.GenericUDTF,
and create a SparkSession as:

  SparkSession spark = SparkSession.builder()
      .enableHiveSupport()
      .master("yarn-client")
      .appName("SampleSparkUDTF_yarnV1")
      .getOrCreate();

and then try to rea
Why is the "spark.yarn.jars" property not read in this HDP 2.6, Spark 2.1.1
cluster?

  0: jdbc:hive2://localhost:1/db> set spark.yarn.jars;
  +--+--+
  | set
The property "spark.yarn.jars" is available via
/usr/hdp/current/spark2-client/conf/spark-default.conf:

  spark.yarn.jars hdfs://ambari03.fuzzyl.com:8020/hdp/apps/2.6.1.0-129/spark2
Is there any other way to set/read/pass this property "spark.yarn.jars" ?
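For reference, there are a few standard places this property can be set besides the conf file. A hedged sketch, not a verified fix for this cluster: the HDFS path is the one quoted above, and note that spark.yarn.jars accepts a comma-separated list of jar paths (globs allowed), so it typically ends in /*:

```shell
# Ways to pass spark.yarn.jars to the application (sketch; path is from this thread):

# 1. On the command line at submit time:
#      spark-submit --conf "spark.yarn.jars=hdfs://ambari03.fuzzyl.com:8020/hdp/apps/2.6.1.0-129/spark2/*" ...

# 2. In spark-defaults.conf (what is already done here):
#      spark.yarn.jars hdfs://ambari03.fuzzyl.com:8020/hdp/apps/2.6.1.0-129/spark2/*

# 3. Programmatically, before getOrCreate():
#      SparkSession.builder().config("spark.yarn.jars",
#          "hdfs://ambari03.fuzzyl.com:8020/hdp/apps/2.6.1.0-129/spark2/*")
```

In Spark 2.x there is also spark.yarn.archive, which points at a single archive of jars on HDFS and takes precedence over spark.yarn.jars when both are set.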
From: Su
3)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:170)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)
    ... 18 more
Is there a way to resolve this error?
On Wed, Jul 5, 2017 at 2:01 PM, Sudha KS
<sudha...@fuzzylogix.com> wrote:
The
While testing like this, it does not read the cluster's hive-site.xml or
spark-env.sh (those settings had to be passed in via
SparkSession.builder().config()).
Is there a way to make it read the Spark config already present on the cluster?
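One common reason for this: running the driver directly (e.g. from an IDE) bypasses the launcher scripts, which are what load the cluster config. A sketch of the usual workaround, assuming the standard HDP 2.6 client paths (the exact locations on this cluster are an assumption):

```shell
# Point the driver at the cluster's config directories before building the
# SparkSession, so spark-defaults.conf and the Hadoop/Hive XML files are found.
export SPARK_CONF_DIR=/usr/hdp/current/spark2-client/conf
export HADOOP_CONF_DIR=/usr/hdp/current/hadoop-client/conf

# Note: spark-submit sources $SPARK_CONF_DIR/spark-env.sh and loads
# spark-defaults.conf automatically; a driver launched outside spark-submit
# does not, which is why settings had to be passed via .config() here.
```

Placing (or symlinking) hive-site.xml into $SPARK_CONF_DIR is the usual way to get enableHiveSupport() to see the cluster's metastore without hard-coding values in code.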
From: Sudha KS
Sent: Wednesday, July 5, 2017 6:45 PM
To: user@spark.apache.org
Subject: RE
To set Spark2 as default, refer
https://www.cloudera.com/documentation/spark2/latest/topics/spark2_admin.html#default_tools
-----Original Message-----
From: Gaurav1809 [mailto:gauravhpan...@gmail.com]
Sent: Wednesday, September 20, 2017 9:16 AM
To: user@spark.apache.org
Subject: Cloudera - How t
The tables created are owned by the 'livy' user:

  drwxrwxrwx+  - livy hdfs  0 2017-12-18 09:38 /apps/hive/warehouse/dev.db/tbl

whether created via
1. df.write.saveAsTable()
2. spark.sql("CREATE TABLE tbl (key INT, value STRING)")

whereas a table created from the hive shell is owned by 'hive'/the proxy user:

  drwxrwxrwx+
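Ownership by 'livy' usually means the Livy server is not impersonating the end user, so everything it writes lands as the service account. A hedged sketch of the relevant settings (the property names are the standard Livy and Hadoop ones; the wildcard values are examples, not a recommendation for production):

```shell
# livy.conf -- have Livy run each session as the submitting user:
#     livy.impersonation.enabled = true

# core-site.xml -- allow the 'livy' service user to proxy other users:
#     <property><name>hadoop.proxyuser.livy.hosts</name><value>*</value></property>
#     <property><name>hadoop.proxyuser.livy.groups</name><value>*</value></property>
```

With impersonation enabled, both saveAsTable() and CREATE TABLE issued through Livy should produce warehouse directories owned by the proxied user, matching what the hive shell does.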