[
https://issues.apache.org/jira/browse/SPARK-35804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon resolved SPARK-35804.
----------------------------------
Resolution: Incomplete
> can't read external hive table on spark
> ---------------------------------------
>
> Key: SPARK-35804
> URL: https://issues.apache.org/jira/browse/SPARK-35804
> Project: Spark
> Issue Type: Bug
> Components: PySpark, Spark Core, Spark Shell
> Affects Versions: 2.3.2
> Environment: hdp 3.1.4
> I've tried both hive-hcatalog-core-3.1.0.3.1.4.0-315.jar and
> hive-hcatalog-core-3.1.2
>
> Reporter: cao zhiyu
> Priority: Major
> Labels: JSON, external-tables, hive, spark
>
> I created an external Hive table over an HDFS file whose records are JSON
> strings.
> From the Hive shell I can read the table's data fields through
> org.apache.hive.hcatalog.data.JsonSerDe, which is packaged in
> hive-hcatalog-core.jar.
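> The report does not include the table DDL; a minimal sketch of such a
> definition (the table name, columns, and HDFS location below are
> hypothetical, not taken from the report) would be:
>
> ```sql
> -- Hypothetical external table over JSON files on HDFS, parsed by the
> -- HCatalog JSON SerDe shipped in hive-hcatalog-core.jar
> CREATE EXTERNAL TABLE events (
>   id BIGINT,
>   payload STRING
> )
> ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
> STORED AS TEXTFILE
> LOCATION '/data/events';
> ```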
> But when I try to read it from Spark (pyspark, spark-shell, or any other
> entry point), it fails with an error:
> Table: Unable to get field from serde:
> org.apache.hive.hcatalog.data.JsonSerDe
> I copied hive-hcatalog-core.jar to $SPARK_HOME/jars and to the YARN library
> directory and reran the job, with no effect; passing
> --jars $jar_path/hive-hcatalog-core.jar did not help either. Yet when I
> browse the Spark application web UI, the jar does appear in the environment
> list.
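> For reference, the usual way to put a SerDe jar on both the driver and
> executor classpaths is a launch sketch like the following (the jar path is
> illustrative, not from the report; --jars and the extraClassPath settings
> are standard Spark options):
>
> ```shell
> # Hypothetical launch: make hive-hcatalog-core.jar visible to both the
> # driver and the executors. Adjust JAR to the actual location on your hosts.
> JAR=/usr/hdp/current/hive-client/lib/hive-hcatalog-core.jar
> pyspark \
>   --jars "$JAR" \
>   --conf spark.driver.extraClassPath="$JAR" \
>   --conf spark.executor.extraClassPath="$JAR"
> ```
>
> Inside an already-running session, Spark SQL's ADD JAR statement can also
> register the jar, though the classpath settings above are the more common
> fix for SerDe resolution on YARN.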
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]