[ https://issues.apache.org/jira/browse/SPARK-23473?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Goun Na updated SPARK-23473:
----------------------------
    Description: 
Spark errors when a Hive database name starts with a number, such as 11st. The full error message is attached, and the issue is reproducible:
------------------------------------------------------------------------------------------------------------------------------------
scala> spark.catalog.setCurrentDatabase("11st")
scala> spark.catalog.listTables
scala> spark.catalog.listTables
18/02/21 15:47:44 ERROR log: error in initSerDe: java.lang.ClassNotFoundException Class org.apache.hadoop.hive.contrib.serde2.RegexSerDe not found
java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.RegexSerDe not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:385)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
	at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getTableOption$1$$anonfun$apply$10.apply(HiveClientImpl.scala:365)


  was:
Errors when a Hive database name starts with a number, such as 11st. Full error message is attached.


> spark.catalog.listTables error when database name starts with a number
> ----------------------------------------------------------------------
>
>                 Key: SPARK-23473
>                 URL: https://issues.apache.org/jira/browse/SPARK-23473
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Goun Na
>            Priority: Trivial
>         Attachments: spark_catalog_err.txt
>
>
> Errors when a Hive database name starts with a number, such as 11st.
> The full error message is attached, and the issue is reproducible with the
> scala> steps and stack trace shown in the description above.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
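[Editor's sketch, not part of the original report] The ClassNotFoundException above names org.apache.hadoop.hive.contrib.serde2.RegexSerDe, which is shipped in Hive's hive-contrib jar rather than in the Hive classes Spark bundles, so one table in the database apparently uses that SerDe and listTables fails while fetching its columns. A possible workaround, assuming the jar path and version below (they vary by installation), is to put hive-contrib on the classpath when launching the shell:

```shell
# Hedged workaround sketch: make RegexSerDe resolvable to the Hive client.
# The jar location and version are assumptions; adjust to your Hive install.
spark-shell --jars /opt/hive/lib/hive-contrib-1.2.1.jar
```

With the SerDe class on the classpath, Table.getCols should be able to instantiate the deserializer; this works around the symptom only and does not address why listTables fails for the whole database when a single table's SerDe is missing.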