QuintinTao opened a new issue #2322:
URL: https://github.com/apache/iceberg/issues/2322


   **I got an exception**
   Exception in thread "main" java.lang.IllegalArgumentException: Cannot initialize Catalog, missing no-arg constructor: org.apache.iceberg.hive.HiveCatalog
        at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:157)
        at org.apache.iceberg.CatalogUtil.buildIcebergCatalog(CatalogUtil.java:194)
        at org.apache.iceberg.spark.SparkCatalog.buildIcebergCatalog(SparkCatalog.java:100)
        at org.apache.iceberg.spark.SparkCatalog.initialize(SparkCatalog.java:380)
        at org.apache.spark.sql.connector.catalog.Catalogs$.load(Catalogs.scala:61)
        at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$catalog$1(CatalogManager.scala:52)
        at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
        at org.apache.spark.sql.connector.catalog.CatalogManager.catalog(CatalogManager.scala:52)
        at org.apache.iceberg.spark.source.IcebergSource.catalogAndIdentifier(IcebergSource.java:118)
        at org.apache.iceberg.spark.source.IcebergSource.getTable(IcebergSource.java:82)
        at org.apache.iceberg.spark.source.IcebergSource.inferPartitioning(IcebergSource.java:72)
        at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:82)
        at org.apache.spark.sql.streaming.DataStreamWriter.start(DataStreamWriter.scala:355)
        at com.funny.SparkIcebergApplication.createNewStreamingQuery(SparkIcebergApplication.scala:97)
        at com.funny.SparkIcebergApplication.run(SparkIcebergApplication.scala:103)
        at com.funny.SparkIcebergApplication$.main(SparkIcebergApplication.scala:45)
        at com.funny.SparkIcebergApplication.main(SparkIcebergApplication.scala)
   Caused by: java.lang.NoSuchMethodException: Cannot find constructor for interface org.apache.iceberg.catalog.Catalog
        Missing org.apache.iceberg.hive.HiveCatalog [java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/NoSuchObjectException]
        at org.apache.iceberg.common.DynConstructors$Builder.buildChecked(DynConstructors.java:227)
        at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:155)
        ... 16 more
   I think the reason is that CatalogUtil.java:157 can't get the ICEBERG_CATALOG_TYPE. The `Caused by` also shows a `NoClassDefFoundError` for `org/apache/hadoop/hive/metastore/api/NoSuchObjectException`, which suggests the Hive metastore classes are not on the classpath when a Hive catalog is being loaded.
   **but I did set it; here is my code**
   
     val sparkConf = new SparkConf()
     sparkConf
       .set("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
       .set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog")
       .set("spark.sql.catalog.hadoop_prod.type", "hadoop")
       .set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://ip:port/user/hive/")

     val spark = SparkSession.builder().appName("socketStreaming")
       .master(sparkConfig.sparkMaster)
       .config("spark.cleaner.ttl", sparkConfig.ttl)
       .config("spark.cleaner.referenceTracking.cleanCheckpoints", sparkConfig.cleanCheckpoints)
       .config("spark.ui.enabled", sparkConfig.sparkUIEnabled.toString)
       .config(sparkConf)
       .getOrCreate()
   
   val df = spark.readStream.format("kafka")
     .option("kafka.bootstrap.servers", "ip:port")
     .option("startingOffsets", "earliest")
     .option("partition.assignment.strategy", "roundrobin")
     .option("maxOffsetsPerTrigger", "100000")
     .option("failOnDataLoss", false)
     .option("subscribe", "forward")
     .load()
   
   
   df.writeStream.format("iceberg")
     .outputMode("append")
     .trigger(Trigger.ProcessingTime(1, TimeUnit.MINUTES))
     .option("path", "local.db3.table")
     .option("checkpointLocation", "/user/xxx")
     .start()
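   One detail that may matter: the write path references a catalog named `local`, but the `SparkConf` above only configures `hadoop_prod`. When a catalog has no `.type` set, `CatalogUtil.buildIcebergCatalog` appears to fall back to a Hive catalog, which would explain why `HiveCatalog` is being loaded despite `type=hadoop` being configured. A minimal sketch of a config where the names line up (the catalog name, warehouse path, and table name are placeholders taken from the snippet above, not verified values):

   ```scala
   import org.apache.spark.SparkConf

   // Sketch only: every property uses the SAME catalog name ("hadoop_prod"),
   // and the streaming sink's path references that same catalog.
   val conf = new SparkConf()
     .set("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
     .set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog")
     .set("spark.sql.catalog.hadoop_prod.type", "hadoop")   // explicit type avoids the Hive fallback
     .set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://ip:port/user/hive/")

   // Then write to a table in that catalog, e.g.:
   //   .option("path", "hadoop_prod.db3.table")   // was "local.db3.table"
   ```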

