zztttt edited a comment on issue #4072:
URL: https://github.com/apache/hudi/issues/4072#issuecomment-981334669


   > sorry, I got it wrong add .config("spark.sql.extensions", 
"org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
   
   Thanks a lot! That solved the problem I ran into a few days ago. However, I now get a different exception.
   The absolute path of the new project is /home/zzt/code/spark-debug, and the details of the exception are as follows:
   
   Mon Nov 29 14:24:44 CST 2021 Thread[main,5,main] Cleanup action starting
   java.sql.SQLException: Failed to start database 'metastore_db' with class 
loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73bcd9b4, 
see the next exception for details.
        at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
        at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
        at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
   .......
   Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class 
loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73bcd9b4, 
see the next exception for details.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown 
Source)
        at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
        ... 100 more
   Caused by: ERROR XSDB6: Another instance of Derby may have already booted 
the database /home/zzt/code/spark-debug/metastore_db.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown 
Source)
        at org.apache.derby.iapi.error.StandardException.newException(Unknown 
Source)
        at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown
 Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown 
Source)
        at java.security.AccessController.doPrivileged(Native Method)
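
   For what it's worth, Derby error XSDB6 usually means the embedded metastore database is (or appears to be) locked by another JVM: Derby leaves a db.lck / dbex.lck file inside metastore_db, and that lock can also be left behind by a run that died uncleanly. Assuming no other Spark or Hive process is actually running against that directory, clearing the stale lock files is one way to recover (a sketch; the path is illustrative):

   ```shell
   # Sketch: recover from a stale embedded-Derby lock (XSDB6), assuming
   # no other JVM is actually using the metastore. Path is illustrative.
   METASTORE_DB=metastore_db   # relative to the Spark app working directory

   # Derby leaves db.lck / dbex.lck behind if a previous JVM died uncleanly.
   ls "$METASTORE_DB"/*.lck 2>/dev/null || true

   # Remove the stale locks so the next session can boot the database.
   rm -f "$METASTORE_DB"/db.lck "$METASTORE_DB"/dbex.lck
   ```

   If the lock files are legitimately held (e.g. a spark-shell is still open in the same directory), deleting them is unsafe; stop the other session instead.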
   
   I can find this log in derby.log under the project path. The new SparkSession 
is built as follows:
   val spark = SparkSession.builder
         .appName("Spark shell")
         .enableHiveSupport()
         .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
         .config("spark.sql.catalogImplementation", "hive")
         .config("spark.sql.warehouse.dir", "/home/zzt/code/spark-debug/spark-warehouse")
         .master("local[*]")
         .getOrCreate()
   I enabled .enableHiveSupport() in the builder. Without it, the program 
works fine. Is there some conflict?
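
   One hedged observation: with .enableHiveSupport() and no external Hive metastore configured, Spark falls back to an embedded Derby database in ./metastore_db, and embedded Derby allows only one JVM to hold that database at a time. So a second session started from the same working directory (or a leftover lock from a crashed run) produces exactly this XSDB6 error. A sketch of one workaround, assuming an embedded metastore is acceptable, is to give each run its own Derby location via the standard Hive property javax.jdo.option.ConnectionURL (the /tmp path below is illustrative, not from the original report):

   ```scala
   // Sketch, not the reporter's exact fix: point the embedded Derby metastore
   // at a per-application directory so concurrent local sessions don't collide
   // on the same ./metastore_db lock.
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder
         .appName("Spark shell")
         .enableHiveSupport()
         .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
         // Each run boots its own Derby database instead of sharing ./metastore_db:
         .config("spark.hadoop.javax.jdo.option.ConnectionURL",
           s"jdbc:derby:;databaseName=/tmp/metastore-${java.util.UUID.randomUUID()};create=true")
         .config("spark.sql.warehouse.dir", "/home/zzt/code/spark-debug/spark-warehouse")
         .master("local[*]")
         .getOrCreate()
   ```

   The trade-off is that each run then sees a fresh, empty metastore; if the catalog needs to persist across runs, pointing at a real external metastore is the more robust route.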


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
