codope commented on issue #6808:
URL: https://github.com/apache/hudi/issues/6808#issuecomment-1263311923
This could be related to Hive itself. Can you add this to your script and check whether the metastore configs are being passed correctly?
`spark.sql("set").filter("key rlike 'metastore|jdo'").show(1000, False)`
I see that you have already turned off both schema verification and schema version verification; that should have taken effect.
Also, you need to set the Derby JDBC driver:
```
"spark.hadoop.javax.jdo.option.ConnectionDriverName" =
"org.apache.derby.jdbc.EmbeddedDriver",
"spark.hadoop.javax.jdo.option.ConnectionURL" =
"jdbc:derby:memory:myInMemDB;create=true"
```
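For reference, a minimal sketch (plain Python, runnable without Spark) of how the two Derby settings above could be collected and rendered as `spark-submit` flags; the key/value strings come from the configs above, everything else is illustrative:

```python
# The Derby driver settings from the comment above, as a plain dict.
derby_confs = {
    "spark.hadoop.javax.jdo.option.ConnectionDriverName":
        "org.apache.derby.jdbc.EmbeddedDriver",
    "spark.hadoop.javax.jdo.option.ConnectionURL":
        "jdbc:derby:memory:myInMemDB;create=true",
}

# One way to apply them: render each entry as a --conf flag for spark-submit.
flags = [f"--conf {key}={value}" for key, value in derby_confs.items()]
print("\n".join(flags))
```

The same dict could equally be passed to `SparkSession.builder.config(...)` in a PySpark script.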
If that does not help, can you run sync in `jdbc` mode? Just set the following
configs:
```
"hoodie.datasource.hive_sync.use_jdbc": "true"
"hoodie.datasource.hive_sync.mode": "jdbc"
```
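As a sketch of how the JDBC sync settings above might be wired into a Hudi writer options dict (plain Python; the table name and the `hive_sync.enable` flag here are illustrative additions, not from the comment above):

```python
# Hudi writer options with the JDBC sync configs from above merged in.
hudi_options = {
    "hoodie.table.name": "my_table",                 # hypothetical table name
    "hoodie.datasource.hive_sync.enable": "true",    # assumed: sync is enabled
    "hoodie.datasource.hive_sync.use_jdbc": "true",
    "hoodie.datasource.hive_sync.mode": "jdbc",
}

# In a PySpark job these would typically be applied as, e.g.:
#   df.write.format("hudi").options(**hudi_options).mode("append").save(base_path)
print(sorted(hudi_options))
```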
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]