CTTY commented on PR #9136:
URL: https://github.com/apache/hudi/pull/9136#issuecomment-1640605864

   It seems `validate-bundles(flink1.17, Spark 3.4, Spark 3.4.0)` consistently 
fails with JDK17 on the issue below:
   ```
   Connecting to jdbc:hive2://localhost:10000/default
   23/07/17 17:45:48 [main]: WARN jdbc.HiveConnection: Failed to connect to 
localhost:10000
   Could not open connection to the HS2 server. Please check the server URI and 
if the URI is correct, then ask the administrator to check the server status.
   Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection 
refused (Connection refused) (state=08S01,code=0)
   Cannot run commands specified using -e. No current connection
   Error: validate.sh HiveQL validation failed.
   Error: Process completed with exit code 1.
   ```
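   The "Connection refused" error means nothing was listening on port 10000 when 
beeline tried to connect, which usually indicates HiveServer2 never finished 
starting in the JDK17 container. A quick way to confirm is to poll the port 
before running the HiveQL validation (a hypothetical helper sketched below; 
`wait_for_hs2` is not part of validate.sh, and the host/port/timeout defaults 
are assumptions):

   ```shell
   #!/usr/bin/env bash
   # wait_for_hs2: poll host:port until it accepts TCP connections, or time out.
   # Hypothetical helper for diagnosing the HS2 startup race; not in validate.sh.
   wait_for_hs2() {
     local host="${1:-localhost}" port="${2:-10000}" timeout="${3:-60}"
     local start=$SECONDS
     # /dev/tcp is a bash feature: opening it attempts a TCP connection.
     while ! (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; do
       if (( SECONDS - start >= timeout )); then
         echo "HiveServer2 not reachable on ${host}:${port} after ${timeout}s" >&2
         return 1
       fi
       sleep 2
     done
     echo "HiveServer2 is accepting connections on ${host}:${port}"
   }
   ```

   Calling this (e.g. `wait_for_hs2 localhost 10000 120`) right before the 
beeline step would distinguish a slow HS2 startup from a hard JDK17 
incompatibility: a timeout points at HS2 itself failing to boot under JDK17 
rather than at the JDBC client.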
   
   Need to look into this; otherwise everything looks good. The newly added 
docker-test-java17 and test-spark-java17 jobs are working fine.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
