jalousiex opened a new issue, #4038:
URL: https://github.com/apache/incubator-streampark/issues/4038

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-streampark/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### Java Version
   
   java version "1.8.0_311"
   Java(TM) SE Runtime Environment (build 1.8.0_311-b11)
   Java HotSpot(TM) 64-Bit Server VM (build 25.311-b11, mixed mode)
   
   ### Scala Version
   
   2.12.x
   
   ### StreamPark Version
   
   apache-streampark_2.12-2.1.4-incubating-bin.tar
   
   ### Flink Version
   
   flink 1.18 standalone session 1 master, 3 workers
   
   ```
   dinky-app-1.18-1.1.0-jar-with-dependencies.jar
   dinky-client-1.18-1.1.0.jar
   dinky-client-base-1.1.0.jar
   dinky-common-1.1.0.jar
   flink-cep-1.18.1.jar
   flink-connector-files-1.18.1.jar
   flink-connector-jdbc-3.2.0-1.18.jar
   flink-csv-1.18.1.jar
   flink-dist-1.18.1.jar
   flink-doris-connector-1.18-1.6.2.jar
   flink-json-1.18.1.jar
   flink-scala_2.12-1.18.1.jar
   flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
   flink-sql-connector-mysql-cdc-3.1.1.jar
   flink-table-api-java-uber-1.18.1.jar
   flink-table-planner_2.12-1.18.1.jar
   flink-table-runtime-1.18.1.jar
   log4j-1.2-api-2.17.1.jar
   log4j-api-2.17.1.jar
   log4j-core-2.17.1.jar
   log4j-slf4j-impl-2.17.1.jar
   mysql-connector-java-8.0.27.jar
   ojdbc8-23.2.0.0.jar
   paimon-flink-1.18-0.8.2.jar
   ```
   
   ### deploy mode
   
   None
   
   ### What happened
   
   Following [paimon's guide](https://paimon.apache.org/docs/0.8/flink/cdc-ingestion/mysql-cdc/#synchronizing-databases):

   ### I can run a Paimon action job successfully
   ```
   bin/flink run \
   -Dexecution.checkpointing.interval=10s \
   -Dexecution.checkpointing.num-retained=5 \
   -Dstate.checkpoints.num-retained=10 \
   -Dpipeline.name=sync-db-mysql-to-paimon-s3 \
   ../fjars/paimon-flink-action-0.8.2.jar \
   mysql_sync_database \
   --mysql-conf hostname=172.31.4.149 \
   --mysql-conf port=3306 \
   --mysql-conf username=testuser \
   --mysql-conf password=testuser \
   --mysql-conf database-name=testflink \
   --warehouse s3://flink/paimon/ \
   --catalog_conf s3.endpoint=https://ossapi-tst \
   --catalog_conf s3.access-key=***** \
   --catalog_conf s3.secret-key=***** \
   --catalog_conf s3.path.style.access=true \
   --database ods \
   --including_tables='o.*|product.?|shipments' \
   --table_prefix my_ \
   --table_suffix _001 \
   --table_conf source.checkpoint-align.enabled=true \
   --table_conf changelog-producer=input \
   --table_conf sink.parallelism=1
   ```
   
   ### But it fails in StreamPark when submitted as a Custom Code job
   
   **Upload Local Job** = paimon-flink-action-0.8.2.jar
   **Dependency Upload Jar** = flink-sql-connector-mysql-cdc-3.1.1.jar
   **Dynamic Properties**
   ```
   -Dexecution.checkpointing.interval=10s 
   -Dexecution.checkpointing.num-retained=5 
   -Dstate.checkpoints.num-retained=10 
   ```
   **Program Args**
   ```
   mysql_sync_database 
   --mysql-conf hostname=172.31.4.149 
   --mysql-conf port=3306 
   --mysql-conf username=testuser 
   --mysql-conf password=testuser 
   --mysql-conf database-name=testflink 
   --warehouse s3://flink/paimon/ 
   --catalog_conf s3.endpoint=https://ossapi-tst 
   --catalog_conf s3.access-key=*****
   --catalog_conf s3.secret-key=*****
   --catalog_conf s3.path.style.access=true 
   --database default 
   --including_tables=o.*|product.?|shipments 
   --table_prefix my_ 
   --table_suffix _001 
   --table_conf source.checkpoint-align.enabled=true 
   --table_conf changelog-producer=input 
   --table_conf sink.parallelism=1
   ```
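   One thing worth checking before submitting (an assumption on my side, not verified in this environment): Paimon resolves the `s3://` scheme through its own `paimon-s3` FileIO bundle jar, and when that jar is missing from the submit-side classpath, `FileIO.get` falls back to the Flink/Hadoop loaders, which is what the failure below shows. A minimal sketch of such a pre-submit check (the helper name and paths are illustrative):

   ```shell
   # Illustrative helper (hypothetical): report whether a Paimon s3 FileIO
   # bundle jar (e.g. paimon-s3-0.8.2.jar) is present in a given lib directory.
   check_paimon_s3() {
     dir="$1"
     if ls "$dir"/paimon-s3-*.jar >/dev/null 2>&1; then
       echo found
     else
       echo missing
     fi
   }

   # Example: point it at the Flink lib directory visible to the StreamPark host.
   check_paimon_s3 "${FLINK_HOME:-/opt/flink}/lib"
   ```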
   
   ### Error Exception
   
   ```log
   java.util.concurrent.CompletionException: 
java.lang.reflect.InvocationTargetException
        at 
java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
        at 
java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
        at 
java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1592)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.streampark.flink.client.FlinkClient$.$anonfun$proxy$1(FlinkClient.scala:87)
        at 
org.apache.streampark.flink.proxy.FlinkShimsProxy$.$anonfun$proxy$1(FlinkShimsProxy.scala:60)
        at 
org.apache.streampark.common.util.ClassLoaderUtils$.runAsClassLoader(ClassLoaderUtils.scala:38)
        at 
org.apache.streampark.flink.proxy.FlinkShimsProxy$.proxy(FlinkShimsProxy.scala:60)
        at 
org.apache.streampark.flink.client.FlinkClient$.proxy(FlinkClient.scala:82)
        at 
org.apache.streampark.flink.client.FlinkClient$.submit(FlinkClient.scala:53)
        at 
org.apache.streampark.flink.client.FlinkClient.submit(FlinkClient.scala)
        at 
org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.lambda$start$8(ApplicationServiceImpl.java:1655)
        at 
java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
        ... 3 more
   Caused by: java.lang.RuntimeException: 
   
   [flink-submit] Both JobGraph submit plan and Rest API submit plan all failed!
   JobGraph Submit plan failed detail:
   ------------------------------------------------------------------
   org.apache.flink.client.program.ProgramInvocationException: The main method 
caused an error: org.apache.paimon.fs.UnsupportedSchemeException: Could not 
find a file io implementation for scheme 's3' in the classpath.  
FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot 
access this path 's3://flink/paimon'.
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
        at 
org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
        at 
org.apache.flink.client.program.PackagedProgramUtils.getPipelineFromProgram(PackagedProgramUtils.java:158)
        at 
org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:82)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph(FlinkClientTrait.scala:252)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph$(FlinkClientTrait.scala:231)
        at 
org.apache.streampark.flink.client.impl.RemoteClient$.jobGraphSubmit(RemoteClient.scala:139)
        at 
org.apache.streampark.flink.client.impl.RemoteClient$.$anonfun$doSubmit$1(RemoteClient.scala:48)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.$anonfun$trySubmit$1(FlinkClientTrait.scala:207)
        at scala.util.Try$.apply(Try.scala:209)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.trySubmit(FlinkClientTrait.scala:205)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.trySubmit$(FlinkClientTrait.scala:201)
        at 
org.apache.streampark.flink.client.impl.RemoteClient$.doSubmit(RemoteClient.scala:49)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.submit(FlinkClientTrait.scala:123)
        at 
org.apache.streampark.flink.client.trait.FlinkClientTrait.submit$(FlinkClientTrait.scala:60)
        at 
org.apache.streampark.flink.client.impl.RemoteClient$.submit(RemoteClient.scala:34)
        at 
org.apache.streampark.flink.client.FlinkClientHandler$.submit(FlinkClientHandler.scala:40)
        at 
org.apache.streampark.flink.client.FlinkClientHandler.submit(FlinkClientHandler.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.streampark.flink.client.FlinkClient$.$anonfun$proxy$1(FlinkClient.scala:87)
        at 
org.apache.streampark.flink.proxy.FlinkShimsProxy$.$anonfun$proxy$1(FlinkShimsProxy.scala:60)
        at 
org.apache.streampark.common.util.ClassLoaderUtils$.runAsClassLoader(ClassLoaderUtils.scala:38)
        at 
org.apache.streampark.flink.proxy.FlinkShimsProxy$.proxy(FlinkShimsProxy.scala:60)
        at 
org.apache.streampark.flink.client.FlinkClient$.proxy(FlinkClient.scala:82)
        at 
org.apache.streampark.flink.client.FlinkClient$.submit(FlinkClient.scala:53)
        at 
org.apache.streampark.flink.client.FlinkClient.submit(FlinkClient.scala)
        at 
org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.lambda$start$8(ApplicationServiceImpl.java:1655)
        at 
java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   Caused by: java.io.UncheckedIOException: 
org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io 
implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot 
access this path. Hadoop FileSystem also cannot access this path 
's3://flink/paimon'.
        at 
org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:92)
        at 
org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:66)
        at 
org.apache.paimon.flink.FlinkCatalogFactory.createPaimonCatalog(FlinkCatalogFactory.java:80)
        at 
org.apache.paimon.flink.action.ActionBase.initPaimonCatalog(ActionBase.java:71)
        at org.apache.paimon.flink.action.ActionBase.<init>(ActionBase.java:58)
        at 
org.apache.paimon.flink.action.cdc.SynchronizationActionBase.<init>(SynchronizationActionBase.java:77)
        at 
org.apache.paimon.flink.action.cdc.SyncDatabaseActionBase.<init>(SyncDatabaseActionBase.java:61)
        at 
org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseAction.<init>(MySqlSyncDatabaseAction.java:108)
        at 
org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseActionFactory.createAction(MySqlSyncDatabaseActionFactory.java:52)
        at 
org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseActionFactory.createAction(MySqlSyncDatabaseActionFactory.java:31)
        at 
org.apache.paimon.flink.action.cdc.SynchronizationActionFactoryBase.create(SynchronizationActionFactoryBase.java:45)
        at 
org.apache.paimon.flink.action.cdc.SyncDatabaseActionFactoryBase.create(SyncDatabaseActionFactoryBase.java:44)
        at 
org.apache.paimon.flink.action.ActionFactory.createAction(ActionFactory.java:82)
        at 
org.apache.paimon.flink.action.FlinkActions.main(FlinkActions.java:38)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
        ... 33 more
   Caused by: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a 
file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader 
also cannot access this path. Hadoop FileSystem also cannot access this path 
's3://flink/paimon'.
        at org.apache.paimon.fs.FileIO.get(FileIO.java:420)
        at 
org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:89)
        ... 51 more
        Suppressed: java.nio.file.AccessDeniedException: s3://flink/paimon: 
org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials 
provided by DynamicTemporaryAWSCredentialsProvider 
TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider 
EnvironmentVariableCredentialsProvider IAMInstanceCredentialsProvider : 
com.amazonaws.SdkClientException: Unable to load AWS credentials from 
environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY 
(or AWS_SECRET_ACCESS_KEY))
                at 
org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:212)
                at 
org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:175)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3799)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3688)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$exists$34(S3AFileSystem.java:4703)
                at 
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499)
                at 
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:444)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2337)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2356)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.exists(S3AFileSystem.java:4701)
                at 
org.apache.flink.fs.s3hadoop.common.HadoopFileSystem.exists(HadoopFileSystem.java:165)
                at 
org.apache.paimon.flink.FlinkFileIO.exists(FlinkFileIO.java:100)
                at 
org.apache.paimon.fs.FileIOUtils.checkAccess(FileIOUtils.java:37)
                at org.apache.paimon.fs.FileIO.get(FileIO.java:388)
                ... 52 more
        Caused by: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS 
Credentials provided by DynamicTemporaryAWSCredentialsProvider 
TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider 
EnvironmentVariableCredentialsProvider IAMInstanceCredentialsProvider : 
com.amazonaws.SdkClientException: Unable to load AWS credentials from 
environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY 
(or AWS_SECRET_ACCESS_KEY))
                at 
org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:216)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:845)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:794)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
                at 
com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
                at 
com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
                at 
com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
                at 
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5456)
                at 
com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:6432)
                at 
com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:6404)
                at 
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5441)
                at 
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5403)
                at 
com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1372)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$10(S3AFileSystem.java:2545)
                at 
org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:414)
                at 
org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:377)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2533)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2513)
                at 
org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3776)
                ... 63 more
        Caused by: com.amazonaws.SdkClientException: Unable to load AWS 
credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) 
and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
                at 
com.amazonaws.auth.EnvironmentVariableCredentialsProvider.getCredentials(EnvironmentVariableCredentialsProvider.java:49)
                at 
org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:177)
                ... 84 more
        Suppressed: org.apache.hadoop.fs.UnsupportedFileSystemException: No 
FileSystem for scheme "s3"
                at 
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3443)
                at 
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3466)
                at 
org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
                at 
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3574)
                at 
org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3521)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:540)
                at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
                at 
org.apache.paimon.fs.hadoop.HadoopFileIO.createFileSystem(HadoopFileIO.java:175)
                at 
org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:168)
                at 
org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:145)
                at 
org.apache.paimon.fs.hadoop.HadoopFileIO.exists(HadoopFileIO.java:110)
                at 
org.apache.paimon.fs.FileIOUtils.checkAccess(FileIOUtils.java:37)
                at org.apache.paimon.fs.FileIO.get(FileIO.java:397)
                ... 52 more
   ```
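   Reading the innermost cause above: once the lookup falls through to Hadoop's S3A filesystem, credentials are taken from environment variables rather than from the `--catalog_conf s3.access-key`/`s3.secret-key` arguments. A hedged workaround (assuming the StreamPark console process is what performs job-graph generation) would be to export them on that host before starting the service; the values below are placeholders:

   ```shell
   # Placeholder credentials for illustration only - substitute real values and
   # restart the StreamPark console so the submitting JVM inherits them.
   export AWS_ACCESS_KEY_ID="<access-key>"
   export AWS_SECRET_ACCESS_KEY="<secret-key>"
   ```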
   
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes, I am willing to submit a PR! (Are you willing to contribute this PR?)
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.