xyzhh opened a new issue, #11089:
URL: https://github.com/apache/dolphinscheduler/issues/11089

   ### Search before asking
   
- [X] I had searched in the [issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and found no similar issues.
   
   
   ### What happened
   
   The data quality task worked well when I configured the resource center to use HDFS, but it fails with the following error when I use MinIO instead of HDFS (the resource center itself still works fine with MinIO):
   
   [INFO] 2022-07-14 03:14:46.030 +0000 [taskAppId=TASK-20220714-6187199960928_3-3-5] TaskLogLogger-class org.apache.dolphinscheduler.plugin.task.dq.DataQualityTask:[63] -  -> 22/07/14 11:14:45 INFO yarn.Client: Application report for application_1657155379674_1551 (state: FINISHED)
        22/07/14 11:14:45 INFO yarn.Client: 
                 client token: N/A
                 diagnostics: User class threw exception: java.io.InterruptedIOException: doesBucketExist on dolphinscheduler: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
                at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:144)
                at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:336)
                at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:283)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2816)
                at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:98)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2853)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2835)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:387)
                at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
                at org.apache.spark.sql.execution.datasources.DataSource.planForWritingFileFormat(DataSource.scala:452)
                at org.apache.spark.sql.execution.datasources.DataSource.planForWriting(DataSource.scala:548)
                at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:278)
                at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
                at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
                at org.apache.spark.sql.DataFrameWriter.csv(DataFrameWriter.scala:642)
                at org.apache.dolphinscheduler.data.quality.flow.batch.writer.file.BaseFileWriter.outputImpl(BaseFileWriter.java:113)
                at org.apache.dolphinscheduler.data.quality.flow.batch.writer.file.HdfsFileWriter.write(HdfsFileWriter.java:40)
                at org.apache.dolphinscheduler.data.quality.execution.SparkBatchExecution.executeWriter(SparkBatchExecution.java:130)
                at org.apache.dolphinscheduler.data.quality.execution.SparkBatchExecution.execute(SparkBatchExecution.java:58)
                at org.apache.dolphinscheduler.data.quality.context.DataQualityContext.execute(DataQualityContext.java:62)
                at org.apache.dolphinscheduler.data.quality.DataQualityApplication.main(DataQualityApplication.java:70)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:688)
        Caused by: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
                at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:151)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1119)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:759)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:723)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
                at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
                at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
                at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4221)
                at com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:4982)
                at com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:4956)
                at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4205)
                at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4168)
                at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1306)
                at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1263)
                at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:324)
                ... 24 more
        Caused by: com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
                at com.amazonaws.auth.EC2CredentialsFetcher.handleError(EC2CredentialsFetcher.java:180)
                at com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:159)
                at com.amazonaws.auth.EC2CredentialsFetcher.getCredentials(EC2CredentialsFetcher.java:82)
                at com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:141)
                at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:129)
                ... 40 more
        Caused by: java.net.SocketTimeoutException: connect timed out
                at java.net.PlainSocketImpl.socketConnect(Native Method)
                at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
                at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
                at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
                at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
                at java.net.Socket.connect(Socket.java:589)
                at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
                at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
                at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
                at sun.net.www.http.HttpClient.<init>(HttpClient.java:242)
                at sun.net.www.http.HttpClient.New(HttpClient.java:339)
                at sun.net.www.http.HttpClient.New(HttpClient.java:357)
                at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1220)
                at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156)
                at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050)
                at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:984)
                at com.amazonaws.internal.ConnectionUtils.connectToEndpoint(ConnectionUtils.java:47)
                at com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:106)
                at com.amazonaws.internal.EC2CredentialsUtils.readResource(EC2CredentialsUtils.java:77)
                at com.amazonaws.auth.InstanceProfileCredentialsProvider$InstanceMetadataCredentialsEndpointProvider.getCredentialsEndpoint(InstanceProfileCredentialsProvider.java:156)
                at com.amazonaws.auth.EC2CredentialsFetcher.fetchCredentials(EC2CredentialsFetcher.java:121)
                ... 43 more
        
                 ApplicationMaster host: 192.168.15.17
                 ApplicationMaster RPC port: 0
                 queue: root.users.webdp
                 start time: 1657768444332
                 final status: FAILED
                 tracking URL: http://master:8088/proxy/application_1657155379674_1551/
                 user: webdp
        Exception in thread "main" org.apache.spark.SparkException: Application application_1657155379674_1551 finished with failed status
                at org.apache.spark.deploy.yarn.Client.run(Client.scala:1153)
                at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1568)
                at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:892)
                at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
                at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        22/07/14 11:14:45 INFO util.ShutdownHookManager: Shutdown hook called
        22/07/14 11:14:45 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-57caeffd-716e-4ab7-a6bb-f9f89f59a7ea
        22/07/14 11:14:45 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-56d7c97c-54e3-4b45-8a2e-7e2cbf802eb1
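   
   From the stack trace, the S3A client inside the Spark job finds no credentials and falls back to `InstanceProfileCredentialsProvider`, which then times out querying the EC2 instance-metadata endpoint (`SocketTimeoutException: connect timed out`). So the MinIO access key and secret key configured for the resource center apparently never reach the data quality job running on YARN. As a rough workaround sketch (not verified against the DolphinScheduler source; the endpoint and keys below are placeholders), passing the S3A settings to the job via `spark-submit` should let the credential lookup succeed, since Spark copies every `spark.hadoop.*` property into the job's Hadoop configuration:
   
        spark-submit \
          --conf spark.hadoop.fs.s3a.endpoint=http://minio-host:9000 \
          --conf spark.hadoop.fs.s3a.access.key=MINIO_ACCESS_KEY \
          --conf spark.hadoop.fs.s3a.secret.key=MINIO_SECRET_KEY \
          --conf spark.hadoop.fs.s3a.path.style.access=true \
          ...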
   
   
   
   ### What you expected to happen
   
   The data quality task should run normally when I use S3 (MinIO) as the resource storage.
   
   ### How to reproduce
   
   Run a data quality task with S3 (MinIO) configured as the resource storage.
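   
   For reference, the resource center is pointed at MinIO through `common.properties` with settings roughly like the following (the endpoint and keys are placeholders, not my real values):
   
        resource.storage.type=S3
        fs.defaultFS=s3a://dolphinscheduler
        fs.s3a.endpoint=http://minio-host:9000
        fs.s3a.access.key=MINIO_ACCESS_KEY
        fs.s3a.secret.key=MINIO_SECRET_KEY
   
   With this configuration, uploading and downloading resource files works; only the data quality task fails.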
   
   ### Anything else
   
   _No response_
   
   ### Version
   
   3.0.0-beta-2
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   

