Can anyone help with what is wrong with this setup?

From: Awasthi, Somesh
Sent: Wednesday, September 18, 2024 1:34 PM
To: Ayush Saxena <ayush...@gmail.com>; d...@hive.apache.org
Cc: user@hive.apache.org; d...@iceberg.apache.org
Subject: RE: Hive 4 integration to store table on S3 and ADLS gen2

Any ideas? Please suggest.

From: Awasthi, Somesh
Sent: Wednesday, September 18, 2024 11:52 AM
To: Ayush Saxena <ayush...@gmail.com>; d...@hive.apache.org
Cc: user@hive.apache.org; d...@iceberg.apache.org
Subject: RE: Hive 4 integration to store table on S3 and ADLS gen2

Hi Ayush, thanks for your quick response.

Hadoop 3.3.6 is correct; what is wrong here?

How do I raise a bug for Hadoop? Could you please help here?

One more question.

How do I set up Hive 4 standalone with Iceberg support, with tables stored on S3?

Please point me to the proper steps and documentation.

Thanks for your support.

Thanks,
Somesh
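[Editor's sketch, not part of the thread: once the S3A filesystem itself works, the Iceberg side of the question above is small, because Hive 4 bundles the Iceberg storage handler. The table name and S3 path below are illustrative placeholders.]

```sql
-- Hive 4 ships the Iceberg storage handler; once s3a:// paths resolve,
-- an Iceberg table can be placed on S3 explicitly via LOCATION.
CREATE TABLE demo_ice (id INT, name STRING)
STORED BY ICEBERG
LOCATION 's3a://somesh.qa.bucket/warehouse/demo_ice';
```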

From: Ayush Saxena <ayush...@gmail.com>
Sent: Wednesday, September 18, 2024 11:41 AM
To: d...@hive.apache.org
Cc: user@hive.apache.org; d...@iceberg.apache.org; Awasthi, Somesh <soawas...@informatica.com>
Subject: Re: Hive 4 integration to store table on S3 and ADLS gen2


Hi Somesh,

> But while trying so we are seeing the following exception:
> hadoop fs -ls s3a://somesh.qa.bucket/

This has nothing to do with Hive as such. You have configured the Hadoop S3 client
incorrectly; you are missing configs. Your hadoop ls command itself is failing, so
there is no Hive involved here. You need to set up the FileSystem correctly.

This is a Hadoop problem. Reading the Hadoop S3A documentation [1] might help; if
you still face issues, you should ask on the Hadoop mailing lists, not Hive's.
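[Editor's sketch, not part of the thread: as a quick sanity check, the S3A connector's default credential chain also reads the standard AWS environment variables, which takes core-site.xml out of the picture entirely. The key values below are placeholders.]

```shell
# Placeholder credentials; export before running any hadoop fs command.
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"

# If this listing now works, the jars are fine and the problem is in core-site.xml.
hadoop fs -ls s3a://somesh.qa.bucket/
```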

-Ayush

[1] https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html

On Wed, 18 Sept 2024 at 11:12, Awasthi, Somesh <soawas...@informatica.com.invalid> wrote:
Hi Team,

I want to set up Hive 4 standalone with tables stored on S3 and ADLS Gen2.

Could you please share the proper steps and configurations required for this?

We are facing multiple issues with this; please help ASAP.

What we tried:

I am trying to configure AWS S3 with the Hadoop and Hive setup.
But while trying, we see the following exception when running
hadoop fs -ls s3a://somesh.qa.bucket/ :
Fatal internal error java.lang.RuntimeException: 
java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem 
not found
To resolve this I added hadoop-aws-3.3.6.jar and
aws-java-sdk-bundle-1.12.770.jar to the Hadoop classpath,
i.e. under /usr/local/hadoop/share/hadoop/common/lib.
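[Editor's sketch, not part of the thread: an alternative to copying jars, assuming the stock Hadoop 3.3.6 layout. Hadoop already ships hadoop-aws and the bundled AWS SDK under share/hadoop/tools/lib, and hadoop-env.sh can put them on the classpath.]

```shell
# In /usr/local/hadoop/etc/hadoop/hadoop-env.sh:
# pulls share/hadoop/tools/lib/hadoop-aws-*.jar and the bundled AWS SDK
# onto the classpath without copying jars into common/lib.
export HADOOP_OPTIONAL_TOOLS="hadoop-aws"
```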
And added the S3-related configuration to the core-site.xml file under the
/usr/local/hadoop/etc/hadoop directory:
<property>
  <name>fs.default.name</name>
  <value>s3a://somesh.qa.bucket</value>
</property>
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.us-west-2.amazonaws.com</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>{Access_Key_Value}</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>{Secret_Key_Value}</value>
</property>
<property>
  <name>fs.s3a.path.style.access</name>
  <value>false</value>
</property>
Now when we try hadoop fs -ls s3a://somesh.qa.bucket/
we observe the following exception:
2024-08-22 13:50:11,294 INFO impl.MetricsConfig: Loaded properties from 
hadoop-metrics2.properties
2024-08-22 13:50:11,376 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot 
period at 10 second(s).
2024-08-22 13:50:11,376 INFO impl.MetricsSystemImpl: s3a-file-system metrics 
system started
2024-08-22 13:50:11,434 WARN util.VersionInfoUtils: The AWS SDK for Java 1.x 
entered maintenance mode starting July 31, 2024 and will reach end of support 
on December 31, 2025. For more information, see 
https://aws.amazon.com/blogs/developer/the-aws-sdk-for-java-1-x-is-in-maintenance-mode-effective-july-31-2024/
You can print where on the file system the AWS SDK for Java 1.x core runtime is 
located by setting the AWS_JAVA_V1_PRINT_LOCATION environment variable or 
aws.java.v1.printLocation system property to 'true'.
This message can be disabled by setting the 
AWS_JAVA_V1_DISABLE_DEPRECATION_ANNOUNCEMENT environment variable or 
aws.java.v1.disableDeprecationAnnouncement system property to 'true'.
The AWS SDK for Java 1.x is being used here:
at java.lang.Thread.getStackTrace(Thread.java:1564)
at com.amazonaws.util.VersionInfoUtils.printDeprecationAnnouncement(VersionInfoUtils.java:81)
at com.amazonaws.util.VersionInfoUtils.<clinit>(VersionInfoUtils.java:59)
at com.amazonaws.internal.EC2ResourceFetcher.<init>(EC2ResourceFetcher.java:44)
at com.amazonaws.auth.InstanceMetadataServiceCredentialsFetcher.<init>(InstanceMetadataServiceCredentialsFetcher.java:38)
at com.amazonaws.auth.InstanceProfileCredentialsProvider.<init>(InstanceProfileCredentialsProvider.java:111)
at com.amazonaws.auth.InstanceProfileCredentialsProvider.<init>(InstanceProfileCredentialsProvider.java:91)
at com.amazonaws.auth.InstanceProfileCredentialsProvider.<init>(InstanceProfileCredentialsProvider.java:75)
at com.amazonaws.auth.InstanceProfileCredentialsProvider.<init>(InstanceProfileCredentialsProvider.java:58)
at com.amazonaws.auth.EC2ContainerCredentialsProviderWrapper.initializeProvider(EC2ContainerCredentialsProviderWrapper.java:66)
at com.amazonaws.auth.EC2ContainerCredentialsProviderWrapper.<init>(EC2ContainerCredentialsProviderWrapper.java:55)
at org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider.<init>(IAMInstanceCredentialsProvider.java:53)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProvider(S3AUtils.java:727)
at org.apache.hadoop.fs.s3a.S3AUtils.buildAWSProviderList(S3AUtils.java:659)
at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:585)
at org.apache.hadoop.fs.s3a.S3AFileSystem.bindAWSClient(S3AFileSystem.java:959)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:586)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3611)
at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3712)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3663)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:557)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:347)
at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:264)
at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:247)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:105)
at org.apache.hadoop.fs.shell.Command.run(Command.java:191)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:327)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:97)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:390)
ls: s3a://infa.qa.bucket/: 
org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials 
provided by TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider 
EnvironmentVariableCredentialsProvider IAMInstanceCredentialsProvider : 
com.amazonaws.SdkClientException: Unable to load AWS credentials from 
environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY 
(or AWS_SECRET_ACCESS_KEY))
2024-08-22 13:50:14,248 INFO impl.MetricsSystemImpl: Stopping s3a-file-system 
metrics system...
2024-08-22 13:50:14,248 INFO impl.MetricsSystemImpl: s3a-file-system metrics 
system stopped.
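[Editor's sketch, not part of the thread: the NoAuthWithAWSException above means every provider in the S3A credential chain was tried and none returned credentials. One way to narrow this down is to pin the chain to the config-based provider, so any remaining failure points directly at fs.s3a.access.key / fs.s3a.secret.key; the provider class is from the Hadoop S3A connector, and placing this in core-site.xml is an assumption about this setup.]

```xml
<!-- Use only the access/secret keys from configuration; skip env vars,
     temporary credentials, and the EC2 instance-metadata provider. -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
</property>
```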


Could you please help us resolve this issue as soon as possible?

Thanks,
Somesh
