Hi Leela,
The Secret Access Key property name should be *fs.s3n.awsSecretAccessKey*, NOT *fs.s3.awsSecretAccessKey*.
The Access Key property name should be *fs.s3n.awsAccessKeyId*, NOT *fs.s3.awsAccessKeyId*.

Please try this:

hive> set fs.s3n.awsSecretAccessKey=zvpZS8KxMBLK0w73w8yhzWaW0Ove10Pk+fHeit/I;
hive> set fs.s3n.awsAccessKeyId=AKIAINIHCWDZMRWJEI3A;

Best Regards,
Sarath Kumar Sivan
Email: sarathkumarsi...@gmail.com

On Wed, Jul 26, 2017 at 2:38 AM, leela prasad Gorrepati <leelaprasad.gorrep...@gmail.com> wrote:

> Hi All,
>
> I am unable to create a Hive external table on an S3 location. I have
> followed the steps mentioned in
> https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely
>
> My Hive commands and the error received are:
>
> hive> set hadoop.socks.server=localhost:2600;
> hive> set hadoop.rpc.socket.factory.class.default=org.apache.hadoop.net.SocksSocketFactory;
> hive> set hadoop.job.ugi=root,root;
> hive> set mapred.map.tasks=40;
> hive> set mapred.reduce.tasks=-1;
> hive> set fs.s3.awsSecretAccessKey=zvpZS8KxMBLK0w73w8yhzWaW0Ove10Pk+fHeit/I;
> hive> set fs.s3.awsAccessKeyId=AKIAINIHCWDZMRWJEI3A;
> hive> create external table empl (id int, name string, location string)
>       location 's3://leela-2507-test/employee';
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID
> and Secret Access Key must be specified by setting the fs.s3.awsAccessKeyId
> and fs.s3.awsSecretAccessKey properties (respectively).)
>
> Anyone can try with the above valid access key; I created it for trial
> purposes, and the test data is in s3://leela-2507-test/employee.
>
> Configuring this in hive-site.xml also results in the same error.
>
> Any suggestions are appreciated.
>
> Thanks in advance.
>
> Regards,
> Leela Prasad
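
As a side note on the suggestion above: since the quoted message mentions that hive-site.xml was also tried, the persistent equivalent of the `set` commands would look roughly like the fragment below. This is a sketch, not a verified configuration for this cluster; note that the fs.s3n.* properties pair with the s3n:// URI scheme, so the table LOCATION would presumably need to use s3n:// rather than s3:// as well (the values shown are the ones already posted in this thread).

```xml
<!-- Hypothetical hive-site.xml fragment (sketch only).
     fs.s3n.* credentials apply to s3n:// locations, so the DDL
     would use e.g. LOCATION 's3n://leela-2507-test/employee'. -->
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>AKIAINIHCWDZMRWJEI3A</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>zvpZS8KxMBLK0w73w8yhzWaW0Ove10Pk+fHeit/I</value>
</property>
```

Restarting the Hive session after editing hive-site.xml is usually required for the new properties to take effect.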