[
https://issues.apache.org/jira/browse/HCATALOG-553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Arup Malakar updated HCATALOG-553:
----------------------------------
Assignee: (was: Arup Malakar)
> Dynamic partitioning to viewfs location on non-default name node fails
> ----------------------------------------------------------------------
>
> Key: HCATALOG-553
> URL: https://issues.apache.org/jira/browse/HCATALOG-553
> Project: HCatalog
> Issue Type: Sub-task
> Reporter: Arup Malakar
> Labels: namenode_federation
> Attachments: HCATALOG-553-branch-0.patch
>
>
> [1] Create a partitioned table whose location is a viewfs path backed by a non-default namenode:
> {code}
> CREATE TABLE student (
>   name string,
>   age int
> )
> PARTITIONED BY (gpa string)
> STORED AS SequenceFile
> LOCATION "viewfs:///database/table";
> {code}
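> For context, a quick sketch of how that viewfs path is expected to resolve: the client-side mount table maps /database onto the non-default namenode, and resolvePath() exposes the backing hdfs:// URI. The mount-table property value and hostname below are hypothetical, used only to illustrate the setup:
> {code}
> import java.net.URI;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class ResolveViewfsPath {
>   public static void main(String[] args) throws Exception {
>     Configuration conf = new Configuration();
>     // Illustrative mount-table entry (names are hypothetical); the real cluster
>     // config is what maps /database onto the non-default namenode:
>     // conf.set("fs.viewfs.mounttable.default.link./database",
>     //          "hdfs://nn2.example.com:8020/database");
>     FileSystem viewFs = FileSystem.get(URI.create("viewfs:///"), conf);
>     // resolvePath() follows the mount table down to the backing file system,
>     // so the result carries the hdfs:// authority of the non-default namenode.
>     Path resolved = viewFs.resolvePath(new Path("/database/table"));
>     System.out.println(resolved);
>   }
> }
> {code}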
> [2] Now try dynamic partitioning. Run pig -useHCatalog with the following script:
> {code}
> A = load '/user/hadoopqa/hcatalog/tests/data/txt/studenttab10k.txt'
>     using PigStorage() as (name: chararray, age: int, gpa: chararray);
> B = filter A by ((gpa > '0.00') AND (gpa <= '2.00'));
> store B into 'default.student' using org.apache.hcatalog.pig.HCatStorer();
> {code}
> Here the location viewfs:///database/ resolves to an HDFS location on the
> non-default namenode.
> The exception is:
> {code}
> 2012-11-06 19:33:19,352 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: AttemptID:attempt_1348522594824_0939_m_000000_3 Info:Error: java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "gsbl90385.blue.ygrid.yahoo.com/98.137.112.165"; destination host is: "gsbl90898.blue.ygrid.yahoo.com":8020;
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:738)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1092)
>     at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:195)
>     at $Proxy8.mkdirs(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:102)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:67)
>     at $Proxy8.mkdirs(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:1722)
>     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1693)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:479)
>     at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:248)
>     at org.apache.hadoop.fs.viewfs.ChRootedFileSystem.mkdirs(ChRootedFileSystem.java:221)
>     at org.apache.hadoop.fs.viewfs.ViewFileSystem.mkdirs(ViewFileSystem.java:374)
>     at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1813)
>     at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.setupJob(FileOutputCommitter.java:284)
>     at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:131)
>     at org.apache.hcatalog.mapreduce.FileRecordWriterContainer.write(FileRecordWriterContainer.java:209)
>     at org.apache.hcatalog.mapreduce.FileRecordWriterContainer.write(FileRecordWriterContainer.java:52)
>     at org.apache.hcatalog.pig.HCatBaseStorer.putNext(HCatBaseStorer.java:235)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
>     at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:598)
>     at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMapOnly.java:48)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(PigGenericMapBase.java:273)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:266)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:64)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1212)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
> Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:534)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1212)
>     at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:498)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:582)
>     at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:205)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1198)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1068)
>     ... 38 more
> Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:194)
>     at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:137)
>     at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:406)
>     at org.apache.hadoop.ipc.Client$Connection.access$1200(Client.java:205)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:575)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:572)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1212)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:571)
>     ... 41 more
> Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>     at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:130)
>     at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:106)
>     at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:172)
>     at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:209)
>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:195)
>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:162)
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:175)
>     ... 50 more
> 2012-11-06 19:33:19,353 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
> {code}
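> The trace shows the map task's RPC client trying to authenticate to the non-default namenode with Kerberos and finding no TGT, which points at a missing delegation token: at submission time tokens were apparently fetched only for the default file system, not for the namenode behind the viewfs mount. Below is a minimal sketch of the usual remedy, assuming the output location is known at job-submission time; the class and method names are hypothetical, while TokenCache is the stock MapReduce helper:
> {code}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.security.TokenCache;
>
> // Sketch only: collect HDFS delegation tokens for the namenode(s) that back the
> // job's output location, including a viewfs mount on a non-default namenode.
> public class ObtainOutputTokens {
>   public static void addTokensForOutput(Job job, String outputLocation) throws Exception {
>     Configuration conf = job.getConfiguration();
>     // TokenCache resolves the path to its underlying FileSystem and stores the
>     // matching delegation token(s) in the job credentials, so map tasks can
>     // reach that namenode without needing a Kerberos TGT of their own.
>     TokenCache.obtainTokensForNamenodes(
>         job.getCredentials(),
>         new Path[] { new Path(outputLocation) },
>         conf);
>   }
> }
> {code}
> Whether a viewfs path fans tokens out to every mounted namenode depends on the Hadoop version, so on older releases the resolved hdfs:// path may need to be passed instead.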
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira