Looks like the easiest solution is to use separate clients, one for each cluster you want to connect to.
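
If both clusters had the same security setup (or none at all), you could probably manage it from one client by keeping two completely independent Configuration objects and never letting their resources mix. Here is a rough, untested sketch of what I mean -- the paths, table name, output path and the 0.94-style HTable constructor are all placeholders/assumptions, not something from your setup:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;

public class TwoClusterSketch {
  public static void main(String[] args) throws IOException {
    // Configuration for the source (HBase) cluster only.
    // Placeholder paths -- point them at that cluster's client config files.
    Configuration hbaseConf = HBaseConfiguration.create();
    hbaseConf.addResource(new Path("/path/to/clusterA/core-site.xml"));
    hbaseConf.addResource(new Path("/path/to/clusterA/hbase-site.xml"));

    // A completely separate Configuration for the destination HDFS cluster.
    Configuration hdfsConf = new Configuration();
    hdfsConf.addResource(new Path("/path/to/clusterB/core-site.xml"));
    hdfsConf.addResource(new Path("/path/to/clusterB/hdfs-site.xml"));

    HTable table = new HTable(hbaseConf, "my_table");  // source table (placeholder name)
    FileSystem destFs = FileSystem.get(hdfsConf);      // destination file system

    ResultScanner scanner = table.getScanner(new Scan());
    FSDataOutputStream out = destFs.create(new Path("/output/from_hbase.txt"));
    try {
      for (Result r : scanner) {
        out.write(r.getRow());  // just the row key, as an example
        out.write('\n');
      }
    } finally {
      out.close();
      scanner.close();
      table.close();
    }
  }
}

The catch, as far as I understand it, is security: UserGroupInformation keeps the authentication settings in JVM-wide static state, so once one cluster's security config wins, requests against the other cluster fail exactly the way you describe. That's why I'd still run two separate client processes (or stage the data through an intermediate step) rather than fight it inside one JVM.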
Cheers

On Sat, Apr 27, 2013 at 6:51 AM, Shahab Yunus <[email protected]> wrote:

> Hello,
>
> This is a follow-up to my previous post from a few days back. I am trying to
> connect to two different Hadoop cluster setups through the same client, but I
> am running into the issue that the config of one overwrites the other.
>
> The scenario is that I want to read data from an HBase table on one cluster
> and write it as a file to HDFS on the other. Individually, they both work,
> but when I try to do both from the same Java client, they fail.
>
> I have tried loading the core-site.xml files through the addResource method
> of the Configuration class, but only the first config file found is picked
> up. I have also tried renaming the config files and then adding them as
> resources (again through the addResource method).
>
> The situation is compounded by the fact that one cluster is using Kerberos
> authentication and the other is not. If the Kerberos cluster's file is found
> first, then authentication failures occur against the other cluster when
> Hadoop tries to find client authentication information. If the 'simple'
> cluster's config is loaded first, then an 'Authentication is Required' error
> is encountered against the Kerberos cluster.
>
> I will gladly provide more information. Is this even possible, even if, say,
> both clusters have the same security configuration or none? Any ideas?
> Thanks a million.
>
> Regards,
> Shahab
