Pralabh, did you follow the URL provided in the exception message? I put a lot of effort into improving the diagnostics, where the wiki articles are part of the troubleshooting process: https://issues.apache.org/jira/browse/HADOOP-7469
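For anyone following that troubleshooting path: "Connection refused" is a pure TCP-level condition and can be reproduced without Hadoop at all. A minimal, self-contained Java sketch (the loopback address and port numbers are illustrative, not taken from the thread):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class ConnectionRefusedDemo {
    // True if something is accepting TCP connections at host:port.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // ConnectException ("Connection refused") lands here when the host
            // exists but nothing is listening on the port.
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // A real listener on an ephemeral port: connect succeeds.
        try (ServerSocket server = new ServerSocket(0)) {
            System.out.println("open port:   " + canConnect("127.0.0.1", server.getLocalPort(), 2000));
        }
        // Port 1 on loopback has no listener and no firewall dropping packets,
        // so the connect fails fast with "Connection refused".
        System.out.println("closed port: " + canConnect("127.0.0.1", 1, 2000));
    }
}
```

Running the same kind of probe from inside the driver pod (against the namenode host:port) is a quick way to tell a network/DNS misconfiguration apart from an authentication problem.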
It's really disappointing when people escalate the problem to open source developers before trying to fix the problem themselves; in this case, read the error message. Now, if there is some K8s-related issue which makes this more common, you are encouraged to update the wiki entry with the new cause. Documentation is an important contribution to open source projects, and if you have discovered a new way to recreate the failure, it would be welcome.

Which reminds me, I have to add something to "connection reset and docker", which comes down to "turn off HTTP keepalive in maven builds".

-Steve

On Sat, 30 Apr 2022 at 10:45, Gabor Somogyi <gabor.g.somo...@gmail.com> wrote:

> Hi,
>
> Please be aware that a ConnectionRefused exception has nothing to do with
> authentication. See the description from the Hadoop wiki:
> "You get a ConnectionRefused
> <https://cwiki.apache.org/confluence/display/HADOOP2/ConnectionRefused>
> Exception when there is a machine at the address specified, but there is
> no program listening on the specific TCP port the client is using - and
> there is no firewall in the way silently dropping TCP connection requests.
> If you do not know what a TCP connection request is, please consult the
> specification <http://www.ietf.org/rfc/rfc793.txt>."
>
> This means the namenode on host:port is not reachable at the TCP layer.
> Maybe there are multiple issues, but I'm pretty sure that something is
> wrong in the K8s net config.
>
> BR,
> G
>
> On Fri, Apr 29, 2022 at 6:23 PM Pralabh Kumar <pralabhku...@gmail.com> wrote:
>
>> Hi dev Team
>>
>> SPARK-25355 added the functionality of the proxy user on K8s. However,
>> the proxy user on K8s with Kerberized HDFS is not working.
>> It is throwing the following exception:
>>
>> 22/04/21 17:50:30 WARN Client: Exception encountered while connecting to
>> the server: org.apache.hadoop.security.AccessControlException: Client
>> cannot authenticate via:[TOKEN, KERBEROS]
>>
>> Exception in thread "main" java.net.ConnectException: Call From
>> <driverpod> to <namenode> failed on connection exception:
>> java.net.ConnectException: Connection refused; For more details see:
>> http://wiki.apache.org/hadoop/ConnectionRefused
>>     at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>     at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>>     at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>>     at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
>>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
>>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:755)
>>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1501)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1443)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1353)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
>>     at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
>>     at
>>
>> On debugging deeper, we found the proxy user doesn't have access to
>> delegation tokens in the case of K8s. SparkSubmit.submit explicitly
>> creates the proxy user, and this user doesn't have a delegation token.
>>
>> Please help me with the same.
>>
>> Regards
>>
>> Pralabh Kumar
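The finding in Pralabh's last paragraph, that a freshly created proxy user carries no delegation tokens unless they are explicitly handed over, can be illustrated with a toy model. This is NOT the Hadoop UserGroupInformation API, just a sketch of the missing credential hand-off; the class, user names, and token key are all invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative model only (not Hadoop's UGI): a "user" holds a set of
// delegation tokens, and creating a proxy user starts with an EMPTY token
// set, mirroring the behaviour described above where the proxy user created
// by SparkSubmit has no tokens unless they are explicitly copied across.
public class ProxyTokenModel {
    static class User {
        final String name;
        final Map<String, String> tokens = new HashMap<>();
        User(String name) { this.name = name; }
    }

    static User createProxyUser(String name, User realUser) {
        // Deliberately does NOT inherit realUser's tokens: that is the point.
        return new User(name);
    }

    public static void main(String[] args) {
        User real = new User("spark");
        real.tokens.put("HDFS_DELEGATION_TOKEN", "opaque-bytes");

        User proxy = createProxyUser("alice", real);
        System.out.println("proxy has tokens: " + !proxy.tokens.isEmpty());

        // The missing step: explicitly hand the tokens to the proxy user
        // before it talks to the (Kerberized) filesystem.
        proxy.tokens.putAll(real.tokens);
        System.out.println("proxy has tokens: " + !proxy.tokens.isEmpty());
    }
}
```

In the real code path the equivalent hand-off would be adding the fetched delegation tokens to the proxy UGI's credentials before the doAs block runs; without it, authentication falls back to Kerberos, which the driver pod does not have, hence "Client cannot authenticate via:[TOKEN, KERBEROS]".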