prongs commented on pull request #167: URL: https://github.com/apache/incubator-livy/pull/167#issuecomment-874627547
@jahstreet Huge thanks for your efforts on this; it saved us a few days. I could have saved a few more had I found it sooner :-D

Anyway, we're seeing some weird behaviour when the Spark driver connects to Livy's RPC server. We see the following in the Livy logs:

The Local (L) side is fine, since that's where Livy's RPC server is running, but the Remote (R) side is incorrect: the remote should be the driver pod, not `127.0.0.1`. Would you have any idea about this? To get around it, I ended up making the following change.

Now, instead of connecting to the remote side of the channel, it connects to the hostname communicated inside the RPC messages. This makes me wonder:

* Why isn't that already the case? Why are we not using the hostname communicated in the message?
* How does it work for you and others on this thread without this change?

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
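For context, the workaround described amounts to preferring a hostname advertised at the application level (inside the RPC message) over the TCP channel's remote address, which can be rewritten to `127.0.0.1` by proxying or port-forwarding in Kubernetes. The following is a minimal, self-contained Java sketch of that resolution logic; the class, method, and hostname (`driver-pod.cluster.local`) are hypothetical illustrations, not Livy's actual wire format or API:

```java
import java.io.*;
import java.net.*;

public class RpcRemoteAddress {

    // Prefer the hostname the peer advertises in its handshake message;
    // fall back to the TCP-level remote address of the channel.
    // (Hypothetical helper -- not part of Livy's codebase.)
    static String resolveRemote(String channelRemote, String advertisedHost) {
        return (advertisedHost != null && !advertisedHost.isEmpty())
                ? advertisedHost
                : channelRemote;
    }

    public static void main(String[] args) throws Exception {
        // Tiny loopback "RPC server": the client sends one line containing
        // the hostname it wants to be reached at. On loopback (or behind a
        // kubectl-style port-forward) the socket's remote address is a
        // loopback/proxy address, while the advertised host is still usable.
        try (ServerSocket server =
                 new ServerSocket(0, 1, InetAddress.getLoopbackAddress())) {
            Thread client = new Thread(() -> {
                try (Socket s = new Socket(InetAddress.getLoopbackAddress(),
                                           server.getLocalPort())) {
                    s.getOutputStream()
                     .write("driver-pod.cluster.local\n".getBytes("UTF-8"));
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            client.start();
            try (Socket conn = server.accept()) {
                String channelRemote =
                    ((InetSocketAddress) conn.getRemoteSocketAddress())
                        .getAddress().getHostAddress();   // e.g. 127.0.0.1
                String advertised = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))
                        .readLine();
                System.out.println("channel remote: " + channelRemote);
                System.out.println("resolved:       "
                        + resolveRemote(channelRemote, advertised));
            }
            client.join();
        }
    }
}
```

Run standalone, this prints the loopback address as the channel-level remote and the advertised hostname as the resolved one, which mirrors the symptom in the logs above: the channel's remote address is not necessarily the address the driver is actually reachable at.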
