gaborgsomogyi opened a new pull request #30389:
URL: https://github.com/apache/spark/pull/30389


   ### What changes were proposed in this pull request?
   Spark creates a local server to serialize several types of data for Python. The
python code tries to connect to the server immediately after it's created, but
there are several system calls in between (these may change in each Spark
version):
   * getaddrinfo
   * socket
   * settimeout
   * connect
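
   The client-side sequence above can be sketched as follows. This is a minimal illustration, not the actual PySpark code path; the function name and the `conn_timeout` parameter are hypothetical stand-ins for the kind of configurable timeout this PR proposes:

   ```python
   import socket

   def connect_with_timeout(host, port, conn_timeout=15.0):
       """Sketch of the connection sequence: getaddrinfo, socket,
       settimeout, connect. Each step is a separate system call and
       each one can stall independently."""
       # getaddrinfo: name resolution, slow if the OS/DNS misbehaves
       info = socket.getaddrinfo(host, port,
                                 socket.AF_INET, socket.SOCK_STREAM)
       family, socktype, proto, _, sockaddr = info[0]
       # socket: create the file descriptor
       sock = socket.socket(family, socktype, proto)
       # settimeout: bound how long connect may block
       sock.settimeout(conn_timeout)
       # connect: the actual TCP handshake to the local server
       sock.connect(sockaddr)
       return sock
   ```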
   
   Under some circumstances, in heavy user environments, these calls can be
extremely slow (more than 15 seconds). Such issues must be analyzed one by one,
but since these are system calls, the underlying OS and/or DNS servers must be
debugged and fixed. That is not a trivial task, and in the meantime data
processing must keep working somehow. In this PR I only intend to add a
configuration option to increase the mentioned timeouts in order to provide a
temporary workaround. The root-cause analysis is ongoing, but I think the cause
can vary from case to case.
   
   Because the server side doesn't contain many log entries with which one can
measure elapsed time, I've added some.
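
   To see which of the system calls is the slow one in a given environment, the steps can be timed individually. A minimal diagnostic sketch (a hypothetical helper, not code from this PR):

   ```python
   import socket
   import time

   def timed_connect(host, port, conn_timeout=15.0):
       """Connect to host:port and record how long each system call
       in the sequence takes, to pinpoint where the stall happens."""
       timings = {}
       t0 = time.monotonic()
       info = socket.getaddrinfo(host, port,
                                 socket.AF_INET, socket.SOCK_STREAM)
       timings["getaddrinfo"] = time.monotonic() - t0
       family, socktype, proto, _, sockaddr = info[0]
       t0 = time.monotonic()
       sock = socket.socket(family, socktype, proto)
       timings["socket"] = time.monotonic() - t0
       t0 = time.monotonic()
       sock.settimeout(conn_timeout)
       timings["settimeout"] = time.monotonic() - t0
       t0 = time.monotonic()
       sock.connect(sockaddr)
       timings["connect"] = time.monotonic() - t0
       return sock, timings
   ```

   If `getaddrinfo` dominates, the DNS resolver is the suspect; if `connect` dominates, the local server or the OS networking stack is.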
   
   ### Why are the changes needed?
   Provides a workaround for when the connection to the localhost python server times out.
   
   ### Does this PR introduce _any_ user-facing change?
   Yes, a new configuration option was added.
   
   ### How was this patch tested?
   Existing unit tests.
   

