Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/21558
  
    I'm not sure, but I'm starting to think the part of the fix for SPARK-18113 
that added the "Authorizing duplicate request..." handling should be removed.
    
    The rest of that change replaced `askWithRetry` with `ask` in the RPC calls, 
so now there is only one request per task. That means there's no longer a need to 
authorize duplicate requests that way: the only time you'd see a duplicate is in 
exactly this situation (the same task attempt number running concurrently on two 
stage attempts).
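
    To make that concrete, here's a rough sketch of the kind of check I mean 
(just an illustration; the names and structure are approximations, not the 
actual OutputCommitCoordinator code):

    ```scala
    // Sketch of the duplicate-request authorization being discussed
    // (approximate names, not the exact Spark code).
    def handleAskPermissionToCommit(
        authorizedAttempt: Option[Int],  // task attempt number already holding the commit lock
        requestAttempt: Int): Boolean = authorizedAttempt match {
      case None =>
        // No committer yet: first request wins the commit lock.
        true
      case Some(existing) if existing == requestAttempt =>
        // "Authorizing duplicate request to commit...": this made sense when
        // askWithRetry could resend the same request. With plain ask there is
        // only one request per task, so a match here means the same attempt
        // number is running concurrently on two stage attempts, which is the
        // case that should not be blindly authorized again.
        true
      case Some(_) =>
        // A different attempt already holds the commit lock: deny.
        false
    }
    ```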

