Github user BryanCutler commented on the pull request:

    https://github.com/apache/spark/pull/10284#issuecomment-164873259
  
    Hi @andrewor14, it looks like the default RPC implementation, `NettyRpcEnv`, does not serialize local messages, so if I were to send the message `AttachCompletedRebuildUI(appId: String, ui: SparkUI)`, it should be fine.
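
    To make that concrete, here's roughly what the local send would look like (just a sketch; the `notifyMaster` helper and its wiring are made up for illustration, only the message case class is from this PR):

```scala
import org.apache.spark.rpc.RpcEndpointRef
import org.apache.spark.ui.SparkUI

// Proposed message: hands the rebuilt SparkUI for a completed app back to the Master.
case class AttachCompletedRebuildUI(appId: String, ui: SparkUI)

// Illustrative only: the rebuild thread sends the result to the Master's own
// endpoint ref. Since sender and receiver share the same NettyRpcEnv, this is a
// local send and the (potentially large) SparkUI is passed by reference, never
// serialized.
def notifyMaster(master: RpcEndpointRef, appId: String, ui: SparkUI): Unit = {
  master.send(AttachCompletedRebuildUI(appId, ui))
}
```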
    
    However, from what I can tell, Akka does serialize the message, so if someone configured the master RPC to be `AkkaRpcEnv` and had a large event log, I think trying to serialize such a large object would cause issues.
    
    I'd be happy to take on the task of removing all of this in SPARK-12299 since I'm pretty familiar with the code now. That way I can make sure that if the `ConcurrentHashMap` is used, it gets properly reverted.

