[ 
https://issues.apache.org/jira/browse/SPARK-21455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-21455:
------------------------------------

    Assignee: Apache Spark

> RpcFailure should be sent via RpcResponseCallback.onFailure
> -----------------------------------------------------------
>
>                 Key: SPARK-21455
>                 URL: https://issues.apache.org/jira/browse/SPARK-21455
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Xianyang Liu
>            Assignee: Apache Spark
>
> Currently, when an `RpcFailure` needs to be sent back to the client, we call 
> `RpcCallContext.sendFailure`, which in turn calls `NettyRpcCallContext.send`. 
> However, consider the following code snippet from the implementation class.
> ```
> private[netty] class RemoteNettyRpcCallContext(
>     nettyEnv: NettyRpcEnv,
>     callback: RpcResponseCallback,
>     senderAddress: RpcAddress)
>   extends NettyRpcCallContext(senderAddress) {
>   override protected def send(message: Any): Unit = {
>     val reply = nettyEnv.serialize(message)
>     callback.onSuccess(reply)
>   }
> }
> ```
> This is unreasonable for two reasons:
> # The failure message is sent back through `RpcResponseCallback.onSuccess`, so 
> the caller cannot receive the detailed exception (such as the stack trace) 
> through the proper failure path in the current implementation.
> # `RpcResponseCallback.onSuccess` and `RpcResponseCallback.onFailure` may 
> have different behavior. For example:
> NettyBlockTransferService#uploadBlock
> ```
> new RpcResponseCallback {
>   override def onSuccess(response: ByteBuffer): Unit = {
>     logTrace(s"Successfully uploaded block $blockId")
>     result.success((): Unit)
>   }
>   override def onFailure(e: Throwable): Unit = {
>     logError(s"Error while uploading block $blockId", e)
>     result.failure(e)
>   }
> }
> ```
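
To make the point above concrete, here is a minimal, self-contained sketch of the proposed dispatch. The types (`RpcResponseCallback`, `RpcFailure`, `RemoteCallContextSketch`) are simplified stand-ins, not Spark's actual `NettyRpcEnv`/`RemoteNettyRpcCallContext` classes; the sketch only illustrates routing an `RpcFailure` to `onFailure` instead of always replying via `onSuccess`:

```scala
// Minimal sketch with stand-in types; NOT Spark's actual API.
// Illustrates dispatching a failure envelope to the failure path.

trait RpcResponseCallback {
  def onSuccess(response: String): Unit
  def onFailure(e: Throwable): Unit
}

// Stand-in for Spark's RpcFailure envelope.
final case class RpcFailure(e: Throwable)

final class RemoteCallContextSketch(callback: RpcResponseCallback) {
  // Hypothetical fix: inspect the message before replying, so a failure
  // reaches the callback's failure path instead of being "successfully"
  // delivered as a serialized error.
  def send(message: Any): Unit = message match {
    case RpcFailure(e) => callback.onFailure(e)
    case other         => callback.onSuccess(other.toString) // stands in for serialization
  }
}
```

With a dispatch like this, a callback such as the `uploadBlock` one above would log the error and fail its `result` promise, instead of both outcomes flowing through `onSuccess`.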



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
