Github user cloud-fan commented on the issue:
https://github.com/apache/spark/pull/19586
looking at the `SerializationStream` interface, I think it's designed for
reading/writing objects of different classes, so your optimization should not
be applied there.
Instead, I think we sh
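(For context, a minimal sketch of the point above: a serialization stream carries objects of many different classes, so a per-stream assumption about one class does not hold. This is a hypothetical analogue using plain `java.io` serialization, not Spark's actual `SerializationStream` code; the class and method names are illustrative only.)

```java
import java.io.*;
import java.util.List;

public class SerializationStreamSketch {
    // Hypothetical analogue of a SerializationStream: successive
    // writeObject calls on the SAME stream may carry objects of
    // completely different classes.
    static byte[] writeAll(Object... objects) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            for (Object o : objects) {
                out.writeObject(o); // each call may see a different class
            }
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Three different classes written to one stream:
        byte[] bytes = writeAll("a string", 42, List.of(1, 2, 3));
        try (ObjectInputStream in =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            System.out.println(in.readObject()); // a string
            System.out.println(in.readObject()); // 42
            System.out.println(in.readObject()); // [1, 2, 3]
        }
    }
}
```

Because the stream is heterogeneous like this, an optimization that caches or elides per-class metadata at the stream level would be unsound there.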
Github user ConeyLiu commented on the issue:
https://github.com/apache/spark/pull/19586
@srowen Thanks for the review.
What do you mean here?
> I'm trying to think if there's any case where we intend to support
kryo/java serialized objects from 2.x in 2.y.
After
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/19586
Can one of the admins verify this patch?
---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional
Github user ConeyLiu commented on the issue:
https://github.com/apache/spark/pull/19586
One executor; the configuration is as follows:
the script:
```shell
${SPARK_HOME}/bin/spark-submit \
--class com.intel.KryoTest \
--master yarn \
```
Github user ConeyLiu commented on the issue:
https://github.com/apache/spark/pull/19586
Hi, @cloud-fan @jiangxb1987 @chenghao-intel. Would you mind taking a look?
Thanks a lot.