Github user squito commented on the issue:
https://github.com/apache/spark/pull/23058
@attilapiros yes, something like that would be possible. I was thinking
you'd just use the existing serializer methods to do it, something like:
```scala
val buffer = getRemoteManagedBuffer()
// assuming a SerializerInstance named `serializer` is in scope;
// asIterator reads records lazily off the stream
val valueItr = serializer.deserializeStream(buffer.createInputStream()).asIterator
val result = valueItr.next()
assert(!valueItr.hasNext) // exhausting the iterator also makes sure the stream gets closed
```
My reluctance to bother with it is that you'd still be getting a
`DirectTaskResult`, which has the data sitting in a `ByteBuffer` anyway. Also,
it's not that big a deal, as this is only for a single task result, which is
generally not large. The change here is to avoid reading an entire partition
into memory.
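
For contrast, here's a minimal sketch of the two read paths (the method names here are my illustration, not actual code from this PR; the `SerializerInstance` calls are the existing Spark serializer API):

```scala
import java.io.InputStream
import java.nio.ByteBuffer

import org.apache.spark.serializer.SerializerInstance

// Eager: the whole serialized payload sits in a ByteBuffer before
// deserialization. Fine for a single, typically small task result.
def readEagerly(ser: SerializerInstance, bytes: ByteBuffer): Any =
  ser.deserialize[Any](bytes)

// Streaming: records are deserialized one at a time off the stream, so an
// entire partition never has to be held in memory at once.
def readStreaming(ser: SerializerInstance, in: InputStream): Iterator[Any] =
  ser.deserializeStream(in).asIterator
```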