I use Apache Commons Lang3's SerializationUtils in my code:
SerializationUtils.serialize()
to store a custom class as files on disk, and
SerializationUtils.deserialize(byte[])
to restore them again.
In the local environment (macOS), all serialized files can be deserialized
normally.
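For context, SerializationUtils is a thin wrapper around standard Java object serialization. A minimal round-trip sketch using the JDK streams directly (the Block class here is a hypothetical stand-in for the custom class in the thread):

```java
import java.io.*;

// Hypothetical Serializable payload standing in for the user's custom class.
class Block implements Serializable {
    private static final long serialVersionUID = 1L;
    final int id;
    Block(int id) { this.id = id; }
}

public class RoundTrip {
    // Serialize to a byte array (what SerializationUtils.serialize does under the hood).
    static byte[] serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    // Deserialize; this throws ClassNotFoundException if the class (or its
    // package) is not visible to the classloader of the reading JVM.
    static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = serialize(new Block(42));
        Block restored = (Block) deserialize(bytes);
        System.out.println(restored.id); // 42
    }
}
```

Note that a round trip like this can succeed locally yet fail on a cluster, because deserialization re-resolves the class by name on the reading JVM.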
"byte length = " + bytes.length); // Does this match what you expect?
try {
    Block b = SerializationUtils.deserialize(bytes);
...
Looking forward to hearing the result.
On Wed, Jun 26, 2019 at 11:03 PM big data <bigdatab...@outlook.com> wrote:
The XXX Class named
Tomo Suzuki wrote:
Glad to hear you made progress. Good luck!
(Another possibility: you might have changed the package or class name
since you saved the HDFS file.)
On Thu, Jun 27, 2019 at 21:25 big data <bigdatab...@outlook.com> wrote:
Thanks. I've tried it; the new Block before it is
e project
that can reproduce the same issue.
Regards,
Tomo
On Wed, Jun 26, 2019 at 9:20 PM big data <bigdatab...@outlook.com> wrote:
Hi,
Actually, the class com.XXX.XXX is called normally earlier in the Spark
code, and this exception occurs in one static method of that class.
So a jar dependency problem can be ruled out.
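One quick way to double-check that is to print which jar actually supplied the class on the failing JVM. A small sketch (substitute the class that fails to deserialize, e.g. com.XXX.XXX, for the placeholder used here):

```java
import java.security.CodeSource;

public class WhereLoaded {
    public static void main(String[] args) {
        // Substitute the class that fails to deserialize for WhereLoaded.class.
        Class<?> c = WhereLoaded.class;
        System.out.println("loader = " + c.getClassLoader());
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // null means the class came from the bootstrap loader (the JDK itself);
        // otherwise the location is the jar or directory it was loaded from.
        System.out.println("source = " + (src == null ? "bootstrap" : src.getLocation()));
    }
}
```

If the reported location on the Spark executors differs from the jar you deployed, a stale copy is still on the classpath.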
On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:
> Hi Big data,
>
> I don't use Serializa