Sorry, I've posted another solved issue to the Spark groups. Below are the details.
It seems to be a problem with Java generics in Commons Lang when running on Spark
on YARN, or with the Java generics mechanism itself.
The Spark code and the class deserialization code (using Apache Commons Lang) look like this:
val fis =
Glad to hear you made progress. Good luck!
(Another possibility: you might have changed the package or class name
since you saved the HDFS file.)
On Thu, Jun 27, 2019 at 21:25 big data wrote:
> Thanks. I've tried it, the new Block before it is OK.
>
> I've solved it and posted another issue to
Thanks. I've tried it; the new Block created before it is OK.
I've solved it and posted another issue to describe this progress. The details
are in another email: Java Generic T makes ClassNotFoundException
On 2019/6/27 8:41 PM, Tomo Suzuki wrote:
My suggestion after reading the ClassNotFoundException is to try to instantiate
the class just before deserializing it:
public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block class and its related classes are available
    System.out.println("dummy = " + new
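The message above is cut off, but the idea is a fail-fast check: touch the class before handing the bytes to the deserializer, so a missing class surfaces immediately with a clean stack trace. A minimal stdlib sketch of the same check uses Class.forName; the class name com.example.MissingBlock below is a made-up placeholder, not the original class:

```java
// Fail-fast check: verify a class is loadable on this JVM's classpath
// before attempting to deserialize bytes that reference it.
public class ClassLoadCheck {
    public static void main(String[] args) {
        String[] names = { "java.lang.String", "com.example.MissingBlock" };
        for (String name : names) {
            try {
                Class<?> c = Class.forName(name);
                System.out.println("loadable: " + c.getName());
            } catch (ClassNotFoundException e) {
                System.out.println("NOT loadable: " + name);
            }
        }
    }
}
```

If the check fails on the executor but passes on the driver, that points at the executor classpath (e.g. the jar shipped to YARN) rather than at the serialized bytes.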
The XXX class is named Block; below is part of its code.
The deserialization code looks like this:
public static Block deserializeFrom(byte[] bytes) {
    try {
        Block b = SerializationUtils.deserialize(bytes);
        System.out.println("b=" + b);
        return b;
    } catch (ClassCastException e) {
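The helper above is truncated. For reference, SerializationUtils.deserialize is essentially a thin wrapper around the standard Java Object streams, so the pattern can be sketched with only the JDK; the Block class below (its id field, constructor, and toString) is a hypothetical stand-in for the original com.XXX.XXX class, which is not shown in the thread:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class BlockDemo {
    // Hypothetical stand-in for the original com.XXX.XXX Block class.
    static class Block implements Serializable {
        private static final long serialVersionUID = 1L;
        final String id;
        Block(String id) { this.id = id; }
        @Override public String toString() { return "Block(" + id + ")"; }
    }

    // Stdlib equivalent of SerializationUtils.serialize(obj).
    static byte[] serialize(Block b) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(b);
        }
        return bos.toByteArray();
    }

    // Completed version of the deserializeFrom helper from the thread.
    static Block deserializeFrom(byte[] bytes) {
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            Block b = (Block) ois.readObject();
            System.out.println("b=" + b);
            return b;
        } catch (IOException | ClassNotFoundException | ClassCastException e) {
            throw new IllegalStateException("deserialization failed", e);
        }
    }

    public static void main(String[] args) throws IOException {
        Block restored = deserializeFrom(serialize(new Block("b1")));
        System.out.println("restored id = " + restored.id);
    }
}
```

Note that ObjectInputStream.readObject throws ClassNotFoundException (not ClassCastException) when the serialized class is missing from the classpath, which is why catching only ClassCastException can let the reported error escape.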
I'm afraid I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you could create a minimal example project
that reproduces the same issue.
Regards,
Tomo
On Wed, Jun 26, 2019 at 9:20 PM big data wrote:
> Hi,
>
> Actually, the class com.XXX.XXX is
Hi,
Actually, the class com.XXX.XXX is called normally earlier in the Spark
code, and this exception happens in one static method of that class.
So a jar dependency problem can be ruled out.
On 2019/6/26 10:23 PM, Tomo Suzuki wrote:
> Hi Big data,
>
> I don't use SerializationUtils, but
Hi Big data,
I don't use SerializationUtils, but interpreting the error message:
ClassNotFoundException: com..
this says com.. is not available on the classpath of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala
I use Apache Commons Lang3's SerializationUtils in the code:
SerializationUtils.serialize()
to store a customized class to disk as files, and
SerializationUtils.deserialize(byte[])
to restore them again.
In the local environment (macOS), all serialized files can be deserialized
normally
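The serialize-to-file / deserialize-from-file pattern described above can be sketched with only the JDK (SerializationUtils delegates to the same Object streams); the Payload class and the temp-file name are assumptions for illustration, not the original code:

```java
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileRoundTrip {
    // Hypothetical stand-in for the customized class in the thread.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        final int value;
        Payload(int value) { this.value = value; }
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("payload", ".ser");

        // Write the object to disk (what SerializationUtils.serialize feeds a file with).
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(Files.newOutputStream(file))) {
            oos.writeObject(new Payload(42));
        }

        // Read it back (what SerializationUtils.deserialize does with the bytes).
        try (ObjectInputStream ois =
                 new ObjectInputStream(Files.newInputStream(file))) {
            Payload restored = (Payload) ois.readObject();
            System.out.println("restored value = " + restored.value);
        }
        Files.delete(file);
    }
}
```

A roundtrip like this succeeding locally but failing on the cluster is consistent with the thread's conclusion: the bytes are fine, and the failure comes from class resolution on the remote JVM.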