Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread big data
Sorry, I posted the other (now solved) issue to the Spark group as well. Below are the details.

It seems to be a problem with how the generic methods in Commons Lang behave when run
under Spark on Yarn, or with the Java generics mechanism itself.


The Spark code and the class deserialization code (using Apache Commons Lang) look like this:

val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
val RDD = fis.map(x => {
  val content = x._2.toArray()
  val b = Block.deserializeFrom(content)
  ...
})




public static Block deserializeFrom(byte[] bytes) {
    try {
        Block b = SerializationUtils.deserialize(bytes);
        System.out.println("b=" + b);
        return b;
    } catch (ClassCastException e) {
        System.out.println("ClassCastException");
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        System.out.println("IllegalArgumentException");
        e.printStackTrace();
    } catch (SerializationException e) {
        System.out.println("SerializationException");
        e.printStackTrace();
    }
    return null;
}
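When debugging this kind of failure, it can help to log which classloaders are in play on the executor. The helper below is an illustrative sketch only (the method name is made up for this example, and it assumes it sits in the same class as deserializeFrom):

public static void dumpClassLoaders() {
    // Illustrative diagnostic, not part of the original code. On Yarn, the
    // user jar is typically loaded by Spark's executor classloader rather
    // than the JVM application classloader seen in the stack trace below.
    System.out.println("context loader = " + Thread.currentThread().getContextClassLoader());
    System.out.println("Block loader   = " + Block.class.getClassLoader());
    System.out.println("lang3 loader   = " + SerializationUtils.class.getClassLoader());
}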

Below is the Commons Lang source code for deserialize:

public static <T> T deserialize(final byte[] objectData) {
    Validate.isTrue(objectData != null, "The byte[] must not be null");
    return deserialize(new ByteArrayInputStream(objectData));
}

public static <T> T deserialize(final InputStream inputStream) {
    Validate.isTrue(inputStream != null, "The InputStream must not be null");
    try (ObjectInputStream in = new ObjectInputStream(inputStream)) {
        @SuppressWarnings("unchecked")
        final T obj = (T) in.readObject();
        return obj;
    } catch (final ClassNotFoundException | IOException ex) {
        throw new SerializationException(ex);
    }
}

In Spark local mode the code runs OK, but in cluster-on-Yarn mode the Spark code fails with an error like this:

org.apache.commons.lang3.SerializationException: java.lang.ClassNotFoundException: com.Block
at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
at com.com...deserializeFrom(XXX.java:81)
at com.XXX.$$anonfun$3.apply(B.scala:157)
at com.XXX.$$anonfun$3.apply(B.scala:153)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.Block
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:686)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:223)


From the stack trace, the error happens inside the Apache Commons Lang package at:

final T obj = (T) in.readObject();

T here is the Block class. The "Caused by" section shows that readObject() resolves the
class name through the JVM application classloader (sun.misc.Launcher$AppClassLoader),
which on the Yarn executors does not see the user jar that defines com.Block, so the
lookup fails.
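One common workaround is to deserialize with an ObjectInputStream that resolves classes against an explicitly chosen classloader, instead of relying on the default lookup that failed above. The sketch below is illustrative only, not the Commons Lang implementation; the class and method names are made up for this example:

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;

public final class LoaderAwareDeserializer {

    // ObjectInputStream that resolves classes against an explicit ClassLoader
    // instead of the JVM's default "latest user-defined" loader.
    private static final class LoaderAwareObjectInputStream extends ObjectInputStream {
        private final ClassLoader loader;

        LoaderAwareObjectInputStream(InputStream in, ClassLoader loader) throws IOException {
            super(in);
            this.loader = loader;
        }

        @Override
        protected Class<?> resolveClass(ObjectStreamClass desc)
                throws IOException, ClassNotFoundException {
            try {
                return Class.forName(desc.getName(), false, loader);
            } catch (ClassNotFoundException e) {
                return super.resolveClass(desc); // fall back to the default lookup
            }
        }
    }

    public static <T> T deserialize(byte[] bytes, ClassLoader loader)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                new LoaderAwareObjectInputStream(new ByteArrayInputStream(bytes), loader)) {
            @SuppressWarnings("unchecked")
            final T obj = (T) in.readObject();
            return obj;
        }
    }
}

Calling LoaderAwareDeserializer.deserialize(content, Block.class.getClassLoader()) inside the Spark closure then resolves Block through the loader that actually defines it on the executor.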

Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread Tomo Suzuki
Glad to hear you made progress. Good luck!

(Another possibility: you might have changed the package or class name
since you saved the HDFS file.)

On Thu, Jun 27, 2019 at 21:25 big data <bigdatab...@outlook.com> wrote:

> Thanks. I've tried it; creating a new Block before deserializing works fine.
>
> I've solved it and posted another message describing the findings. The
> details are in another thread: Java Generic T makes ClassNotFoundException

Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread big data
Thanks. I've tried it; creating a new Block before deserializing works fine.

I've solved it and posted another message describing the findings. The details
are in another thread: Java Generic T makes ClassNotFoundException

On 2019/6/27 8:41 PM, Tomo Suzuki wrote:

My suggestion after reading ClassNotFoundException is to try to instantiate
the class just before deserializing it:

public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block class and its related classes are available
    System.out.println("dummy = " + new Block());
    System.out.println("byte length = " + bytes.length); // Does this match what you expect?
    try {
        Block b = SerializationUtils.deserialize(bytes);
        ...


Looking forward to hearing the result.




Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread Tomo Suzuki
My suggestion after reading ClassNotFoundException is to try to instantiate
the class just before deserializing it:

public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block class and its related classes are available
    System.out.println("dummy = " + new Block());
    System.out.println("byte length = " + bytes.length); // Does this match what you expect?
    try {
        Block b = SerializationUtils.deserialize(bytes);
        ...


Looking forward to hearing the result.




Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread big data
The XXX class is named Block; below is part of its code.

The deserialization code looks like this:

public static Block deserializeFrom(byte[] bytes) {
    try {
        Block b = SerializationUtils.deserialize(bytes);
        System.out.println("b=" + b);
        return b;
    } catch (ClassCastException e) {
        System.out.println("ClassCastException");
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        System.out.println("IllegalArgumentException");
        e.printStackTrace();
    } catch (SerializationException e) {
        System.out.println("SerializationException");
        e.printStackTrace();
    }
    return null;
}


The Spark code is:

val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
val RDD = fis.map(x => {
  val content = x._2.toArray()
  val b = Block.deserializeFrom(content)
  ...
})


All the code above runs successfully in Spark local mode, but when it runs in
Yarn cluster mode, the error happens.

On 2019/6/27 9:49 AM, Tomo Suzuki wrote:

I'm afraid that I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you could create a minimal example project
that reproduces the same issue.

Regards,
Tomo


Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread Tomo Suzuki
I'm afraid that I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you could create a minimal example project
that reproduces the same issue.

Regards,
Tomo

On Wed, Jun 26, 2019 at 9:20 PM big data <bigdatab...@outlook.com> wrote:

> Hi,
>
> Actually, the class com.XXX.XXX is called normally earlier in the Spark
> code, and this exception happens in one static method of that class.
>
> So a jar dependency problem can be excluded.
>

Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread big data
Hi,

Actually, the class com.XXX.XXX is called normally earlier in the Spark
code, and this exception happens in one static method of that class.

So a jar dependency problem can be excluded.

On 2019/6/26 10:23 PM, Tomo Suzuki wrote:
> Hi Big data,
>
> I don't use SerializationUtils, but if I interpret the error message:
>
>    ClassNotFoundException: com..
>
> this says that com.. is not available on the classpath of the JVM (which
> your Spark is running on). I would verify that you can instantiate
> com.. in Spark/Scala *without* SerializationUtils.
>
> Regards,
> Tomo


Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread Tomo Suzuki
Hi Big data,

I don't use SerializationUtils, but if I interpret the error message:

  ClassNotFoundException: com..

this says that com.. is not available on the classpath of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala *without* SerializationUtils.

Regards,
Tomo



On Wed, Jun 26, 2019 at 4:12 AM big data <bigdatab...@outlook.com> wrote:

> I use Apache Commons Lang3's SerializationUtils in the code:
>
> SerializationUtils.serialize()
>
> to store a customized class to disk as files, and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (Mac OS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS and read them back from HDFS using Spark/Scala, a
> SerializationException happens.
>
> The Apache Commons Lang3 version is:
>
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
>
>
> the stack trace is as below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
> at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
> at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
> at com.com...deserializeFrom(XXX.java:81)
> at com.XXX.$$anonfun$3.apply(B.scala:157)
> at com.XXX.$$anonfun$3.apply(B.scala:153)
> at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
> at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
> at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
> at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
> at scala.collection.AbstractIterator.to(Iterator.scala:1336)
> at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
> at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
> at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
> at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
> at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> at org.apache.spark.scheduler.Task.run(Task.scala:109)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: com..
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:686)
> at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
> at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
> at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:223)
>
> I've checked the loaded byte[]'s length; both the local and HDFS copies are
> the same. But why can it not be deserialized from HDFS?
>


-- 
Regards,
Tomo


Re: [lang3]

2019-04-16 Thread Scott Palmer

> On Apr 15, 2019, at 6:55 PM, Rob Tompkins <chtom...@gmail.com> wrote:
> 
> 
> 
> 
> This should be fixed now. It may take a little while for maven central to 
> pick up the changes. @Scott - many thanks for the catch there!
> 
> Cheers,
> -Rob

No problem. I appreciate the quick fix.

Scott

Re: [lang3]

2019-04-15 Thread Rob Tompkins



> On Apr 15, 2019, at 5:23 PM, Bruno P. Kinoshita <brunodepau...@yahoo.com.br.invalid> wrote:
> 
> Great!
>
> Rob, just in case I ever do the same: could you share what steps you had to
> do in order to upload the javadocs, please?
> Thanks for the super quick fix!

Yeah, I’m going to figure out how to really get it fixed. I don’t much want 
that to happen again. 

> Bruno
> 

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Bruno P. Kinoshita
Great!

Rob, just in case I ever do the same: could you share what steps you had to do
in order to upload the javadocs, please?

Thanks for the super quick fix!
Bruno

On Tuesday, 16 April 2019, 10:56:22 am NZST, Rob Tompkins <chtom...@gmail.com> wrote:
 
 


This should be fixed now. It may take a little while for maven central to pick 
up the changes. @Scott - many thanks for the catch there!

Cheers,
-Rob


Re: [lang3]

2019-04-15 Thread Rob Tompkins


> On Apr 15, 2019, at 3:08 PM, Gary Gregory wrote:
> 
> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita wrote:
> 
> 
> Since we approved the sources tagged and we are not changing those, I'd say
> we are OK to push out the javadoc files.

This should be fixed now. It may take a little while for maven central to pick 
up the changes. @Scott - many thanks for the catch there!

Cheers,
-Rob



Re: [lang3]

2019-04-15 Thread Rob Tompkins



> On Apr 15, 2019, at 3:08 PM, Gary Gregory wrote:
> 
> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita wrote:
> 
> 
> Since we approved the sources tagged and we are not changing those, I'd say
> we are OK to push out the javadoc files.

Cool. I’ll sort that out in the next hour. 


-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Gary Gregory
On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita wrote:

>  I think that should be fine. I think something similar already happened
> in the past, but can't recall which component.
>
> On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <chtom...@gmail.com> wrote:
>
>
>
> > On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <brunodepau...@yahoo.com.br.invalid> wrote:
> >
> >
> > Hi Scott,
> > I believe it was a mistake. Had a look at 3.8 and we had published it before.
> > Just had a look at the vote thread, and it appears the javadocs jar was
> > not included in the process. Possibly something with our pom.xml and
> > plugins set up.
> >
> > @Rob, @Gary, is it possible to upload just the jar to an existing release?
>
> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
> artifacts in nexus. Thoughts?
>

Since we approved the sources tagged and we are not changing those, I'd say
we are OK to push out the javadoc files.

Gary




Re: [lang3]

2019-04-15 Thread Bruno P. Kinoshita
I think that should be fine. I think something similar already happened in the
past, but I can't recall which component.

On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <chtom...@gmail.com> wrote:
 
 

> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <brunodepau...@yahoo.com.br.invalid> wrote:
> 
> 
> Hi Scott,
> I believe it was a mistake. Had a look at 3.8 and we had published it before.
> Just had a look at the vote thread, and it appears the javadocs jar was not 
> included in the process. Possibly something with our pom.xml and plugins set 
> up.
> 
> @Rob, @Gary, is it possible to upload just the jar to an existing release?

Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged artifacts 
in nexus. Thoughts?

-Rob


-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org
  

Re: [lang3]

2019-04-15 Thread Rob Tompkins



> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <brunodepau...@yahoo.com.br.invalid> wrote:
> 
> 
> Hi Scott,
> I believe it was a mistake. Had a look at 3.8 and we had published it before.
> Just had a look at the vote thread, and it appears the javadocs jar was not 
> included in the process. Possibly something with our pom.xml and plugins set 
> up.
> 
> @Rob, @Gary, is it possible to upload just the jar to an existing release?

Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged artifacts 
in nexus. Thoughts?

-Rob


-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Bruno P. Kinoshita
 
Hi Scott,

I believe it was a mistake. I had a look at 3.8 and we had published it before.
I just had a look at the vote thread, and it appears the javadocs jar was not
included in the process. Possibly something with our pom.xml and plugins setup.

@Rob, @Gary, is it possible to upload just the jar to an existing release?

Cheers,
Bruno

On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer <swpal...@gmail.com> wrote:

I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
Is that intentional or a mistake?

Scott
(please copy me on responses as I am not subscribed to the list)

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org

  

Re: [lang3]

2019-04-15 Thread Rob Tompkins
Hm. Curious. Let me look at that.

-Rob

> On Apr 15, 2019, at 12:58 PM, Scott Palmer <swpal...@gmail.com> wrote:
> 
> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
> Is that intentional or a mistake?
> 
> Scott
> (please copy me on responses as I am not subscribed to the list)
> 

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3] FastDateFormat fails on some locales?

2019-02-01 Thread Kevin Risden
I did some further testing and found that, of the ~160 locales my
JDK 8 had, only ja-JP-u-ca-japanese-x-lvariant-JP failed with the
ArrayIndexOutOfBoundsException.

Kevin Risden

On Wed, Jan 30, 2019 at 1:36 PM Kevin Risden wrote:
>
> I found this while looking at Apache Lucene/Solr and Hadoop 3. Hadoop
> uses FastDateFormat to format the current timestamp. Apache
> Lucene/Solr randomizes locales to ensure that things behave correctly
> even when there are different locales being used. There have been a
> few failures that have the following stack trace:
>
> java.lang.ArrayIndexOutOfBoundsException: 4
>    [junit4]   2> at org.apache.commons.lang3.time.FastDatePrinter$TextField.appendTo(FastDatePrinter.java:901) ~[commons-lang3-3.7.jar:3.7]
>    [junit4]   2> at org.apache.commons.lang3.time.FastDatePrinter.applyRules(FastDatePrinter.java:573) ~[commons-lang3-3.7.jar:3.7]
>    [junit4]   2> at org.apache.commons.lang3.time.FastDatePrinter.applyRulesToString(FastDatePrinter.java:455) ~[commons-lang3-3.7.jar:3.7]
>    [junit4]   2> at org.apache.commons.lang3.time.FastDatePrinter.format(FastDatePrinter.java:446) ~[commons-lang3-3.7.jar:3.7]
>    [junit4]   2> at org.apache.commons.lang3.time.FastDateFormat.format(FastDateFormat.java:428) ~[commons-lang3-3.7.jar:3.7]
>    [junit4]   2> at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.start(DirectoryScanner.java:281) ~[hadoop-hdfs-3.2.0.jar:?]
>    [junit4]   2> at org.apache.hadoop.hdfs.server.datanode.DataNode.initDirectoryScanner(DataNode.java:1090) ~[hadoop-hdfs-3.2.0.jar:?]
>    [junit4]   2> at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1686) ~[hadoop-hdfs-3.2.0.jar:?]
>    [junit4]   2> at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:390) ~[hadoop-hdfs-3.2.0.jar:?]
>    [junit4]   2> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:280) ~[hadoop-hdfs-3.2.0.jar:?]
>    [junit4]   2> at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:819) [hadoop-hdfs-3.2.0.jar:?]
>    [junit4]   2> at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
>
> I was also able to reproduce this with a simple test case:
>
> long timestamp = System.currentTimeMillis();
> Locale.setDefault(Locale.forLanguageTag("ja-JP-u-ca-japanese-x-lvariant-JP"));
> Assert.assertEquals(SimpleDateFormat.getInstance().format(timestamp),
> FastDateFormat.getInstance().format(timestamp));
>
> This shows that the issue isn't with Hadoop but with commons-lang3
> specifically. SimpleDateFormat has no issue formatting the timestamp
> with the given locale, and the FastDateFormat javadoc doesn't mention
> any locale limitations.
>
> Is this to be expected?
>
> Kevin Risden
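A hedged guess at the mechanism, based only on the index 4 in the trace and not a confirmed diagnosis: FastDatePrinter appears to build its era text field from DateFormatSymbols, whose era array has two entries, while the Japanese imperial calendar reports larger ERA values. A minimal sketch of that mismatch (assumes JDK 8):

import java.text.DateFormatSymbols;
import java.util.Calendar;
import java.util.Locale;

public class EraMismatch {
    public static void main(String[] args) {
        Locale locale = Locale.forLanguageTag("ja-JP-u-ca-japanese-x-lvariant-JP");

        // The "ca-japanese" extension yields a JapaneseImperialCalendar, whose
        // ERA value (e.g. 4 for Heisei) exceeds the two-entry era array that
        // DateFormatSymbols provides for a Gregorian-style calendar.
        Calendar cal = Calendar.getInstance(locale);
        String[] eras = new DateFormatSymbols(locale).getEras();

        System.out.println("calendar  = " + cal.getClass().getName());
        System.out.println("era value = " + cal.get(Calendar.ERA));
        System.out.println("era names = " + eras.length);
    }
}

If that era value is used to index the era-name array, an index of 4 into a two-entry array would match the ArrayIndexOutOfBoundsException: 4 above.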

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3] Problem with the OSGi metadata: Bundle-SymbolicName / breaking change between 3.7 and 3.8

2018-09-06 Thread P. Ottlinger
Hi,

thanks for the quick response ...

On 06.09.2018 at 21:24, Oliver Heger wrote:
> So opening a ticket in Jira would be the correct action to take.

https://issues.apache.org/jira/browse/LANG-1419

Done :-) Hopefully I didn't miss any important stuff in Jira.

Cheers,
Phil

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3] Problem with the OSGi metadata: Bundle-SymbolicName / breaking change between 3.7 and 3.8

2018-09-06 Thread Oliver Heger
Hi Phil,

as you already assume, this change in the OSGi metadata was caused by
changes in the build process and was not intended.

So opening a ticket in Jira would be the correct action to take.

Thank you for reporting!
Oliver

On 06.09.2018 at 20:49, P. Ottlinger wrote:
> Hi,
> 
> I've just stumbled upon a problem that prevents me from updating from
> 3.7 to 3.8 in an OSGi context.
> 
> Although the release was just a patch release, the bundle's symbolic
> name changed from "Bundle-SymbolicName: org.apache.commons.lang3" in 3.7.0
> to "Bundle-SymbolicName: org.apache.commons.commons-lang3" in 3.8.0.
> 
> That makes it impossible to do a drop-in update, as it is a breaking change.
> 
> Is that change an error in 3.8.0 or a wanted one that could be
> communicated more directly to downstream users?
> 
> May I file a bug ticket in the LANG Jira for it? I assume there was a
> hiccup when building the OSGi release JAR and the change was not intended.
> 
> Thanks,
> Phil
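For anyone who wants to verify the difference locally, here is a small sketch (the jar path is illustrative) that prints the Bundle-SymbolicName from a jar's manifest:

import java.util.jar.JarFile;

public class BsnCheck {
    public static void main(String[] args) throws Exception {
        // Pass the path to a commons-lang3 jar, e.g. commons-lang3-3.8.jar.
        try (JarFile jar = new JarFile(args[0])) {
            String bsn = jar.getManifest().getMainAttributes()
                    .getValue("Bundle-SymbolicName");
            // 3.7 prints org.apache.commons.lang3;
            // 3.8 prints org.apache.commons.commons-lang3.
            System.out.println(bsn);
        }
    }
}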

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org