Re: Apache Commons Lang (Commons Lang3) Compatibility

2022-07-27 Thread Gary Gregory
Hello Pranav Kumar,

I've not tested that very old version of Commons Lang with modern JDKs. The
git master build runs on Java 8, 11, and 17, so you should expect the
current version, 3.12.0, to run OK on those Java LTS versions.

It is always best to rely on your own builds and tests to validate whatever
stack you rely on, since there are many combinations of OSs, Java
vendors, and jar dependencies that can make up an app.

Gary

There are many Java vendors these days, so we rely on GitHub for testing
builds.
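For reference, picking up the then-current release is a one-line dependency change; this is a sketch assuming a Maven build (version current as of this 2022 message):

```xml
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.12.0</version>
</dependency>
```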

On Wed, Jul 27, 2022, 22:56, Pranav Kumar (EXT) wrote:

> Hi Team,
>
> Could you please confirm whether Apache Commons Lang (Commons Lang3) version
> 3.1 is compatible with OpenJDK 11 & 17? If not, which version is
> compatible with OpenJDK 11 & 17?
>
> Regards,
> Pranav Kumar
>


Apache Commons Lang (Commons Lang3) Compatibility

2022-07-27 Thread Pranav Kumar (EXT)
Hi Team,

Could you please confirm whether Apache Commons Lang (Commons Lang3) version 3.1 is
compatible with OpenJDK 11 & 17? If not, which version is compatible with
OpenJDK 11 & 17?

Regards,
Pranav Kumar


Re: [commons-lang3] potential bug in CharSequenceUtils?

2020-04-29 Thread Xeno Amess
Yes, it really is a bug.
I created a fix PR (with tests) at
https://github.com/apache/commons-lang/pull/529
Please check it when you have time.


Xeno Amess wrote on Wednesday, April 29, 2020 at 5:04 AM:

> While looking at StringUtils, I found something like this:
>
> final char c1 = cs.charAt(index1++);
> final char c2 = substring.charAt(index2++);
>
> if (c1 == c2) {
> continue;
> }
>
> if (!ignoreCase) {
> return false;
> }
>
> // The same check as in String.regionMatches():
> if (Character.toUpperCase(c1) != Character.toUpperCase(c2)
> && Character.toLowerCase(c1) != Character.toLowerCase(c2)) {
> return false;
> }
>
> But it is actually not quite the same as String.regionMatches.
> The corresponding code in String.regionMatches in JDK 8 is:
>
> char c1 = ta[to++];
> char c2 = pa[po++];
> if (c1 == c2) {
> continue;
> }
> if (ignoreCase) {
> // If characters don't match but case may be ignored,
> // try converting both characters to uppercase.
> // If the results match, then the comparison scan should
> // continue.
> char u1 = Character.toUpperCase(c1);
> char u2 = Character.toUpperCase(c2);
> if (u1 == u2) {
> continue;
> }
> // Unfortunately, conversion to uppercase does not work properly
> // for the Georgian alphabet, which has strange rules about case
> // conversion.  So we need to make one last check before
> // exiting.
> if (Character.toLowerCase(u1) == Character.toLowerCase(u2)) {
> continue;
> }
> }
>
> Note that the characters passed to Character.toLowerCase there are u1 and u2,
> whereas by the logic in CharSequenceUtils they would be c1 and c2.
> If the two were functionally equal, why would the JDK authors create the two
> variables u1 and u2? That would be a waste of effort.
> So I think it might be a bug.
> I know nothing about Georgian myself.
> Is anybody familiar with the Georgian alphabet willing to debug this
> further?
>
>
>


[commons-lang3] potential bug in CharSequenceUtils?

2020-04-28 Thread Xeno Amess
While looking at StringUtils, I found something like this:

final char c1 = cs.charAt(index1++);
final char c2 = substring.charAt(index2++);

if (c1 == c2) {
    continue;
}

if (!ignoreCase) {
    return false;
}

// The same check as in String.regionMatches():
if (Character.toUpperCase(c1) != Character.toUpperCase(c2)
        && Character.toLowerCase(c1) != Character.toLowerCase(c2)) {
    return false;
}

But it is actually not quite the same as String.regionMatches.
The corresponding code in String.regionMatches in JDK 8 is:

char c1 = ta[to++];
char c2 = pa[po++];
if (c1 == c2) {
    continue;
}
if (ignoreCase) {
    // If characters don't match but case may be ignored,
    // try converting both characters to uppercase.
    // If the results match, then the comparison scan should
    // continue.
    char u1 = Character.toUpperCase(c1);
    char u2 = Character.toUpperCase(c2);
    if (u1 == u2) {
        continue;
    }
    // Unfortunately, conversion to uppercase does not work properly
    // for the Georgian alphabet, which has strange rules about case
    // conversion.  So we need to make one last check before
    // exiting.
    if (Character.toLowerCase(u1) == Character.toLowerCase(u2)) {
        continue;
    }
}

Note that the characters passed to Character.toLowerCase there are u1 and u2,
whereas by the logic in CharSequenceUtils they would be c1 and c2.
If the two were functionally equal, why would the JDK authors create the two
variables u1 and u2? That would be a waste of effort.
So I think it might be a bug.
I know nothing about Georgian myself.
Is anybody familiar with the Georgian alphabet willing to debug this
further?
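For readers following along, here is a minimal side-by-side sketch of the two per-character predicates being compared (the method names jdkStyle and lang3Style are mine, not from either codebase); whether they ever disagree for some character pair is exactly the question raised here:

```java
public class RegionMatchesCheck {

    // Mirrors the JDK 8 String.regionMatches(ignoreCase = true) per-char logic.
    static boolean jdkStyle(char c1, char c2) {
        if (c1 == c2) {
            return true;
        }
        char u1 = Character.toUpperCase(c1);
        char u2 = Character.toUpperCase(c2);
        if (u1 == u2) {
            return true;
        }
        // The extra Georgian-alphabet check: lower-case the UPPER-cased chars.
        return Character.toLowerCase(u1) == Character.toLowerCase(u2);
    }

    // Mirrors the CharSequenceUtils logic quoted above: it lower-cases the
    // ORIGINAL chars instead of the upper-cased ones.
    static boolean lang3Style(char c1, char c2) {
        if (c1 == c2) {
            return true;
        }
        return Character.toUpperCase(c1) == Character.toUpperCase(c2)
                || Character.toLowerCase(c1) == Character.toLowerCase(c2);
    }

    public static void main(String[] args) {
        // Both agree on plain ASCII and on Greek sigma ('σ' U+03C3 and
        // final 'ς' U+03C2 both upper-case to 'Σ').
        System.out.println(jdkStyle('a', 'A'));             // true
        System.out.println(lang3Style('a', 'A'));           // true
        System.out.println(jdkStyle('\u03C3', '\u03C2'));   // true
        System.out.println(lang3Style('\u03C3', '\u03C2')); // true

        // Only a char whose lower-case differs from the lower-case of its
        // upper-case could ever make the two styles disagree; count them.
        int suspects = 0;
        for (char c = 0; c < Character.MAX_VALUE; c++) {
            if (Character.toLowerCase(c)
                    != Character.toLowerCase(Character.toUpperCase(c))) {
                suspects++;
            }
        }
        System.out.println("suspect chars: " + suspects);
    }
}
```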


[lang3]

2020-02-14 Thread Garry Shamis
Hi,

I have a question about StopWatch.

I want to do interleaved timing with a single StopWatch instance.
Something like this:
{
  code-to-time-group-1
}
{
  code-to-time-group-2
}
{
  code-to-time-group-1
}
This could be done with two StopWatch instances, with suspend/resume on each
instance, at a higher performance cost.
I would like to do it with a single instance, and with either of these
changes I could:
- resume() could return the elapsed suspend time
- new accessor methods to retrieve startNanoTime

Is there something I am missing?
Can this be done already?
Seems like a simple change.
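In the meantime, the interleaved timing above can be sketched without StopWatch by accumulating System.nanoTime() deltas per named group; the GroupTimer class and its method names below are my own invention, not a Commons Lang API:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch: accumulates elapsed nanoseconds per named group, so that
// group-1 / group-2 / group-1 sections can be timed interleaved.
public class GroupTimer {
    private final Map<String, Long> totals = new HashMap<>();
    private String current;
    private long startedAt;

    public void start(String group) {
        current = group;
        startedAt = System.nanoTime();
    }

    public void stop() {
        // Add this section's elapsed time to the current group's total.
        totals.merge(current, System.nanoTime() - startedAt, Long::sum);
        current = null;
    }

    public long elapsedNanos(String group) {
        return totals.getOrDefault(group, 0L);
    }

    public static void main(String[] args) {
        GroupTimer t = new GroupTimer();
        t.start("group-1"); /* code-to-time-group-1 */ t.stop();
        t.start("group-2"); /* code-to-time-group-2 */ t.stop();
        t.start("group-1"); /* code-to-time-group-1 */ t.stop();
        System.out.println("group-1: " + t.elapsedNanos("group-1") + " ns");
        System.out.println("group-2: " + t.elapsedNanos("group-2") + " ns");
    }
}
```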

Thanks


[lang3] StringUtils does not handle supplementary characters correctly

2019-08-06 Thread Jason Pickens
Hi,

I was just wondering whether StringUtils should be handling Unicode
supplementary characters correctly?

For example, org.apache.commons.lang3.StringUtils#isAlphanumeric will return
false for code point 65536 (U+10000), which is actually a letter. This is because it
uses java.lang.CharSequence#charAt rather
than java.lang.CharSequence#codePoints. The former returns only the
high-surrogate code unit when the code point is a supplementary code point.
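A minimal plain-JDK demonstration of the difference (this is not the Commons Lang implementation, just the two CharSequence views it could be built on):

```java
public class SupplementaryCheck {
    public static void main(String[] args) {
        // One supplementary code point, U+10000 (a letter), encoded as a
        // surrogate pair of two chars.
        String s = new String(Character.toChars(0x10000));

        // char-based view: a lone surrogate code unit is not alphanumeric.
        boolean charBased = true;
        for (int i = 0; i < s.length(); i++) {
            charBased &= Character.isLetterOrDigit(s.charAt(i));
        }

        // code-point-based view: the code point itself IS a letter.
        boolean codePointBased =
                s.codePoints().allMatch(Character::isLetterOrDigit);

        System.out.println(charBased);      // false
        System.out.println(codePointBased); // true
    }
}
```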


Cheers,

Jason


Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread big data
The exception happens in the Commons Lang package at

final T obj = (T) in.readObject();

T is the Block class, and when it tries to cast the object to T (Block), it seems
that it cannot find the Block class in the JVM, so the ClassNotFoundException happens.


Then I copied the Lang source code and changed T to Block directly, and the
program runs OK again, as below:

public static Block deserializeFrom(byte[] bytes) {
    // Block b = SerializationUtils.deserialize(bytes);
    ByteArrayInputStream inputStream = new ByteArrayInputStream(bytes);
    try (ObjectInputStream in = new ObjectInputStream(inputStream)) {
        return (Block) in.readObject();
    } catch (final ClassNotFoundException | IOException ex) {
        System.out.println("ClassNotFoundException | IOException");
        ex.printStackTrace();
    } catch (ClassCastException e) {
        System.out.println("ClassCastException");
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        System.out.println("IllegalArgumentException");
        e.printStackTrace();
    } catch (SerializationException e) {
        System.out.println("SerializationException");
        e.printStackTrace();
    }
    return null;
}
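For what it's worth, a common cause of this pattern (a plain ObjectInputStream works, while a library helper fails under Spark/YARN) is that ObjectInputStream resolves classes against the classloader of the code that created it, not the thread context classloader that Spark populates with application jars. A frequently used workaround is to override resolveClass; the sketch below uses a hypothetical Payload class standing in for Block:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.ObjectStreamClass;
import java.io.Serializable;
import java.io.UncheckedIOException;

public class ContextLoaderDeserializer {

    // Hypothetical stand-in for the Block class from this thread.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        final int value;
        Payload(int value) { this.value = value; }
    }

    static byte[] serialize(Serializable obj) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(obj);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bos.toByteArray();
    }

    // Deserializes with an ObjectInputStream that resolves classes against the
    // thread context classloader first, falling back to default resolution.
    static Object deserialize(byte[] bytes) {
        try (ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bytes)) {
            @Override
            protected Class<?> resolveClass(ObjectStreamClass desc)
                    throws IOException, ClassNotFoundException {
                try {
                    return Class.forName(desc.getName(), false,
                            Thread.currentThread().getContextClassLoader());
                } catch (ClassNotFoundException e) {
                    return super.resolveClass(desc);
                }
            }
        }) {
            return in.readObject();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        Payload p = (Payload) deserialize(serialize(new Payload(42)));
        System.out.println("value = " + p.value); // prints "value = 42"
    }
}
```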


On 2019/6/28 at 10:08 AM, Tomo Suzuki wrote:

Glad to hear you made progress. Good luck!

(Another possibility: you might have changed the package or class name
since you saved the HDFS file.)

On Thu, Jun 27, 2019 at 21:25, big data <bigdatab...@outlook.com> wrote:



Thanks. I've tried it; the new Block() before it is OK.

I've solved it and posted another issue describing this progress. For the
details, see the other email: Java Generic T makes ClassNotFoundException

On 2019/6/27 at 8:41 PM, Tomo Suzuki wrote:

My suggestion after reading ClassNotFoundException is to try to instantiate
the class just before deserializing it:

public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block class and its related classes are available
    System.out.println("dummy = " + new Block());
    // Does this match what you expect?
    System.out.println("byte length = " + bytes.length);
    try {
        Block b = SerializationUtils.deserialize(bytes);
        ...


Looking forward to hearing the result.



On Wed, Jun 26, 2019 at 11:03 PM, big data <bigdatab...@outlook.com> wrote:





The XXX class is named Block; below is part of its code.

The deserialization code is like this:

public static Block deserializeFrom(byte[] bytes) {
    try {
        Block b = SerializationUtils.deserialize(bytes);
        System.out.println("b=" + b);
        return b;
    } catch (ClassCastException e) {
        System.out.println("ClassCastException");
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        System.out.println("IllegalArgumentException");
        e.printStackTrace();
    } catch (SerializationException e) {
        System.out.println("SerializationException");
        e.printStackTrace();
    }
    return null;
}


The Spark code is:

val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
val RDD = fis.map(x => {
  val content = x._2.toArray()
  val b = Block.deserializeFrom(content)
  ...
}


All the code above runs successfully in Spark local mode, but when run
in YARN cluster mode, the error happens.

On 2019/6/27 at 9:49 AM, Tomo Suzuki wrote:

I'm afraid I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you can create a minimal example project
that can reproduce the same issue.

Regards,
Tomo

On Wed, Jun 26, 2019 at 9:20 PM, big data <bigdatab...@outlook.com> wrote:



Hi,

Actually, the class com.XXX.XXX is called normally in the preceding Spark
code, and this exception happens in one static method of this
class.

So the jar dependency problem can be excluded.

On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:


Hi Big data,

I don't use SerializationUtils, but if I interpret the error message:

   ClassNotFoundException: com..

, this says com.. is not available in the class path of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala *without* SerializationUtils.

Regards,
Tomo



On Wed, Jun 26, 2019 at 4:12 AM, big data <bigdatab...@outlook.com> wrote:





I use Apache Commons Lang3's SerializationUtils in the code.

SerializationUtils.serialize()

to store a customized class to disk as files, and

SerializationUtils.des

Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread Tomo Suzuki
Glad to hear you made progress. Good luck!

(Another possibility: you might have changed the package or class name
since you saved the HDFS file.)

On Thu, Jun 27, 2019 at 21:25 big data  wrote:

> Thanks. I've tried it; the new Block() before it is OK.
>
> I've solved it and posted another issue describing this progress. For the
> details, see the other email: Java Generic T makes ClassNotFoundException
>
> On 2019/6/27 at 8:41 PM, Tomo Suzuki wrote:
>
> My suggestion after reading ClassNotFoundException is to try to instantiate
> the class just before deserializing it:
>
> public static Block deserializeFrom(byte[] bytes) {
>     // Dummy instantiation to ensure Block class and its related classes are available
>     System.out.println("dummy = " + new Block());
>     // Does this match what you expect?
>     System.out.println("byte length = " + bytes.length);
>     try {
>         Block b = SerializationUtils.deserialize(bytes);
>         ...
>
>
> Looking forward to hearing the result.
>
>
>
> On Wed, Jun 26, 2019 at 11:03 PM, big data <bigdatab...@outlook.com> wrote:
>
>
>
> The XXX class is named Block; below is part of its code.
>
> The deserialization code is like this:
>
> public static Block deserializeFrom(byte[] bytes) {
>     try {
>         Block b = SerializationUtils.deserialize(bytes);
>         System.out.println("b=" + b);
>         return b;
>     } catch (ClassCastException e) {
>         System.out.println("ClassCastException");
>         e.printStackTrace();
>     } catch (IllegalArgumentException e) {
>         System.out.println("IllegalArgumentException");
>         e.printStackTrace();
>     } catch (SerializationException e) {
>         System.out.println("SerializationException");
>         e.printStackTrace();
>     }
>     return null;
> }
>
>
> The Spark code is:
>
> val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
> val RDD = fis.map(x => {
>   val content = x._2.toArray()
>   val b = Block.deserializeFrom(content)
>   ...
> }
>
>
> All the code above runs successfully in Spark local mode, but when run
> in YARN cluster mode, the error happens.
>
> On 2019/6/27 at 9:49 AM, Tomo Suzuki wrote:
>
> I'm afraid I don't have enough information to troubleshoot the problem in
> com.XXX.XXX. It would be great if you can create a minimal example project
> that can reproduce the same issue.
>
> Regards,
> Tomo
>
> On Wed, Jun 26, 2019 at 9:20 PM, big data <bigdatab...@outlook.com> wrote:
>
>
>
> Hi,
>
> Actually, the class com.XXX.XXX is called normally in the preceding Spark
> code, and this exception happens in one static method of this
> class.
>
> So the jar dependency problem can be excluded.
>
> On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:
>
>
> Hi Big data,
>
> I don't use SerializationUtils, but if I interpret the error message:
>
>ClassNotFoundException: com..
>
> , this says com.. is not available in the class path of the JVM (which
> your Spark is running on). I would verify that you can instantiate
> com.. in Spark/Scala *without* SerializationUtils.
>
> Regards,
> Tomo
>
>
>
> On Wed, Jun 26, 2019 at 4:12 AM, big data <bigdatab...@outlook.com> wrote:
>
>
>
>
>
> I use Apache Commons Lang3's SerializationUtils in the code.
>
> SerializationUtils.serialize()
>
> to store a customized class as files into disk and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (Mac OS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
> SerializationException happens.
>
> The Apache Commons Lang3 version is:
>
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
>
>
> the stack trace is as below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
>     at com.com...deserializeFrom(XXX.java:81)
>     at com.XXX.$$anonfun$3.apply(B.scala

Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread big data
Thanks. I've tried it; the new Block() before it is OK.

I've solved it and posted another issue describing this progress. For the
details, see the other email: Java Generic T makes ClassNotFoundException

On 2019/6/27 at 8:41 PM, Tomo Suzuki wrote:

My suggestion after reading ClassNotFoundException is to try to instantiate
the class just before deserializing it:

public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block class and its related classes are available
    System.out.println("dummy = " + new Block());
    // Does this match what you expect?
    System.out.println("byte length = " + bytes.length);
    try {
        Block b = SerializationUtils.deserialize(bytes);
        ...


Looking forward to hearing the result.



On Wed, Jun 26, 2019 at 11:03 PM, big data <bigdatab...@outlook.com> wrote:



The XXX class is named Block; below is part of its code.

The deserialization code is like this:

public static Block deserializeFrom(byte[] bytes) {
    try {
        Block b = SerializationUtils.deserialize(bytes);
        System.out.println("b=" + b);
        return b;
    } catch (ClassCastException e) {
        System.out.println("ClassCastException");
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        System.out.println("IllegalArgumentException");
        e.printStackTrace();
    } catch (SerializationException e) {
        System.out.println("SerializationException");
        e.printStackTrace();
    }
    return null;
}


The Spark code is:

val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
val RDD = fis.map(x => {
  val content = x._2.toArray()
  val b = Block.deserializeFrom(content)
  ...
}


All the code above runs successfully in Spark local mode, but when run
in YARN cluster mode, the error happens.

On 2019/6/27 at 9:49 AM, Tomo Suzuki wrote:

I'm afraid I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you can create a minimal example project
that can reproduce the same issue.

Regards,
Tomo

On Wed, Jun 26, 2019 at 9:20 PM, big data <bigdatab...@outlook.com> wrote:



Hi,

Actually, the class com.XXX.XXX is called normally in the preceding Spark
code, and this exception happens in one static method of this
class.

So the jar dependency problem can be excluded.

On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:


Hi Big data,

I don't use SerializationUtils, but if I interpret the error message:

   ClassNotFoundException: com..

, this says com.. is not available in the class path of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala *without* SerializationUtils.

Regards,
Tomo



On Wed, Jun 26, 2019 at 4:12 AM, big data <bigdatab...@outlook.com> wrote:





I use Apache Commons Lang3's SerializationUtils in the code.

SerializationUtils.serialize()

to store a customized class as files into disk and

SerializationUtils.deserialize(byte[])

to restore them again.

In the local environment (Mac OS), all serialized files can be
deserialized normally and no error happens. But when I copy these
serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
SerializationException happens.

The Apache Commons Lang3 version is:

<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.9</version>
</dependency>


the stack trace is as below:

org.apache.commons.lang3.SerializationException:
java.lang.ClassNotFoundException: com..
    at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
    at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
    at com.com...deserializeFrom(XXX.java:81)
    at com.XXX.$$anonfun$3.apply(B.scala:157)
    at com.XXX.$$anonfun$3.apply(B.scala:153)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
    at scala.collection.AbstractIterator.to(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
    at org.apache.spark.rdd.RDD$$anonfun$collec

Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread Tomo Suzuki
My suggestion after reading ClassNotFoundException is to try to instantiate
the class just before deserializing it:

public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block class and its related classes are available
    System.out.println("dummy = " + new Block());
    // Does this match what you expect?
    System.out.println("byte length = " + bytes.length);
    try {
        Block b = SerializationUtils.deserialize(bytes);
        ...


Looking forward to hearing the result.



On Wed, Jun 26, 2019 at 11:03 PM big data  wrote:

> The XXX class is named Block; below is part of its code.
>
> The deserialization code is like this:
>
> public static Block deserializeFrom(byte[] bytes) {
>     try {
>         Block b = SerializationUtils.deserialize(bytes);
>         System.out.println("b=" + b);
>         return b;
>     } catch (ClassCastException e) {
>         System.out.println("ClassCastException");
>         e.printStackTrace();
>     } catch (IllegalArgumentException e) {
>         System.out.println("IllegalArgumentException");
>         e.printStackTrace();
>     } catch (SerializationException e) {
>         System.out.println("SerializationException");
>         e.printStackTrace();
>     }
>     return null;
> }
>
>
> The Spark code is:
>
> val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
> val RDD = fis.map(x => {
>   val content = x._2.toArray()
>   val b = Block.deserializeFrom(content)
>   ...
> }
>
>
> All the code above runs successfully in Spark local mode, but when run
> in YARN cluster mode, the error happens.
>
> On 2019/6/27 at 9:49 AM, Tomo Suzuki wrote:
>
> I'm afraid I don't have enough information to troubleshoot the problem in
> com.XXX.XXX. It would be great if you can create a minimal example project
> that can reproduce the same issue.
>
> Regards,
> Tomo
>
> On Wed, Jun 26, 2019 at 9:20 PM, big data <bigdatab...@outlook.com> wrote:
>
>
>
> Hi,
>
> Actually, the class com.XXX.XXX is called normally in the preceding Spark
> code, and this exception happens in one static method of this
> class.
>
> So the jar dependency problem can be excluded.
>
> On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:
>
>
> Hi Big data,
>
> I don't use SerializationUtils, but if I interpret the error message:
>
>ClassNotFoundException: com..
>
> , this says com.. is not available in the class path of the JVM (which
> your Spark is running on). I would verify that you can instantiate
> com.. in Spark/Scala *without* SerializationUtils.
>
> Regards,
> Tomo
>
>
>
> On Wed, Jun 26, 2019 at 4:12 AM, big data <bigdatab...@outlook.com> wrote:
>
>
>
>
>
> I use Apache Commons Lang3's SerializationUtils in the code.
>
> SerializationUtils.serialize()
>
> to store a customized class as files into disk and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (Mac OS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
> SerializationException happens.
>
> The Apache Commons Lang3 version is:
>
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
>
>
> the stack trace is as below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
>     at com.com...deserializeFrom(XXX.java:81)
>     at com.XXX.$$anonfun$3.apply(B.scala:157)
>     at com.XXX.$$anonfun$3.apply(B.scala:153)
>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
>     at scala.collection.AbstractIterator.to(I

Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread big data
The XXX class is named Block; below is part of its code.

The deserialization code is like this:

public static Block deserializeFrom(byte[] bytes) {
    try {
        Block b = SerializationUtils.deserialize(bytes);
        System.out.println("b=" + b);
        return b;
    } catch (ClassCastException e) {
        System.out.println("ClassCastException");
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        System.out.println("IllegalArgumentException");
        e.printStackTrace();
    } catch (SerializationException e) {
        System.out.println("SerializationException");
        e.printStackTrace();
    }
    return null;
}


The Spark code is:

val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
val RDD = fis.map(x => {
  val content = x._2.toArray()
  val b = Block.deserializeFrom(content)
  ...
}


All the code above runs successfully in Spark local mode, but when run in
YARN cluster mode, the error happens.

On 2019/6/27 at 9:49 AM, Tomo Suzuki wrote:

I'm afraid I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you can create a minimal example project
that can reproduce the same issue.

Regards,
Tomo

On Wed, Jun 26, 2019 at 9:20 PM, big data <bigdatab...@outlook.com> wrote:



Hi,

Actually, the class com.XXX.XXX is called normally in the preceding Spark
code, and this exception happens in one static method of this
class.

So the jar dependency problem can be excluded.

On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:


Hi Big data,

I don't use SerializationUtils, but if I interpret the error message:

   ClassNotFoundException: com..

, this says com.. is not available in the class path of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala *without* SerializationUtils.

Regards,
Tomo



On Wed, Jun 26, 2019 at 4:12 AM, big data <bigdatab...@outlook.com> wrote:





I use Apache Commons Lang3's SerializationUtils in the code.

SerializationUtils.serialize()

to store a customized class as files into disk and

SerializationUtils.deserialize(byte[])

to restore them again.

In the local environment (Mac OS), all serialized files can be
deserialized normally and no error happens. But when I copy these
serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
SerializationException happens.

The Apache Commons Lang3 version is:

<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.9</version>
</dependency>


the stack trace is as below:

org.apache.commons.lang3.SerializationException:
java.lang.ClassNotFoundException: com..
    at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
    at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
    at com.com...deserializeFrom(XXX.java:81)
    at com.XXX.$$anonfun$3.apply(B.scala:157)
    at com.XXX.$$anonfun$3.apply(B.scala:153)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
    at scala.collection.AbstractIterator.to(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com..
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLo

Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread Tomo Suzuki
I'm afraid I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you can create a minimal example project
that can reproduce the same issue.

Regards,
Tomo

On Wed, Jun 26, 2019 at 9:20 PM big data  wrote:

> Hi,
>
> Actually, the class com.XXX.XXX is called normally in the preceding Spark
> code, and this exception happens in one static method of this
> class.
>
> So the jar dependency problem can be excluded.
>
> On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:
> > Hi Big data,
> >
> > I don't use SerializationUtils, but if I interpret the error message:
> >
> >ClassNotFoundException: com..
> >
> > , this says com.. is not available in the class path of the JVM (which
> > your Spark is running on). I would verify that you can instantiate
> > com.. in Spark/Scala *without* SerializationUtils.
> >
> > Regards,
> > Tomo
> >
> >
> >
> > On Wed, Jun 26, 2019 at 4:12 AM, big data wrote:
> >
> >> I use Apache Commons Lang3's SerializationUtils in the code.
> >>
> >> SerializationUtils.serialize()
> >>
> >> to store a customized class as files into disk and
> >>
> >> SerializationUtils.deserialize(byte[])
> >>
> >> to restore them again.
> >>
> >> In the local environment (Mac OS), all serialized files can be
> >> deserialized normally and no error happens. But when I copy these
> >> serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
> >> SerializationException happens.
> >>
> >> The Apache Commons Lang3 version is:
> >>
> >> <dependency>
> >>     <groupId>org.apache.commons</groupId>
> >>     <artifactId>commons-lang3</artifactId>
> >>     <version>3.9</version>
> >> </dependency>
> >>
> >>
> >> the stack trace is as below:
> >>
> >> org.apache.commons.lang3.SerializationException:
> >> java.lang.ClassNotFoundException: com..
> >>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
> >>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
> >>     at com.com...deserializeFrom(XXX.java:81)
> >>     at com.XXX.$$anonfun$3.apply(B.scala:157)
> >>     at com.XXX.$$anonfun$3.apply(B.scala:153)
> >>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
> >>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> >>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> >>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> >>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
> >>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
> >>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
> >>     at scala.collection.AbstractIterator.to(Iterator.scala:1336)
> >>     at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
> >>     at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
> >>     at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
> >>     at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
> >>     at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> >>     at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> >>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> >>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> >>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> >>     at org.apache.spark.scheduler.Task.run(Task.scala:109)
> >>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> >>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >>     at java.lang.Thread.run(Thread.java:748)
> >> Caused by: java.lang.ClassNotFoundException: com..
> >>     at java.net.URLClassLoader.findClass

Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread big data
Hi,

Actually, the class com.XXX.XXX is called normally earlier in the Spark 
code, and this exception happens in a static method of that class.

So a jar dependency problem can be ruled out.

On 2019/6/26 at 10:23 PM, Tomo Suzuki wrote:
> Hi Big data,
>
> I don't use SerializationUtils, but if I interpret the error message:
>
>ClassNotFoundException: com..
>
> , this says com.. is not available in the class path of JVM (which
> your Spark is running on). I would verify that you can instantiate
> com.. in Spark/Scala *without* SerializationUtils.
>
> Regards,
> Tomo
>
>
>
> On Wed, Jun 26, 2019 at 4:12 AM big data  wrote:
>
>> I use Apache Commons Lang3's SerializationUtils in the code.
>>
>> SerializationUtils.serialize()
>>
>> to store a customized class as files into disk and
>>
>> SerializationUtils.deserialize(byte[])
>>
>> to restore them again.
>>
>> In the local environment (Mac OS), all serialized files can be
>> deserialized normally and no error happens. But when I copy these
>> serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
>> SerializeException happens.
>>
>> The Apache Commons Lang3 version is:
>>
>>  
> >>  <dependency>
> >>      <groupId>org.apache.commons</groupId>
> >>      <artifactId>commons-lang3</artifactId>
> >>      <version>3.9</version>
> >>  </dependency>
>>  
>>
>>
>> the stack error as below:
>>
>> org.apache.commons.lang3.SerializationException:
>> java.lang.ClassNotFoundException: com..
>>  at
>> org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
>>  at
>> org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
>>  at com.com...deserializeFrom(XXX.java:81)
>>  at com.XXX.$$anonfun$3.apply(B.scala:157)
>>  at com.XXX.$$anonfun$3.apply(B.scala:153)
>>  at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>>  at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>>  at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>>  at
>> scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>>  at
>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>>  at
>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>>  at scala.collection.TraversableOnce$class.to
>> (TraversableOnce.scala:310)
>>  at scala.collection.AbstractIterator.to(Iterator.scala:1336)
>>  at
>> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
>>  at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
>>  at
>> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
>>  at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
>>  at
>> org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
>>  at
>> org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
>>  at
>> org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
>>  at
>> org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
>>  at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>>  at org.apache.spark.scheduler.Task.run(Task.scala:109)
>>  at
>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
>>  at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>  at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>  at java.lang.Thread.run(Thread.java:748)
>> Caused by: java.lang.ClassNotFoundException: com..
>>  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>>  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>  at java.lang.Class.forName0(Native Method)
>>  at java.lang.Class.forName(Class.java:348)
>>  at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:686)
>>  at
>> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
>>  at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
>>  at
>> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
>>  at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>>  at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
>>  at
>> org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:223)
>>
>> I've check the loaded byte[]'s length, both from local and from HDFS are
>> same. But why it can not be deserialized from HDFS?
>>
>


Re: [lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread Tomo Suzuki
Hi Big data,

I don't use SerializationUtils, but if I interpret the error message:

  ClassNotFoundException: com..

this says com.. is not available on the class path of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala *without* SerializationUtils.

Regards,
Tomo



On Wed, Jun 26, 2019 at 4:12 AM big data  wrote:

> I use Apache Commons Lang3's SerializationUtils in the code.
>
> SerializationUtils.serialize()
>
> to store a customized class as files into disk and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (Mac OS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS, and read them from HDFS by using Spark/Scala, a
> SerializeException happens.
>
> The Apache Commons Lang3 version is:
>
> 
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
> 
>
>
> the stack error as below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
> at
> org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
> at
> org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
> at com.com...deserializeFrom(XXX.java:81)
> at com.XXX.$$anonfun$3.apply(B.scala:157)
> at com.XXX.$$anonfun$3.apply(B.scala:153)
> at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
> at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> at
> scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
> at scala.collection.TraversableOnce$class.to
> (TraversableOnce.scala:310)
> at scala.collection.AbstractIterator.to(Iterator.scala:1336)
> at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
> at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
> at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
> at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
> at
> org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> at
> org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> at
> org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> at
> org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> at org.apache.spark.scheduler.Task.run(Task.scala:109)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: com..
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:686)
> at
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
> at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
> at
> org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:223)
>
> I've check the loaded byte[]'s length, both from local and from HDFS are
> same. But why it can not be deserialized from HDFS?
>


-- 
Regards,
Tomo
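Tomo's suggestion can be turned into a tiny standalone check. The sketch below is a hedged illustration, not part of the thread: the class name is a placeholder for the redacted "com.." in the stack trace, and the lookup approximates (but is not identical to) the class-loader resolution ObjectInputStream performs.

```java
// Hedged sketch: confirm the class that deserialization needs is actually
// visible to the running JVM before suspecting SerializationUtils.
// The default FQCN below is a stand-in; pass the real (redacted "com..")
// class name as the first argument when running on the Spark executor.
public class ClassPathCheck {
    public static void main(String[] args) {
        String fqcn = args.length > 0 ? args[0] : "java.util.ArrayList";
        try {
            // ObjectInputStream.resolveClass() performs a lookup much like
            // this one; if this fails, deserialization will fail too.
            Class<?> c = Class.forName(fqcn, false,
                    Thread.currentThread().getContextClassLoader());
            System.out.println("resolved: " + c.getName());
        } catch (ClassNotFoundException e) {
            System.out.println("NOT visible to this JVM: " + fqcn);
        }
    }
}
```

If this prints "NOT visible" on the executor but works locally, the problem is the executor classpath (e.g. the application jar not shipped to the cluster), not the serialized bytes.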


[lang3]java.lang.ClassNotFoundException when use Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread big data
I use Apache Commons Lang3's SerializationUtils in the code.

SerializationUtils.serialize()

to store instances of a custom class to files on disk, and

SerializationUtils.deserialize(byte[])

to restore them again.

In the local environment (Mac OS), all serialized files can be deserialized 
normally and no error happens. But when I copy these serialized files into 
HDFS and read them back from HDFS using Spark/Scala, a SerializationException 
happens.

The Apache Commons Lang3 version is:


<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.9</version>
</dependency>



The stack trace is as below:

org.apache.commons.lang3.SerializationException: 
java.lang.ClassNotFoundException: com..
at 
org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
at 
org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
at com.com...deserializeFrom(XXX.java:81)
at com.XXX.$$anonfun$3.apply(B.scala:157)
at com.XXX.$$anonfun$3.apply(B.scala:153)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at 
scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at 
org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
at 
org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
at 
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
at 
org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com..
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:686)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at 
org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:223)

I've checked the loaded byte[]'s length; the local and HDFS copies are the 
same. So why can it not be deserialized from HDFS?
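For reference, the round-trip described above can be sketched with plain JDK streams. This is a hedged approximation of what SerializationUtils.serialize/deserialize do internally (the real utility additionally null-checks and wraps checked exceptions in SerializationException); it is useful for isolating whether the bytes or the classpath are at fault.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hedged sketch of a Java serialization round-trip: if this works locally
// but resolveClass() throws ClassNotFoundException on the cluster, the
// serialized bytes are fine and the executor classpath is the problem.
public class RoundTrip {
    static byte[] serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(obj);
        }
        return baos.toByteArray();
    }

    static Object deserialize(byte[] data) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return ois.readObject(); // resolveClass() can throw ClassNotFoundException here
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = serialize("hello");
        System.out.println(deserialize(bytes)); // prints "hello"
    }
}
```

Matching byte lengths between the local and HDFS copies, as reported above, point in the same direction: the payload is intact and the receiving JVM simply cannot load the class.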


Re: [lang3]

2019-04-16 Thread Scott Palmer

> On Apr 15, 2019, at 6:55 PM, Rob Tompkins  wrote:
> 
> 
> 
>> On Apr 15, 2019, at 3:08 PM, Gary Gregory  wrote:
>> 
>> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita
>>  wrote:
>> 
>>> I think that should be fine. I think something similar already happened
>>> in the past, but can't recall which component.
>>> 
>>>On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <
>>> chtom...@gmail.com> wrote:
>>> 
>>> 
>>> 
 On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <
>>> brunodepau...@yahoo.com.br.invalid> wrote:
 
 
 Hi Scott,
 I believe it was a mistake. Had a look at 3.8 and we had published it
>>> before.
 Just had a look at the vote thread, and it appears the javadocs jar was
>>> not included in the process. Possibly something with our pom.xml and
>>> plugins set up.
 
 @Rob, @Gary, is it possible to upload just the jar to an existing
>>> release?
>>> 
>>> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
>>> artifacts in nexus. Thoughts?
>>> 
>> 
>> Since we approved the sources tagged and we are not changing those, I'd say
>> we are OK to push out the javadoc files.
> 
> This should be fixed now. It may take a little while for maven central to 
> pick up the changes. @Scott - many thanks for the catch there!
> 
> Cheers,
> -Rob

No problem. I appreciate the quick fix.

Scott

Re: [lang3]

2019-04-15 Thread Rob Tompkins



> On Apr 15, 2019, at 5:23 PM, Bruno P. Kinoshita 
>  wrote:
> 
> Great! 
> 
> Rob, just in case I ever do the same. Could you share what steps you had to 
> do in order to upload the javadocs, please?
> Thanks for the super quick fix!

Yeah, I’m going to figure out how to really get it fixed. I don’t much want 
that to happen again. 

> Bruno
> 
>On Tuesday, 16 April 2019, 10:56:22 am NZST, Rob Tompkins 
>  wrote:  
> 
> 
> 
>> On Apr 15, 2019, at 3:08 PM, Gary Gregory  wrote:
>> 
>> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita
>> > <mailto:brunodepau...@yahoo.com.br.invalid>> wrote:
>> 
>>> I think that should be fine. I think something similar already happened
>>> in the past, but can't recall which component.
>>> 
>>> On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <
>>> chtom...@gmail.com> wrote:
>>> 
>>> 
>>> 
>>>> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <
>>> brunodepau...@yahoo.com.br.invalid> wrote:
>>>> 
>>>> 
>>>> Hi Scott,
>>>> I believe it was a mistake. Had a look at 3.8 and we had published it
>>> before.
>>>> Just had a look at the vote thread, and it appears the javadocs jar was
>>> not included in the process. Possibly something with our pom.xml and
>>> plugins set up.
>>>> 
>>>> @Rob, @Gary, is it possible to upload just the jar to an existing
>>> release?
>>> 
>>> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
>>> artifacts in nexus. Thoughts?
>>> 
>> 
>> Since we approved the sources tagged and we are not changing those, I'd say
>> we are OK to push out the javadoc files.
> 
> This should be fixed now. It may take a little while for maven central to 
> pick up the changes. @Scott - many thanks for the catch there!
> 
> Cheers,
> -Rob
> 
>> 
>> Gary
>> 
>> 
>>> 
>>> -Rob
>>> 
>>>> CheersBruno
>>>> 
>>>>   On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer <
>>> swpal...@gmail.com> wrote:
>>>> 
>>>> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
>>>> Is that intentional or a mistake?
>>>> 
>>>> Scott
>>>> (please copy me on responses as I am not subscribed to the list)
>>>> 
>>>> -
>>>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>>>> For additional commands, e-mail: user-h...@commons.apache.org
>>>> 
>>> 
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>>> For additional commands, e-mail: user-h...@commons.apache.org

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Bruno P. Kinoshita
 Great! 

Rob, just in case I ever do the same. Could you share what steps you had to do 
in order to upload the javadocs, please?
Thanks for the super quick fix!
Bruno

On Tuesday, 16 April 2019, 10:56:22 am NZST, Rob Tompkins 
 wrote:  
 
 

> On Apr 15, 2019, at 3:08 PM, Gary Gregory  wrote:
> 
> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita
>  <mailto:brunodepau...@yahoo.com.br.invalid>> wrote:
> 
>> I think that should be fine. I think something similar already happened
>> in the past, but can't recall which component.
>> 
>>    On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <
>> chtom...@gmail.com> wrote:
>> 
>> 
>> 
>>> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <
>> brunodepau...@yahoo.com.br.invalid> wrote:
>>> 
>>> 
>>> Hi Scott,
>>> I believe it was a mistake. Had a look at 3.8 and we had published it
>> before.
>>> Just had a look at the vote thread, and it appears the javadocs jar was
>> not included in the process. Possibly something with our pom.xml and
>> plugins set up.
>>> 
>>> @Rob, @Gary, is it possible to upload just the jar to an existing
>> release?
>> 
>> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
>> artifacts in nexus. Thoughts?
>> 
> 
> Since we approved the sources tagged and we are not changing those, I'd say
> we are OK to push out the javadoc files.

This should be fixed now. It may take a little while for maven central to pick 
up the changes. @Scott - many thanks for the catch there!

Cheers,
-Rob

> 
> Gary
> 
> 
>> 
>> -Rob
>> 
>>> CheersBruno
>>> 
>>>  On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer <
>> swpal...@gmail.com> wrote:
>>> 
>>> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
>>> Is that intentional or a mistake?
>>> 
>>> Scott
>>> (please copy me on responses as I am not subscribed to the list)
>>> 
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>>> For additional commands, e-mail: user-h...@commons.apache.org
>>> 
>> 
>> -
>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>> For additional commands, e-mail: user-h...@commons.apache.org
  

Re: [lang3]

2019-04-15 Thread Rob Tompkins


> On Apr 15, 2019, at 3:08 PM, Gary Gregory  wrote:
> 
> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita
>  <mailto:brunodepau...@yahoo.com.br.invalid>> wrote:
> 
>> I think that should be fine. I think something similar already happened
>> in the past, but can't recall which component.
>> 
>>On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <
>> chtom...@gmail.com> wrote:
>> 
>> 
>> 
>>> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <
>> brunodepau...@yahoo.com.br.invalid> wrote:
>>> 
>>> 
>>> Hi Scott,
>>> I believe it was a mistake. Had a look at 3.8 and we had published it
>> before.
>>> Just had a look at the vote thread, and it appears the javadocs jar was
>> not included in the process. Possibly something with our pom.xml and
>> plugins set up.
>>> 
>>> @Rob, @Gary, is it possible to upload just the jar to an existing
>> release?
>> 
>> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
>> artifacts in nexus. Thoughts?
>> 
> 
> Since we approved the sources tagged and we are not changing those, I'd say
> we are OK to push out the javadoc files.

This should be fixed now. It may take a little while for maven central to pick 
up the changes. @Scott - many thanks for the catch there!

Cheers,
-Rob

> 
> Gary
> 
> 
>> 
>> -Rob
>> 
>>> CheersBruno
>>> 
>>>   On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer <
>> swpal...@gmail.com> wrote:
>>> 
>>> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
>>> Is that intentional or a mistake?
>>> 
>>> Scott
>>> (please copy me on responses as I am not subscribed to the list)
>>> 
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>>> For additional commands, e-mail: user-h...@commons.apache.org
>>> 
>> 
>> -
>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>> For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Rob Tompkins



> On Apr 15, 2019, at 3:08 PM, Gary Gregory  wrote:
> 
> On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita
>  wrote:
> 
>> I think that should be fine. I think something similar already happened
>> in the past, but can't recall which component.
>> 
>>On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <
>> chtom...@gmail.com> wrote:
>> 
>> 
>> 
>>> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <
>> brunodepau...@yahoo.com.br.invalid> wrote:
>>> 
>>> 
>>> Hi Scott,
>>> I believe it was a mistake. Had a look at 3.8 and we had published it
>> before.
>>> Just had a look at the vote thread, and it appears the javadocs jar was
>> not included in the process. Possibly something with our pom.xml and
>> plugins set up.
>>> 
>>> @Rob, @Gary, is it possible to upload just the jar to an existing
>> release?
>> 
>> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
>> artifacts in nexus. Thoughts?
>> 
> 
> Since we approved the sources tagged and we are not changing those, I'd say
> we are OK to push out the javadoc files.

Cool. I’ll sort that out in the next hour. 

> 
> Gary
> 
> 
>> 
>> -Rob
>> 
>>> CheersBruno
>>> 
>>>   On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer <
>> swpal...@gmail.com> wrote:
>>> 
>>> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
>>> Is that intentional or a mistake?
>>> 
>>> Scott
>>> (please copy me on responses as I am not subscribed to the list)
>>> 
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>>> For additional commands, e-mail: user-h...@commons.apache.org
>>> 
>> 
>> -
>> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
>> For additional commands, e-mail: user-h...@commons.apache.org
>> 

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Gary Gregory
On Mon, Apr 15, 2019 at 6:06 PM Bruno P. Kinoshita
 wrote:

>  I think that should be fine. I think something similar already happened
> in the past, but can't recall which component.
>
> On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins <
> chtom...@gmail.com> wrote:
>
>
>
> > On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita <
> brunodepau...@yahoo.com.br.invalid> wrote:
> >
> >
> > Hi Scott,
> > I believe it was a mistake. Had a look at 3.8 and we had published it
> before.
> > Just had a look at the vote thread, and it appears the javadocs jar was
> not included in the process. Possibly something with our pom.xml and
> plugins set up.
> >
> > @Rob, @Gary, is it possible to upload just the jar to an existing
> release?
>
> Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged
> artifacts in nexus. Thoughts?
>

Since we approved the sources tagged and we are not changing those, I'd say
we are OK to push out the javadoc files.

Gary


>
> -Rob
>
> > CheersBruno
> >
> >On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer <
> swpal...@gmail.com> wrote:
> >
> > I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
> > Is that intentional or a mistake?
> >
> > Scott
> > (please copy me on responses as I am not subscribed to the list)
> >
> > -
> > To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
> > For additional commands, e-mail: user-h...@commons.apache.org
> >
>
> -
> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
> For additional commands, e-mail: user-h...@commons.apache.org
>


Re: [lang3]

2019-04-15 Thread Bruno P. Kinoshita
 I think that should be fine. I think something similar already happened in the 
past, but can't recall which component.

On Tuesday, 16 April 2019, 9:58:43 am NZST, Rob Tompkins 
 wrote:  
 
 

> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita 
>  wrote:
> 
> 
> Hi Scott,
> I believe it was a mistake. Had a look at 3.8 and we had published it before.
> Just had a look at the vote thread, and it appears the javadocs jar was not 
> included in the process. Possibly something with our pom.xml and plugins set 
> up.
> 
> @Rob, @Gary, is it possible to upload just the jar to an existing release?

Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged artifacts 
in nexus. Thoughts?

-Rob

> CheersBruno
> 
>    On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer 
> wrote:  
> 
> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
> Is that intentional or a mistake?
> 
> Scott
> (please copy me on responses as I am not subscribed to the list)
> 
> -
> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
> For additional commands, e-mail: user-h...@commons.apache.org
> 

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org
  

Re: [lang3]

2019-04-15 Thread Rob Tompkins



> On Apr 15, 2019, at 2:49 PM, Bruno P. Kinoshita 
>  wrote:
> 
> 
> Hi Scott,
> I believe it was a mistake. Had a look at 3.8 and we had published it before.
> Just had a look at the vote thread, and it appears the javadocs jar was not 
> included in the process. Possibly something with our pom.xml and plugins set 
> up.
> 
> @Rob, @Gary, is it possible to upload just the jar to an existing release?

Yes. My plan was to do just that. With a [LAZY][VOTE] on the staged artifacts 
in nexus. Thoughts?

-Rob

> CheersBruno
> 
>On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer 
>  wrote:  
> 
> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
> Is that intentional or a mistake?
> 
> Scott
> (please copy me on responses as I am not subscribed to the list)
> 
> -
> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
> For additional commands, e-mail: user-h...@commons.apache.org
> 

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3]

2019-04-15 Thread Bruno P. Kinoshita
 
Hi Scott,
I believe it was a mistake. Had a look at 3.8 and we had published it before.
Just had a look at the vote thread, and it appears the javadocs jar was not 
included in the process. Possibly something with our pom.xml and plugins set up.

@Rob, @Gary, is it possible to upload just the jar to an existing release?
CheersBruno

On Tuesday, 16 April 2019, 9:44:07 am NZST, Scott Palmer 
 wrote:  
 
 I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
Is that intentional or a mistake?

Scott
(please copy me on responses as I am not subscribed to the list)

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org

  

Re: [lang3]

2019-04-15 Thread Rob Tompkins
Hm. Curious. Let me look at that.

-Rob

> On Apr 15, 2019, at 12:58 PM, Scott Palmer  wrote:
> 
> I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
> Is that intentional or a mistake?
> 
> Scott
> (please copy me on responses as I am not subscribed to the list)
> 
> -
> To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
> For additional commands, e-mail: user-h...@commons.apache.org
> 

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



[lang3]

2019-04-15 Thread Scott Palmer
I noticed there are no javadocs on Maven Central for commons-lang3 3.9.
Is that intentional or a mistake?

Scott
(please copy me on responses as I am not subscribed to the list)

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org



Re: [lang3] FastDateFormat fails on some locales?

2019-02-01 Thread Kevin Risden
I did some further testing and found that, out of the ~160 locales my
JDK 8 had, only ja-JP-u-ca-japanese-x-lvariant-JP failed with the
ArrayIndexOutOfBoundsException.

Kevin Risden

On Wed, Jan 30, 2019 at 1:36 PM Kevin Risden  wrote:
>
> I found this while looking at Apache Lucene/Solr and Hadoop 3. Hadoop
> uses FastDateFormat to format the current timestamp. Apache
> Lucene/Solr randomizes locales to ensure that things behave correctly
> even when there are different locales being used. There have been a
> few failures that have the following stack trace:
>
> java.lang.ArrayIndexOutOfBoundsException: 4
>[junit4]   2> at
> org.apache.commons.lang3.time.FastDatePrinter$TextField.appendTo(FastDatePrinter.java:901)
> ~[commons-lang3-3.7.jar:3.7]
>[junit4]   2> at
> org.apache.commons.lang3.time.FastDatePrinter.applyRules(FastDatePrinter.java:573)
> ~[commons-lang3-3.7.jar:3.7]
>[junit4]   2> at
> org.apache.commons.lang3.time.FastDatePrinter.applyRulesToString(FastDatePrinter.java:455)
> ~[commons-lang3-3.7.jar:3.7]
>[junit4]   2> at
> org.apache.commons.lang3.time.FastDatePrinter.format(FastDatePrinter.java:446)
> ~[commons-lang3-3.7.jar:3.7]
>[junit4]   2> at
> org.apache.commons.lang3.time.FastDateFormat.format(FastDateFormat.java:428)
> ~[commons-lang3-3.7.jar:3.7]
>[junit4]   2> at
> org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.start(DirectoryScanner.java:281)
> ~[hadoop-hdfs-3.2.0.jar:?]
>[junit4]   2> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.initDirectoryScanner(DataNode.java:1090)
> ~[hadoop-hdfs-3.2.0.jar:?]
>[junit4]   2> at
> org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1686)
> ~[hadoop-hdfs-3.2.0.jar:?]
>[junit4]   2> at
> org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:390)
> ~[hadoop-hdfs-3.2.0.jar:?]
>[junit4]   2> at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:280)
> ~[hadoop-hdfs-3.2.0.jar:?]
>[junit4]   2> at
> org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:819)
> [hadoop-hdfs-3.2.0.jar:?]
>[junit4]   2> at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
>
> I was also able to reproduce this with a simple test case:
>
> long timestamp = System.currentTimeMillis();
> Locale.setDefault(Locale.forLanguageTag("ja-JP-u-ca-japanese-x-lvariant-JP"));
> Assert.assertEquals(SimpleDateFormat.getInstance().format(timestamp),
> FastDateFormat.getInstance().format(timestamp));
>
> Showing that the issue isn't with Hadoop but with commons-lang3
> specifically. SimpleDateFormat has no issue formatting the timestamp
> with the given locale. The FastDateFormat javadoc doesn't state any
> issues with locales.
>
> Is this to be expected?
>
> Kevin Risden

-
To unsubscribe, e-mail: user-unsubscr...@commons.apache.org
For additional commands, e-mail: user-h...@commons.apache.org
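The per-locale sweep described above can be sketched with the JDK formatter alone. This is a hedged, dependency-free illustration: commons-lang3 is deliberately omitted, and it is substituting FastDateFormat.getDateTimeInstance(...) for the JDK call inside the loop that, per the report, surfaced the ArrayIndexOutOfBoundsException for the Japanese-calendar locale.

```java
import java.text.DateFormat;
import java.util.Locale;

// Hedged sketch of the locale sweep: format "now" under every installed
// locale and count the ones that throw. With the JDK's own DateFormat this
// is expected to pass everywhere; swapping in commons-lang3's
// FastDateFormat.getDateTimeInstance(...) reportedly fails only for
// ja-JP-u-ca-japanese-x-lvariant-JP.
public class LocaleSweep {
    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        int failures = 0;
        for (Locale locale : Locale.getAvailableLocales()) {
            try {
                DateFormat.getDateTimeInstance(
                        DateFormat.SHORT, DateFormat.SHORT, locale).format(now);
            } catch (RuntimeException e) {
                failures++;
                System.out.println(locale.toLanguageTag() + " -> " + e);
            }
        }
        System.out.println("locales that failed to format: " + failures);
    }
}
```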



[lang3] FastDateFormat fails on some locales?

2019-01-30 Thread Kevin Risden
I found this while looking at Apache Lucene/Solr and Hadoop 3. Hadoop
uses FastDateFormat to format the current timestamp. Apache
Lucene/Solr randomizes locales to ensure that things behave correctly
even when there are different locales being used. There have been a
few failures that have the following stack trace:

java.lang.ArrayIndexOutOfBoundsException: 4
   [junit4]   2> at
org.apache.commons.lang3.time.FastDatePrinter$TextField.appendTo(FastDatePrinter.java:901)
~[commons-lang3-3.7.jar:3.7]
   [junit4]   2> at
org.apache.commons.lang3.time.FastDatePrinter.applyRules(FastDatePrinter.java:573)
~[commons-lang3-3.7.jar:3.7]
   [junit4]   2> at
org.apache.commons.lang3.time.FastDatePrinter.applyRulesToString(FastDatePrinter.java:455)
~[commons-lang3-3.7.jar:3.7]
   [junit4]   2> at
org.apache.commons.lang3.time.FastDatePrinter.format(FastDatePrinter.java:446)
~[commons-lang3-3.7.jar:3.7]
   [junit4]   2> at
org.apache.commons.lang3.time.FastDateFormat.format(FastDateFormat.java:428)
~[commons-lang3-3.7.jar:3.7]
   [junit4]   2> at
org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.start(DirectoryScanner.java:281)
~[hadoop-hdfs-3.2.0.jar:?]
   [junit4]   2> at
org.apache.hadoop.hdfs.server.datanode.DataNode.initDirectoryScanner(DataNode.java:1090)
~[hadoop-hdfs-3.2.0.jar:?]
   [junit4]   2> at
org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1686)
~[hadoop-hdfs-3.2.0.jar:?]
   [junit4]   2> at
org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:390)
~[hadoop-hdfs-3.2.0.jar:?]
   [junit4]   2> at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:280)
~[hadoop-hdfs-3.2.0.jar:?]
   [junit4]   2> at
org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:819)
[hadoop-hdfs-3.2.0.jar:?]
   [junit4]   2> at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]

I was also able to reproduce this with a simple test case:

long timestamp = System.currentTimeMillis();
Locale.setDefault(Locale.forLanguageTag("ja-JP-u-ca-japanese-x-lvariant-JP"));
Assert.assertEquals(SimpleDateFormat.getInstance().format(timestamp),
FastDateFormat.getInstance().format(timestamp));

Showing that the issue isn't with Hadoop but with commons-lang3
specifically. SimpleDateFormat has no issue formatting the timestamp
with the given locale. The FastDateFormat javadoc doesn't state any
issues with locales.

Is this to be expected?

Kevin Risden
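For reference, the JDK half of the comparison above can be run on its own. This is a minimal, self-contained sketch (the class name is illustrative) showing that java.text.DateFormat formats under the same locale without throwing, which isolates the ArrayIndexOutOfBoundsException to commons-lang3:

```java
import java.text.DateFormat;
import java.util.Locale;

class JapaneseLocaleFormatDemo {
    public static void main(String[] args) {
        // The locale tag from the failing test: Japanese Imperial calendar variant
        Locale.setDefault(Locale.forLanguageTag("ja-JP-u-ca-japanese-x-lvariant-JP"));
        // The JDK's own formatter handles this locale without an exception
        String out = DateFormat.getInstance().format(System.currentTimeMillis());
        System.out.println(out.isEmpty() ? "EMPTY" : "FORMATTED");
    }
}
```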




Re: [lang3] Problem with the OSGi metadata: Bundle-SymbolicName / breaking change between 3.7 and 3.8

2018-09-06 Thread P. Ottlinger
Hi,

thanks for the quick response ...

On 06.09.2018 at 21:24, Oliver Heger wrote:
> So opening a ticket in Jira would be the correct action to take.

https://issues.apache.org/jira/browse/LANG-1419

Done :-) Hopefully I didn't miss any important stuff in Jira.

Cheers,
Phil




Re: [lang3] Problem with the OSGi metadata: Bundle-SymbolicName / breaking change between 3.7 and 3.8

2018-09-06 Thread Oliver Heger
Hi Phil,

as you already assume, this change in the OSGi metadata was caused by
changes in the build process and not intended.

So opening a ticket in Jira would be the correct action to take.

Thank you for reporting!
Oliver

On 06.09.2018 at 20:49, P. Ottlinger wrote:
> Hi,
> 
> I've just stumbled upon a problem that prevents me from updating from
> 3.7 to 3.8 in an OSGi context.
> 
> Although the release has just been a patch one, the bundle's symbolic
> name changed
> from "Bundle-SymbolicName org.apache.commons.lang3" in 3.7.0
> to "Bundle-SymbolicName org.apache.commons.commons-lang3" in 3.8.0.
> 
> That makes it impossible to do a drop-in update, as it is a breaking change.
> 
> Is that change an error in 3.8.0 or a wanted one that could be
> communicated more directly to downstream users?
> 
> May I file a bug ticket in the LANG Jira for it? I assume there has been
> a hiccup when building the OSGi release JAR and the change was not intended.
> 
> Thanks,
> Phil
> 
> 




[lang3] Problem with the OSGi metadata: Bundle-SymbolicName / breaking change between 3.7 and 3.8

2018-09-06 Thread P. Ottlinger
Hi,

I've just stumbled upon a problem that prevents me from updating from
3.7 to 3.8 in an OSGi context.

Although this was just a patch release, the bundle's symbolic
name changed
from "Bundle-SymbolicName org.apache.commons.lang3" in 3.7.0
to "Bundle-SymbolicName org.apache.commons.commons-lang3" in 3.8.0.

That makes it impossible to do a drop-in update, as it is a breaking change.

Is that change an error in 3.8.0 or a wanted one that could be
communicated more directly to downstream users?

May I file a bug ticket in the LANG Jira for it? I assume there has been
a hiccup when building the OSGi release JAR and the change was not intended.

Thanks,
Phil
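For anyone diffing the two jars, the change amounts to a single header in META-INF/MANIFEST.MF. A sketch reconstructed from the values quoted above (all other manifest headers omitted):

```text
Bundle-SymbolicName: org.apache.commons.lang3           (as shipped in 3.7.0)
Bundle-SymbolicName: org.apache.commons.commons-lang3   (as shipped in 3.8.0)
```

An OSGi container resolves bundles by symbolic name, which is why this one-line difference breaks a drop-in update.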




Re: commons-lang3: Too early to deprecate RandomStringUtils in favor of RandomStringGenerator ?

2017-09-03 Thread Amey Jadiye
Yes.

Regards,
Amey

On Sun, Sep 3, 2017 at 9:06 PM, Philippe Mouawad <philippe.moua...@gmail.com
> wrote:

> Hi Again,
> What is the expected schedule for this plan ?
> Will it be available in commons-text-1.2 ?
>
> Thanks
>
> On Sun, Sep 3, 2017 at 4:28 PM, Amey Jadiye <ameyjad...@gmail.com> wrote:
>
> > Hello Philippe,
> >
> > Looking at similar kind of demand we are thinking to execute below plan,
> I
> > think it will be good for your expectations.
> >
> > http://markmail.org/message/azxw4nai7fs2laas
> >
> > Regards,
> > Amey
> >
> > On Sun, Sep 3, 2017 at 6:26 PM, Philippe Mouawad <pmoua...@apache.org>
> > wrote:
> >
> > > Hello,
> > > Since version 3.6 of commons-lang3, RandomStringUtils has been
> deprecated
> > > following introduction of commons-text.
> > >
> > > Looking at current 1.1 version (and even snapshot 1.2) I wonder if it's
> > not
> > > too early for deprecation.
> > >
> > > RandomStringUtils was very simple and intuitive to use. I don't
> remember
> > I
> > > ever had to think when using it :-)
> > >
> > > RandomStringGenerator is nice in terms of API and much more powerful
> for
> > > advanced usage, but it looks to me much more complex to use for simple,
> > > average cases:
> > >
> > >- RandomStringUtils.random ? => Is this the equivalent
> > >- new RandomStringGenerator.Builder()
> > >   .filteredBy(CharacterPredicates.LETTERS)
> > >   .build();
> > >   - I don't get exactly the same results ? Is it due to Unicode
> > chars ?
> > >   - RandomStringUtils.randomAlphabetic(count) => new
> > >RandomStringGenerator.Builder()
> > >.withinRange('0', 'z')
> > >.filteredBy(CharacterPredicates.LETTERS,
> > >CharacterPredicates.DIGITS)
> > >.build().generate(count)
> > >
> > > What about use cases when count and source chars are configurable :
> > >
> > >- RandomStringUtils.random(count, chars)
> > >- => Are we supposed to build each time the generator ?
> > >
> > > Is it as efficient in terms of CPU and memory usage as
> RandomStringUtils
> > > equivalent ?
> > >
> > > Sorry if my questions are stupid.
> > >
> > > Thanks
> > >
> > > Regards
> > >
> >
> >
> >
> >
>
>
>
> --
> Regards.
> Philippe Mouawad.
>



-- 

-

To unsubscribe, e-mail: dev-unsubscr...@commons.apache.org

For additional commands, e-mail: dev-h...@commons.apache.org


Re: commons-lang3: Too early to deprecate RandomStringUtils in favor of RandomStringGenerator ?

2017-09-03 Thread Philippe Mouawad
Hi Again,
What is the expected schedule for this plan ?
Will it be available in commons-text-1.2 ?

Thanks

On Sun, Sep 3, 2017 at 4:28 PM, Amey Jadiye <ameyjad...@gmail.com> wrote:

> Hello Philippe,
>
> Looking at similar kind of demand we are thinking to execute below plan, I
> think it will be good for your expectations.
>
> http://markmail.org/message/azxw4nai7fs2laas
>
> Regards,
> Amey
>
> On Sun, Sep 3, 2017 at 6:26 PM, Philippe Mouawad <pmoua...@apache.org>
> wrote:
>
> > Hello,
> > Since version 3.6 of commons-lang3, RandomStringUtils has been deprecated
> > following introduction of commons-text.
> >
> > Looking at current 1.1 version (and even snapshot 1.2) I wonder if it's
> not
> > too early for deprecation.
> >
> > RandomStringUtils was very simple and intuitive to use. I don't remember
> I
> > ever had to think when using it :-)
> >
> > RandomStringGenerator is nice in terms of API and much more powerful for
> > advanced usage, but it looks to me much more complex to use for simple,
> > average cases:
> >
> >- RandomStringUtils.random ? => Is this the equivalent
> >- new RandomStringGenerator.Builder()
> >   .filteredBy(CharacterPredicates.LETTERS)
> >   .build();
> >   - I don't get exactly the same results ? Is it due to Unicode
> chars ?
> >   - RandomStringUtils.randomAlphabetic(count) => new
> >RandomStringGenerator.Builder()
> >.withinRange('0', 'z')
> >.filteredBy(CharacterPredicates.LETTERS,
> >CharacterPredicates.DIGITS)
> >.build().generate(count)
> >
> > What about use cases when count and source chars are configurable :
> >
> >- RandomStringUtils.random(count, chars)
> >- => Are we supposed to build each time the generator ?
> >
> > Is it as efficient in terms of CPU and memory usage as RandomStringUtils
> > equivalent ?
> >
> > Sorry if my questions are stupid.
> >
> > Thanks
> >
> > Regards
> >
>
>
>
>



-- 
Regards.
Philippe Mouawad.


Re: commons-lang3: Too early to deprecate RandomStringUtils in favor of RandomStringGenerator ?

2017-09-03 Thread Philippe Mouawad
Thanks for your answer.

On Sunday, September 3, 2017, Amey Jadiye <ameyjad...@gmail.com> wrote:

> Hello Philippe,
>
> Looking at similar kind of demand we are thinking to execute below plan, I
> think it will be good for your expectations.
>
> http://markmail.org/message/azxw4nai7fs2laas
>
> Regards,
> Amey
>
> On Sun, Sep 3, 2017 at 6:26 PM, Philippe Mouawad <pmoua...@apache.org>
> wrote:
>
> > Hello,
> > Since version 3.6 of commons-lang3, RandomStringUtils has been deprecated
> > following introduction of commons-text.
> >
> > Looking at current 1.1 version (and even snapshot 1.2) I wonder if it's
> not
> > too early for deprecation.
> >
> > RandomStringUtils was very simple and intuitive to use. I don't remember
> I
> > ever had to think when using it :-)
> >
> > RandomStringGenerator is nice in terms of API and much more powerful for
> > advanced usage, but it looks to me much more complex to use for simple,
> > average cases:
> >
> >- RandomStringUtils.random ? => Is this the equivalent
> >- new RandomStringGenerator.Builder()
> >   .filteredBy(CharacterPredicates.LETTERS)
> >   .build();
> >   - I don't get exactly the same results ? Is it due to Unicode
> chars ?
> >   - RandomStringUtils.randomAlphabetic(count) => new
> >RandomStringGenerator.Builder()
> >.withinRange('0', 'z')
> >.filteredBy(CharacterPredicates.LETTERS,
> >CharacterPredicates.DIGITS)
> >.build().generate(count)
> >
> > What about use cases when count and source chars are configurable :
> >
> >- RandomStringUtils.random(count, chars)
> >- => Are we supposed to build each time the generator ?
> >
> > Is it as efficient in terms of CPU and memory usage as RandomStringUtils
> > equivalent ?
> >
> > Sorry if my questions are stupid.
> >
> > Thanks
> >
> > Regards
> >
>
>
>
>


-- 
Regards.
Philippe Mouawad.


Re: commons-lang3: Too early to deprecate RandomStringUtils in favor of RandomStringGenerator ?

2017-09-03 Thread Amey Jadiye
Hello Philippe,

Looking at similar demand, we are thinking of executing the plan below; I
think it will meet your expectations.

http://markmail.org/message/azxw4nai7fs2laas

Regards,
Amey

On Sun, Sep 3, 2017 at 6:26 PM, Philippe Mouawad <pmoua...@apache.org>
wrote:

> Hello,
> Since version 3.6 of commons-lang3, RandomStringUtils has been deprecated
> following introduction of commons-text.
>
> Looking at current 1.1 version (and even snapshot 1.2) I wonder if it's not
> too early for deprecation.
>
> RandomStringUtils was very simple and intuitive to use. I don't remember I
> ever had to think when using it :-)
>
> RandomStringGenerator is nice in terms of API and much more powerful for
> advanced usage, but it looks to me much more complex to use for simple,
> average cases:
>
>- RandomStringUtils.random ? => Is this the equivalent
>- new RandomStringGenerator.Builder()
>   .filteredBy(CharacterPredicates.LETTERS)
>   .build();
>   - I don't get exactly the same results ? Is it due to Unicode chars ?
>   - RandomStringUtils.randomAlphabetic(count) => new
>RandomStringGenerator.Builder()
>.withinRange('0', 'z')
>.filteredBy(CharacterPredicates.LETTERS,
>CharacterPredicates.DIGITS)
>.build().generate(count)
>
> What about use cases when count and source chars are configurable :
>
>- RandomStringUtils.random(count, chars)
>- => Are we supposed to build each time the generator ?
>
> Is it as efficient in terms of CPU and memory usage as RandomStringUtils
> equivalent ?
>
> Sorry if my questions are stupid.
>
> Thanks
>
> Regards
>





commons-lang3: Too early to deprecate RandomStringUtils in favor of RandomStringGenerator ?

2017-09-03 Thread Philippe Mouawad
Hello,
Since version 3.6 of commons-lang3, RandomStringUtils has been deprecated
following introduction of commons-text.

Looking at current 1.1 version (and even snapshot 1.2) I wonder if it's not
too early for deprecation.

RandomStringUtils was very simple and intuitive to use. I don't remember I
ever had to think when using it :-)

RandomStringGenerator is nice in terms of API and much more powerful for
advanced usage, but it looks to me much more complex to use for simple,
average cases:

   - RandomStringUtils.random ? => Is this the equivalent
   - new RandomStringGenerator.Builder()
         .filteredBy(CharacterPredicates.LETTERS)
         .build();
   - I don't get exactly the same results ? Is it due to Unicode chars ?
   - RandomStringUtils.randomAlphabetic(count) =>
     new RandomStringGenerator.Builder()
         .withinRange('0', 'z')
         .filteredBy(CharacterPredicates.LETTERS, CharacterPredicates.DIGITS)
         .build().generate(count)

What about use cases when count and source chars are configurable :

   - RandomStringUtils.random(count, chars)
   - => Are we supposed to build each time the generator ?

Is it as efficient in terms of CPU and memory usage as RandomStringUtils
equivalent ?

Sorry if my questions are stupid.

Thanks

Regards
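On the "are we supposed to build each time" question: the intent of a Builder-based API is presumably to construct the generator once and reuse it for every call. The same build-once/reuse shape can be sketched with the JDK alone; the class and method names below are made up for illustration and are not the commons-text API:

```java
import java.security.SecureRandom;

// JDK-only analogue of the build-once/reuse pattern: construct the
// generator a single time, then call generate() as often as needed.
class AlphabeticGenerator {
    private static final String LETTERS =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
    private final SecureRandom random = new SecureRandom();

    String generate(int count) {
        StringBuilder sb = new StringBuilder(count);
        for (int i = 0; i < count; i++) {
            sb.append(LETTERS.charAt(random.nextInt(LETTERS.length())));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        AlphabeticGenerator gen = new AlphabeticGenerator(); // built once
        String s = gen.generate(16);                         // reused per call
        System.out.println(
            s.length() == 16 && s.chars().allMatch(Character::isLetter)
                ? "OK" : "FAIL");
    }
}
```

Whether RandomStringGenerator matches RandomStringUtils in CPU and memory terms would have to be measured; the reuse pattern at least avoids rebuilding per call.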


Re: commons lang3: NullArgumentException missing?

2011-12-22 Thread Matt Benson
On Wed, Dec 21, 2011 at 3:41 PM, Karsten Wutzke kwut...@web.de wrote:
 Wow, the Java/JDK is getting dirtier and dirtier with every release. The code

  public Foo(Bar bar) {
     this.bar = Objects.requireNonNull(bar);
  }
 looks really really awful to me.

As opposed to this.bar = Validate.notNull(bar) from commons-lang3? ;)

Matt


 IMO it's time to reimplement the JDK and throw away backward compatibility 
 rather than introduce patch after patch. At least a second clean API of the JDK 
 should be provided.

 I wonder what others are thinking about this.

 Karsten


 -Original Message-
 From: Paul Benedict pbened...@apache.org
 Sent: 21.12.2011 19:04:48
 To: Commons Users List user@commons.apache.org
 Subject: Re: commons lang3: NullArgumentException missing?

The official standard in the JDK is to throw NPE for null arguments. Since
JDK 7, they have made API available for this in
java.util.Objects#requireNonNull(). Commons is following the official
direction.
On Dec 21, 2011 10:16 AM, kwut...@web.de wrote:


 ___
 Write SMS with WEB.DE FreeMail - simple, fast and
 inexpensive. Try it now! http://f.web.de/?mc=021192






commons lang3: NullArgumentException missing?

2011-12-21 Thread kwutzke
Hello,

I can see NullArgumentException has been removed from the lang3 API, but I 
don't understand why. There have been long discussions in the past why a 
NullArgumentException is better than using an IllegalArgumentException. Most 
people are using commons-lang anyway, so what's the point of removing 
NullArgumentException?

What if developers don't want to use IllegalArgumentException? Is it more 
advantageous to have these people provide their own NullArgumentException 
implementations? This is stupid. One of the reasons the commons-* libs 
were created was to fill the gap where the Java API provided nothing. 
Now lang3 is there, too.

If people aren't interested in using NullArgumentException as provided, why 
don't they simply ignore it or provide their own implementations? I don't 
understand it.

Any comments/explanations to clear this up are welcome.

Karsten




Re: commons lang3: NullArgumentException missing?

2011-12-21 Thread Paul Benedict
The official standard in the JDK is to throw NPE for null arguments. Since
JDK 7, they have made API available for this in
java.util.Objects#requireNonNull(). Commons is following the official
direction.
On Dec 21, 2011 10:16 AM, kwut...@web.de wrote:
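The JDK idiom referred to above can be exercised in isolation. A small sketch (class names are illustrative) of java.util.Objects#requireNonNull guarding a constructor argument:

```java
import java.util.Objects;

class RequireNonNullDemo {
    public static void main(String[] args) {
        try {
            new Foo(null);
        } catch (NullPointerException e) {
            // The JDK convention: NPE, not IllegalArgumentException, for null arguments
            System.out.println("NPE: " + e.getMessage());
        }
    }
}

class Bar {}

class Foo {
    private final Bar bar;

    Foo(Bar bar) {
        // Available since JDK 7; throws NullPointerException with this
        // message when bar is null, otherwise returns bar unchanged
        this.bar = Objects.requireNonNull(bar, "bar must not be null");
    }
}
```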


[lang] common-lang3 mvn repo

2010-11-04 Thread JammyZ
Hi,
  I am using the 3.0-SNAPSHOT of commons-lang3 and it seems like something
changed in the repository yesterday.
  See the dates at
https://repository.apache.org/content/repositories/snapshots/org/apache/commons/commons-lang3/3.0-SNAPSHOT/

  However, the version has not changed; it has been the same since September.
Someone from my team just checked out our project yesterday and this library
wouldn't download. I've not had the time to look at why yet, but it seems
strange that the files have changed if the version has not changed, possible
tampering?

Regards,
Iker.


Re: [lang] common-lang3 mvn repo

2010-11-04 Thread Dennis Lundberg
On 2010-11-04 11:18, JammyZ wrote:
 Hi,
   I am using the 3.0-SNAPSHOT of commons-lang3 and it seems like something
 changed in the repository yesterday.
   See the dates at
 https://repository.apache.org/content/repositories/snapshots/org/apache/commons/commons-lang3/3.0-SNAPSHOT/
 
   However the version has not changed, it is the same since September.
 Someone from my team just checked out our project yesterday and this library
 wouldn't download. I've not had the time to look at why yet, but it seems
 strange that the files have changed if the version has not changed, possible
 tampering?

It is the checksum files that have changed, not the artifact itself.
Perhaps someone is running a maintenance job on the Nexus instance (the
repository manager we use) at repository.apache.org?

 
 Regards,
 Iker.
 


-- 
Dennis Lundberg
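Dennis's point can be checked mechanically: if the artifact bytes are unchanged, regenerating the checksum files produces identical digests even though the files' timestamps move. A small JDK-only sketch of that invariant (the sample bytes and class name are made up for the demo):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

class ChecksumCheckDemo {
    // Hex-encode the SHA-1 digest of the given bytes, as in a Maven .sha1 file
    static String sha1Hex(byte[] data) throws Exception {
        StringBuilder sb = new StringBuilder();
        for (byte b : MessageDigest.getInstance("SHA-1").digest(data)) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] artifact = "dummy artifact bytes".getBytes(StandardCharsets.UTF_8);
        String published = sha1Hex(artifact); // content of the published .sha1 file
        // Recomputing over the same artifact bytes must yield the same digest,
        // regardless of when the checksum file itself was (re)written
        System.out.println(published.equals(sha1Hex(artifact)) ? "MATCH" : "DIFFER");
    }
}
```

So a changed timestamp on the .sha1/.md5 files alone is consistent with a repository maintenance job rather than tampering; a mismatching digest would be the actual red flag.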
