Re: CVE-2022-40160 on commons-jxpath

2023-06-30 Thread Tomo Suzuki
Good to know about such cases. As always, thank you for maintaining the OSS
ecosystem, including responding to vulnerability questions.


https://nvd.nist.gov/vuln/detail/CVE-2022-40160
Description

** DISPUTED ** This record was originally reported by the oss-fuzz project
who failed to consider the security context in which JXPath is intended to
be used and failed to contact the JXPath maintainers prior to requesting
the CVE allocation. The CVE was then allocated by Google in breach of the
CNA rules. After review by the JXPath maintainers, the original report was
found to be invalid.

On Fri, Jun 30, 2023 at 09:40 Gary Gregory wrote:

> That CVE is invalid, please see
> https://nvd.nist.gov/vuln/detail/CVE-2022-40160
>
> You should rely on official CVE databases like nist.gov.
>
> Gary
>
>
>
> On Fri, Jun 30, 2023, 09:04 Debraj Manna wrote:
>
> > commons-jxpath 1.3 is also getting flagged for CVE-2022-40159.
> >
> > On Fri, Jun 30, 2023 at 6:28 PM Debraj Manna wrote:
> >
> > > Hi
> > >
> > > We have been flagged for CVE-2022-40160 on
> > > commons-jxpath, version 1.3.
> > >
> > > Can someone let me know whether commons-jxpath is really affected by this
> > > vulnerability? If yes, is there any plan to fix it?
> > >
> >
>
-- 
Regards,
Tomo


Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread Tomo Suzuki
Glad to hear you made progress. Good luck!

(Another possibility: you might have changed the package or class name
since you saved the HDFS file; Java serialization records the fully
qualified class name in the stream, so a rename breaks deserialization.)

On Thu, Jun 27, 2019 at 21:25 big data wrote:

> Thanks. I've tried it; instantiating a new Block before deserializing works fine.
>
> I've solved it and posted another issue describing the progress. For the
> details, see another email: Java Generic T makes ClassNotFoundException
>
> On 2019/6/27 8:41 PM, Tomo Suzuki wrote:
>
> My suggestion after reading ClassNotFoundException is to try to instantiate
> the class just before deserializing it:
>
> public static Block deserializeFrom(byte[] bytes) {
>     // Dummy instantiation to ensure Block and its related classes are available
>     System.out.println("dummy = " + new Block());
>     System.out.println("byte length = " + bytes.length); // Does this match what you expect?
>     try {
>         Block b = SerializationUtils.deserialize(bytes);
>         ...
>
>
> Looking forward to hearing the result.
>
>
>
> On Wed, Jun 26, 2019 at 11:03 PM big data wrote:
>
> The XXX class is named Block; below is part of its code.
>
> The deserialization code looks like this:
>
> public static Block deserializeFrom(byte[] bytes) {
>     try {
>         Block b = SerializationUtils.deserialize(bytes);
>         System.out.println("b=" + b);
>         return b;
>     } catch (ClassCastException e) {
>         System.out.println("ClassCastException");
>         e.printStackTrace();
>     } catch (IllegalArgumentException e) {
>         System.out.println("IllegalArgumentException");
>         e.printStackTrace();
>     } catch (SerializationException e) {
>         System.out.println("SerializationException");
>         e.printStackTrace();
>     }
>     return null;
> }
>
>
> The Spark code is:
>
> val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
> val RDD = fis.map(x => {
>   val content = x._2.toArray()
>   val b = Block.deserializeFrom(content)
>   ...
> })
>
>
> All of the code above runs successfully in Spark local mode, but the error
> happens when it runs in YARN cluster mode.
>
> On 2019/6/27 9:49 AM, Tomo Suzuki wrote:
>
> I'm afraid I don't have enough information to troubleshoot the problem in
> com.XXX.XXX. It would be great if you could create a minimal example project
> that reproduces the same issue.
>
> Regards,
> Tomo
>
> On Wed, Jun 26, 2019 at 9:20 PM big data wrote:
>
> Hi,
>
> Actually, the class com.XXX.XXX is called normally earlier in the Spark
> code, and this exception happens in one static method of the class.
>
> So a jar dependency problem can be excluded.
>
> On 2019/6/26 10:23 PM, Tomo Suzuki wrote:
>
>
> Hi Big data,
>
> I don't use SerializationUtils, but if I interpret the error message:
>
>    ClassNotFoundException: com..
>
> this says com.. is not available in the classpath of the JVM (which
> your Spark is running on). I would verify that you can instantiate
> com.. in Spark/Scala *without* SerializationUtils.
>
> Regards,
> Tomo
>
>
>
> On Wed, Jun 26, 2019 at 4:12 AM big data wrote:
>
> I use Apache Commons Lang3's SerializationUtils in the code.
>
> SerializationUtils.serialize()
>
> to store a customized class to disk as files and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (macOS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS and read them back from HDFS using
> Spark/Scala, a SerializationException happens.
>
> The Apache Commons Lang3 version is:
>
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
>
>
> The stack trace is below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
>     at com.com...deserializeFrom(XXX.java:81)
>     at com.XXX.$$anonfun$3.apply(B.scala

Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-27 Thread Tomo Suzuki
My suggestion after reading ClassNotFoundException is to try to instantiate
the class just before deserializing it:

public static Block deserializeFrom(byte[] bytes) {
    // Dummy instantiation to ensure Block and its related classes are available
    System.out.println("dummy = " + new Block());
    System.out.println("byte length = " + bytes.length); // Does this match what you expect?
    try {
        Block b = SerializationUtils.deserialize(bytes);
        ...


Looking forward to hearing the result.
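
For reference, below is a self-contained sketch of this check. It is only an
illustration under assumptions: Block here is a stand-in that implements
java.io.Serializable, and the byte[] comes from an in-process round trip
rather than from HDFS.

import java.io.Serializable;
import org.apache.commons.lang3.SerializationException;
import org.apache.commons.lang3.SerializationUtils;

public class BlockCheck {

    // Illustrative stand-in for the real Block class.
    static class Block implements Serializable {
        private static final long serialVersionUID = 1L;
    }

    public static Block deserializeFrom(byte[] bytes) {
        // Dummy instantiation: fails fast (NoClassDefFoundError) if Block is
        // not on this JVM's classpath, before any deserialization runs.
        System.out.println("dummy = " + new Block());
        System.out.println("byte length = " + bytes.length);
        try {
            return SerializationUtils.deserialize(bytes);
        } catch (SerializationException e) {
            // SerializationUtils wraps ClassNotFoundException in a
            // SerializationException when the recorded class name cannot be
            // resolved in this JVM.
            e.printStackTrace();
            return null;
        }
    }

    public static void main(String[] args) {
        byte[] bytes = SerializationUtils.serialize(new Block());
        System.out.println("round trip = " + deserializeFrom(bytes));
    }
}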



On Wed, Jun 26, 2019 at 11:03 PM big data wrote:

> The XXX class is named Block; below is part of its code.
>
> The deserialization code looks like this:
>
> public static Block deserializeFrom(byte[] bytes) {
>     try {
>         Block b = SerializationUtils.deserialize(bytes);
>         System.out.println("b=" + b);
>         return b;
>     } catch (ClassCastException e) {
>         System.out.println("ClassCastException");
>         e.printStackTrace();
>     } catch (IllegalArgumentException e) {
>         System.out.println("IllegalArgumentException");
>         e.printStackTrace();
>     } catch (SerializationException e) {
>         System.out.println("SerializationException");
>         e.printStackTrace();
>     }
>     return null;
> }
>
>
> The Spark code is:
>
> val fis = spark.sparkContext.binaryFiles("/folder/abc*.file")
> val RDD = fis.map(x => {
>   val content = x._2.toArray()
>   val b = Block.deserializeFrom(content)
>   ...
> })
>
>
> All of the code above runs successfully in Spark local mode, but the error
> happens when it runs in YARN cluster mode.
>
> On 2019/6/27 9:49 AM, Tomo Suzuki wrote:
>
> I'm afraid I don't have enough information to troubleshoot the problem in
> com.XXX.XXX. It would be great if you could create a minimal example project
> that reproduces the same issue.
>
> Regards,
> Tomo
>
> On Wed, Jun 26, 2019 at 9:20 PM big data wrote:
>
> Hi,
>
> Actually, the class com.XXX.XXX is called normally earlier in the Spark
> code, and this exception happens in one static method of the class.
>
> So a jar dependency problem can be excluded.
>
> On 2019/6/26 10:23 PM, Tomo Suzuki wrote:
>
>
> Hi Big data,
>
> I don't use SerializationUtils, but if I interpret the error message:
>
>ClassNotFoundException: com..
>
> this says com.. is not available in the classpath of the JVM (which
> your Spark is running on). I would verify that you can instantiate
> com.. in Spark/Scala *without* SerializationUtils.
>
> Regards,
> Tomo
>
>
>
> On Wed, Jun 26, 2019 at 4:12 AM big data wrote:
>
> I use Apache Commons Lang3's SerializationUtils in the code.
>
> SerializationUtils.serialize()
>
> to store a customized class to disk as files and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (macOS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS and read them back from HDFS using
> Spark/Scala, a SerializationException happens.
>
> The Apache Commons Lang3 version is:
>
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
>
>
> The stack trace is below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
>     at com.com...deserializeFrom(XXX.java:81)
>     at com.XXX.$$anonfun$3.apply(B.scala:157)
>     at com.XXX.$$anonfun$3.apply(B.scala:153)
>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
>     at scala.collection.AbstractIterator.to(I

Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread Tomo Suzuki
I'm afraid I don't have enough information to troubleshoot the problem in
com.XXX.XXX. It would be great if you could create a minimal example project
that reproduces the same issue.
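
(As an aside, a cluster is not required to reproduce this class of failure.
The sketch below is hypothetical, with Block as an illustrative stand-in: it
deserializes through a classloader that cannot see the class, mimicking an
executor JVM that is missing the application jar, and fails with the same
ClassNotFoundException.)

import java.io.ByteArrayInputStream;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;
import java.io.Serializable;
import org.apache.commons.lang3.SerializationUtils;

public class MissingClassRepro {

    // Illustrative stand-in for the class missing on the executors.
    static class Block implements Serializable {
        private static final long serialVersionUID = 1L;
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = SerializationUtils.serialize(new Block());

        // Resolve classes only through the bootstrap classloader (null),
        // which cannot see Block, just like a YARN executor whose
        // classpath lacks the jar that defines it.
        try (ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bytes)) {
            @Override
            protected Class<?> resolveClass(ObjectStreamClass desc)
                    throws ClassNotFoundException {
                return Class.forName(desc.getName(), false, null);
            }
        }) {
            in.readObject(); // throws ClassNotFoundException: ...$Block
        }
    }
}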

Regards,
Tomo

On Wed, Jun 26, 2019 at 9:20 PM big data wrote:

> Hi,
>
> Actually, the class com.XXX.XXX is called normally earlier in the Spark
> code, and this exception happens in one static method of the class.
>
> So a jar dependency problem can be excluded.
>
> On 2019/6/26 10:23 PM, Tomo Suzuki wrote:
> > Hi Big data,
> >
> > I don't use SerializationUtils, but if I interpret the error message:
> >
> >    ClassNotFoundException: com..
> >
> > this says com.. is not available in the classpath of the JVM (which
> > your Spark is running on). I would verify that you can instantiate
> > com.. in Spark/Scala *without* SerializationUtils.
> >
> > Regards,
> > Tomo
> >
> >
> >
> > On Wed, Jun 26, 2019 at 4:12 AM big data wrote:
> >
> >> I use Apache Commons Lang3's SerializationUtils in the code.
> >>
> >> SerializationUtils.serialize()
> >>
> >> to store a customized class to disk as files and
> >>
> >> SerializationUtils.deserialize(byte[])
> >>
> >> to restore them again.
> >>
> >> In the local environment (macOS), all serialized files can be
> >> deserialized normally and no error happens. But when I copy these
> >> serialized files into HDFS and read them back from HDFS using
> >> Spark/Scala, a SerializationException happens.
> >>
> >> The Apache Commons Lang3 version is:
> >>
> >> <dependency>
> >>     <groupId>org.apache.commons</groupId>
> >>     <artifactId>commons-lang3</artifactId>
> >>     <version>3.9</version>
> >> </dependency>
> >>
> >>
> >> The stack trace is below:
> >>
> >> org.apache.commons.lang3.SerializationException:
> >> java.lang.ClassNotFoundException: com..
> >>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
> >>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
> >>     at com.com...deserializeFrom(XXX.java:81)
> >>     at com.XXX.$$anonfun$3.apply(B.scala:157)
> >>     at com.XXX.$$anonfun$3.apply(B.scala:153)
> >>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
> >>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> >>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> >>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> >>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
> >>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
> >>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
> >>     at scala.collection.AbstractIterator.to(Iterator.scala:1336)
> >>     at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
> >>     at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
> >>     at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
> >>     at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
> >>     at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> >>     at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
> >>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> >>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
> >>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> >>     at org.apache.spark.scheduler.Task.run(Task.scala:109)
> >>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> >>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >>     at java.lang.Thread.run(Thread.java:748)
> >> Caused by: java.lang.ClassNotFoundException: com..
> >>     at java.net.URLClassLoader.findClass

Re: [lang3] java.lang.ClassNotFoundException when using Apache Commons Lang3 SerializationUtils.deserialize

2019-06-26 Thread Tomo Suzuki
Hi Big data,

I don't use SerializationUtils, but if I interpret the error message:

  ClassNotFoundException: com..

this says com.. is not available in the classpath of the JVM (which
your Spark is running on). I would verify that you can instantiate
com.. in Spark/Scala *without* SerializationUtils.
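
To make that check concrete, here is a minimal sketch using the Java Spark
API. The names in it are assumptions for illustration (com.example.Block is
a hypothetical class; substitute the real one). Each executor resolves the
class by name with no SerializationUtils involved, so a missing jar surfaces
immediately as ClassNotFoundException in the executor logs.

import java.util.Arrays;
import java.util.List;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class ClasspathCheck {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("classpath-check")
                .getOrCreate();
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        // Force the executors to load the class by name; this succeeds
        // only if the jar containing it is visible to the executor JVMs.
        List<String> loaded = jsc.parallelize(Arrays.asList(1, 2, 3, 4))
                .map(i -> Class.forName("com.example.Block").getName())
                .collect();

        System.out.println(loaded);
        spark.stop();
    }
}

If this passes in local mode but fails in YARN cluster mode, the jar is not
reaching the executors; shipping it with spark-submit --jars (or bundling an
uber-jar) is a common remedy.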

Regards,
Tomo



On Wed, Jun 26, 2019 at 4:12 AM big data wrote:

> I use Apache Commons Lang3's SerializationUtils in the code.
>
> SerializationUtils.serialize()
>
> to store a customized class to disk as files and
>
> SerializationUtils.deserialize(byte[])
>
> to restore them again.
>
> In the local environment (macOS), all serialized files can be
> deserialized normally and no error happens. But when I copy these
> serialized files into HDFS and read them back from HDFS using Spark/Scala, a
> SerializationException happens.
>
> The Apache Commons Lang3 version is:
>
> <dependency>
>     <groupId>org.apache.commons</groupId>
>     <artifactId>commons-lang3</artifactId>
>     <version>3.9</version>
> </dependency>
>
>
> The stack trace is below:
>
> org.apache.commons.lang3.SerializationException:
> java.lang.ClassNotFoundException: com..
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:227)
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:265)
>     at com.com...deserializeFrom(XXX.java:81)
>     at com.XXX.$$anonfun$3.apply(B.scala:157)
>     at com.XXX.$$anonfun$3.apply(B.scala:153)
>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
>     at scala.collection.AbstractIterator.to(Iterator.scala:1336)
>     at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
>     at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
>     at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
>     at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
>     at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
>     at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>     at org.apache.spark.scheduler.Task.run(Task.scala:109)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: com..
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:348)
>     at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:686)
>     at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
>     at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
>     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
>     at org.apache.commons.lang3.SerializationUtils.deserialize(SerializationUtils.java:223)
>
> I've checked the loaded byte[]'s length; it is the same from local disk and
> from HDFS. But why can it not be deserialized from HDFS?
>


-- 
Regards,
Tomo