Hello,

Sorry for asking twice, but does anyone have an idea of what could be
causing this dependency problem :-/?

Thank you,

Aurelien

On Sat, 22 Jan 2022 at 00:49, Aurélien Mazoyer <aurel...@aepsilon.com>
wrote:

> Hello,
>
> I migrated my code to Spark 3.2 and I am facing some issues. When I run my
> unit tests via Maven, I get this error:
> java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.spark.rdd.RDDOperationScope$
> which does not tell me much.
>
> However, when I run my tests via IntelliJ, I get the following one:
> java.lang.ExceptionInInitializerError
> at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
> at org.apache.spark.rdd.RDD.map(RDD.scala:421)
> ...
> Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala
> module 2.12.3 requires Jackson Databind version >= 2.12.0 and < 2.13.0
> which is far better IMO, since it gives me a clue about what is missing
> from my pom.xml to make it work. After adding a few more dependencies, my
> tests pass again... in IntelliJ, but I am still stuck on the same error
> when I run the Maven command :-/.
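> For illustration, the kind of change I mean is pinning Jackson in the pom,
> roughly like this (the 2.12.x versions are only a guess derived from the
> error message above, not necessarily the exact ones Spark 3.2 ships with):
>
>   <dependencyManagement>
>     <dependencies>
>       <!-- Keep jackson-databind in the 2.12.x range required by
>            jackson-module-scala 2.12.3 -->
>       <dependency>
>         <groupId>com.fasterxml.jackson.core</groupId>
>         <artifactId>jackson-databind</artifactId>
>         <version>2.12.3</version>
>       </dependency>
>       <!-- Keep the Scala module on the same version -->
>       <dependency>
>         <groupId>com.fasterxml.jackson.module</groupId>
>         <artifactId>jackson-module-scala_2.12</artifactId>
>         <version>2.12.3</version>
>       </dependency>
>     </dependencies>
>   </dependencyManagement>
>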
> It seems that the JDK and Maven versions are the same in both cases, and
> both use the same .m2 directory.
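> (I imagine something like
> mvn dependency:tree -Dincludes=com.fasterxml.jackson.core:jackson-databind
> should show which jackson-databind Maven actually resolves for the tests,
> compared with what IntelliJ puts on the classpath.)
>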
> Any idea what could be going wrong?
>
> Thank you,
>
> Aurelien
>
