Hi Aurélien!

Please run

        mvn dependency:tree

and check it for Jackson dependencies.
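
If the full tree is noisy, you can filter it down to just the Jackson
artifacts; if I remember the plugin's -Dincludes syntax correctly, matching
on the groupId prefix should work:

        mvn dependency:tree -Dincludes=com.fasterxml.jackson.*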

Feel free to reply with the output if you have any questions about it.
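
If the tree shows a jackson-databind outside the 2.12.x range (the "Scala
module 2.12.3" in your stack trace suggests your Spark 3.2 build is on
Jackson 2.12.3), a common fix is to pin the Jackson versions in a
dependencyManagement block so every transitive consumer agrees. A rough
sketch, assuming 2.12.3 really is the version your build wants and that you
are on Scala 2.12:

        <dependencyManagement>
          <dependencies>
            <!-- Force a single jackson-databind everywhere (version assumed) -->
            <dependency>
              <groupId>com.fasterxml.jackson.core</groupId>
              <artifactId>jackson-databind</artifactId>
              <version>2.12.3</version>
            </dependency>
            <!-- Keep the Scala module in lock-step with databind -->
            <dependency>
              <groupId>com.fasterxml.jackson.module</groupId>
              <artifactId>jackson-module-scala_2.12</artifactId>
              <version>2.12.3</version>
            </dependency>
          </dependencies>
        </dependencyManagement>

IntelliJ and command-line Maven can end up resolving or ordering the
classpath differently, which would explain tests passing in one and not the
other.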

Cheers,

Steve C

> On 22 Jan 2022, at 10:49 am, Aurélien Mazoyer <aurel...@aepsilon.com> wrote:
>
> Hello,
>
> I migrated my code to Spark 3.2 and I am facing some issues. When I run my
> unit tests via Maven, I get this error:
>
>         java.lang.NoClassDefFoundError: Could not initialize class
>         org.apache.spark.rdd.RDDOperationScope$
>
> which is not very helpful.
>
> However, when I run my tests via IntelliJ, I get the following one:
>
>         java.lang.ExceptionInInitializerError
>         at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
>         at org.apache.spark.rdd.RDD.map(RDD.scala:421)
>         ...
>         Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala module
>         2.12.3 requires Jackson Databind version >= 2.12.0 and < 2.13.0
>
> which is far more helpful, since it gives me a clue about what is missing from
> my pom.xml file. After adding a few more dependencies, my tests pass again...
> in IntelliJ, but I am still stuck on the same error when running them via
> Maven :-/.
> The JDK and Maven versions seem to be the same, and both use the same .m2
> directory.
> Any clue as to what could be going wrong?
>
> Thank you,
>
> Aurelien
