My guess is that something else you depend on is actually bringing in a
different json4s, or you're otherwise mixing library/Spark versions. Use
mvn dependency:tree or its equivalent for your build tool to see what you
are actually pulling in. You probably do not need to include json4s at
all, as it is already in Spark anyway.
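
That particular error is also a hint: apply$default$2 returning a String
suggests your code was compiled against a newer json4s, whose
ShortTypeHints.apply has a second, defaulted String parameter, while an
older json4s is on the runtime classpath. Assuming an sbt build like the
one shown later in this thread (the Spark version and artifacts here are
illustrative), a minimal sketch is to drop the explicit json4s dependency
and mark Spark itself as provided, so the json4s that ships with Spark is
the only one on the classpath:

// build.sbt (sketch): no explicit org.json4s entry; Spark 3.1 already
// ships json4s 3.7.0-M5 and supplies it at runtime
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "3.1.2" % "provided",
  "org.apache.spark" %% "spark-streaming" % "3.1.2" % "provided"
)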

On Fri, Feb 4, 2022 at 2:35 PM Amit Sharma <resolve...@gmail.com> wrote:

> Martin, Sean: I changed it to 3.7.0-M5 and I am still getting the same
> error:
> Exception in thread "streaming-job-executor-0"
> java.lang.NoSuchMethodError:
> org.json4s.ShortTypeHints$.apply$default$2()Ljava/lang/String;
>
>
> Thanks
> Amit
>
> On Fri, Feb 4, 2022 at 9:03 AM Martin Grigorov <mgrigo...@apache.org>
> wrote:
>
>> Hi,
>>
>> Amit said that he uses Spark 3.1, so the link should be
>> https://github.com/apache/spark/blob/branch-3.1/pom.xml#L879 (3.7.0-M5)
>>
>> @Amit: check your classpath. There may be more than one jar of this
>> dependency on it.
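>>
>> For example, with the mvn dependency:tree that Sean mentioned (the
>> org.json4s filter here is just an illustration):
>>
>>   mvn dependency:tree -Dincludes=org.json4s
>>
>> and, to see the json4s jars that the Spark installation itself ships:
>>
>>   ls $SPARK_HOME/jars | grep json4s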
>>
>> On Thu, Feb 3, 2022 at 10:53 PM Sean Owen <sro...@gmail.com> wrote:
>>
>>> You can look it up:
>>> https://github.com/apache/spark/blob/branch-3.2/pom.xml#L916
>>> 3.7.0-M11
>>>
>>> On Thu, Feb 3, 2022 at 1:57 PM Amit Sharma <resolve...@gmail.com> wrote:
>>>
>>>> Hello, everyone. I am migrating my Spark Streaming job to Spark 3.1. I
>>>> also upgraded the json4s version as below:
>>>>
>>>> libraryDependencies += "org.json4s" %% "json4s-native" % "3.7.0-M5"
>>>>
>>>>
>>>> While running the job I am getting an error for the code below, where
>>>> I am serializing the given inputs.
>>>>
>>>> import org.json4s.ShortTypeHints
>>>> import org.json4s.native.Serialization
>>>>
>>>> implicit val formats = Serialization.formats(ShortTypeHints(List(
>>>>   classOf[ForecastResponse],
>>>>   classOf[OverlayRequest],
>>>>   classOf[FTEResponseFromSpark],
>>>>   classOf[QuotaResponse],
>>>>   classOf[CloneResponse]
>>>> )))
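>>>>
>>>> The formats value is then picked up implicitly when writing, roughly
>>>> like this (a sketch; the response value and its type are placeholders):
>>>>
>>>> val json: String = Serialization.write(response)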
>>>>
>>>>
>>>> Exception in thread "streaming-job-executor-4" 
>>>> java.lang.NoSuchMethodError: 
>>>> org.json4s.ShortTypeHints$.apply$default$2()Ljava/lang/String;
>>>>
>>>> It seems to me to be a jar issue; I am not sure which version of
>>>> json4s-native I should use with Spark 3.1.
>>>>
>>>>
