Can you please explain a bit more about the last option? We are using YARN,
so Guava might be on some classpath.
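
As a quick check, something like this should show whether a Guava jar is
already on the YARN classpath ("yarn classpath" is the standard Hadoop CLI
command; the grep is only illustrative):

    yarn classpath | tr ':' '\n' | grep -i guava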

On Thu, Aug 11, 2016 at 1:29 AM, Robert Metzger <rmetz...@apache.org> wrote:

> Can you check if the jar you are submitting to the cluster contains a
> different Guava version than the one you use at compile time?
>
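> For example, a rough check (yourjob.jar is a placeholder for your actual
> fat jar; if Guava was bundled by Maven, its pom.properties usually survives
> and reveals the version):
>
>     jar tf yourjob.jar | grep 'com/google/common' | head
>     unzip -p yourjob.jar META-INF/maven/com.google.guava/guava/pom.properties
>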
> Also, it might happen that Guava is already on your classpath, for example
> on some YARN setups.
>
> The last resort for resolving these issues is to use the maven-shade-plugin
> and relocate the Guava version you need into your own namespace.
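>
> A minimal sketch of such a relocation (the plugin coordinates and the
> com.google.common pattern are the usual ones for Guava; the shadedPattern
> prefix below is just an example, pick your own):
>
>     <plugin>
>       <groupId>org.apache.maven.plugins</groupId>
>       <artifactId>maven-shade-plugin</artifactId>
>       <executions>
>         <execution>
>           <phase>package</phase>
>           <goals><goal>shade</goal></goals>
>           <configuration>
>             <relocations>
>               <relocation>
>                 <pattern>com.google.common</pattern>
>                 <shadedPattern>com.olacabs.shaded.com.google.common</shadedPattern>
>               </relocation>
>             </relocations>
>           </configuration>
>         </execution>
>       </executions>
>     </plugin>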
>
> On Wed, Aug 10, 2016 at 9:56 PM, Janardhan Reddy <
> janardhan.re...@olacabs.com> wrote:
>
>> #1 is thrown from user code.
>>
>> We use Hadoop 2.7, which uses Guava 11.0.2, but our application uses 18.0.
>> I think Hadoop's Guava is getting picked up instead of ours.
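>>
>> One way to confirm which Guava is being loaded (a hypothetical snippet to
>> drop into the job; getCodeSource() can be null for bootstrap classes, but
>> for a Guava jar it normally prints the jar's location):
>>
>>     System.out.println(
>>         com.google.common.io.Resources.class
>>             .getProtectionDomain().getCodeSource().getLocation());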
>>
>> On Thu, Aug 11, 2016 at 1:24 AM, Robert Metzger <rmetz...@apache.org>
>> wrote:
>>
>>> Hi Janardhan,
>>>
>>> #1 Is the exception thrown from your user code, or from Flink?
>>>
>>> #2 is most likely caused by a compiler / runtime version mismatch:
>>> http://stackoverflow.com/questions/10382929/how-to-fix-java-lang-unsupportedclassversionerror-unsupported-major-minor-versi
>>> You compiled the code with Java 8, but you are trying to run it with an
>>> older JVM.
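>>>
>>> A quick way to verify (class file major version 52 = Java 8, 51 = Java 7;
>>> yourjob.jar is a placeholder for the submitted jar):
>>>
>>>     java -version    # run this on the cluster machines
>>>     javap -verbose -cp yourjob.jar com.olacabs.fabric.common.Metadata | grep major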
>>>
>>> On Wed, Aug 10, 2016 at 9:46 PM, Janardhan Reddy <
>>> janardhan.re...@olacabs.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> We are getting the following errors when submitting Flink jobs to the
>>>> cluster.
>>>>
>>>> 1. Caused by: java.lang.NoSuchMethodError:
>>>> com.google.common.io.Resources.asCharSource
>>>>
>>>> 2. This one is for an entirely different job:
>>>> Caused by: java.lang.UnsupportedClassVersionError:
>>>> com/olacabs/fabric/common/Metadata : Unsupported major.minor version 52.0
>>>>
>>>> But when we run Flink locally, neither job produces an error.
>>>>
