Hi Gyula,

We discovered yesterday that our build process for Scala 2.11 is broken for
the Kafka connector. The reason is that a property value is not properly
resolved and thus pulls in the 2.10 Kafka dependencies. Max already opened
a PR to fix this problem. I hope this will also solve your problem.

Cheers,
Till
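
For reference, the Scala suffix of the Kafka dependency is normally driven by a Maven property along these lines (an illustrative sketch, not the actual Flink POM; the exact property name is an assumption):

```xml
<!-- If this property resolves to the wrong value, the 2.10 artifacts get
     pulled in even for a 2.11 build, which is the symptom described above. -->
<properties>
  <scala.binary.version>2.11</scala.binary.version>
</properties>
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_${scala.binary.version}</artifactId>
  <version>0.8.2.2</version>
</dependency>
```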

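To double-check a build locally, a mixed dependency tree can be flagged mechanically; here is a small sketch (the sample file stands in for real `mvn dependency:tree` output, and the grep patterns are illustrative):

```shell
# Sketch: detect mixed Scala binary versions in Maven dependency output.
# In a real project you would generate the file with:
#   mvn dependency:tree > /tmp/deptree.txt
cat > /tmp/deptree.txt <<'EOF'
[INFO] +- org.scala-lang:scala-library:jar:2.10.4:compile
[INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
EOF
# A 2.10 scala-library next to any _2.11 artifact means the tree is mixed.
if grep -q 'scala-library:jar:2\.10' /tmp/deptree.txt \
   && grep -q '_2\.11' /tmp/deptree.txt; then
  echo "mixed Scala binary versions detected"
fi
```

For this sample input the mix is reported, mirroring the kafka_2.11 artifacts that showed up in the 2.10 build quoted below.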
On Mar 3, 2016 9:36 AM, "Gyula Fóra" <gyula.f...@gmail.com> wrote:
>
> Hey,
>
> Do we have any idea why this is happening in the snapshot repo? We have
> run into the same issue again...
>
> Cheers,
> Gyula
>
> Gyula Fóra <gyula.f...@gmail.com> wrote (on Feb 26, 2016, Fri, 11:17):
>>
>> Thanks Robert, so apparently the snapshot version was screwed up somehow
>> and included the 2.11 dependencies.
>>
>> Now it works.
>>
>> Cheers,
>> Gyula
>>
>> Gyula Fóra <gyula.f...@gmail.com> wrote (on Feb 26, 2016, Fri, 11:09):
>>>
>>> That actually seemed to be the issue; now that I compiled my own
>>> version it doesn't have these wrong jars in the dependency tree...
>>>
>>> Gyula Fóra <gyula.f...@gmail.com> wrote (on Feb 26, 2016, Fri, 11:01):
>>>>
>>>> I was using the snapshot repo in this case, let me try building my own
>>>> version...
>>>>
>>>> Maybe this is interesting:
>>>> mvn dependency:tree | grep 2.11
>>>> [INFO] |  \- org.apache.kafka:kafka_2.11:jar:0.8.2.2:compile
>>>> [INFO] |     +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:compile
>>>> [INFO] |     +- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.2:compile
>>>>
>>>>
>>>> Robert Metzger <rmetz...@apache.org> wrote (on Feb 26, 2016, Fri, 10:56):
>>>>>
>>>>> Are you building 1.0-SNAPSHOT yourself or are you relying on the
>>>>> snapshot repository?
>>>>>
>>>>> We had issues in the past where the jars in the snapshot repo were
>>>>> incorrect.
>>>>>
>>>>> On Fri, Feb 26, 2016 at 10:45 AM, Gyula Fóra <gyula.f...@gmail.com> wrote:
>>>>>>
>>>>>> I am not sure what is happening. I tried running against a Flink
>>>>>> cluster that is definitely running the correct Scala version (2.10) and I
>>>>>> still got the error. So it might be something with the pom.xml but we just
>>>>>> don't see how it is different from the correct one.
>>>>>>
>>>>>> Gyula
>>>>>>
>>>>>> Till Rohrmann <trohrm...@apache.org> wrote (on Feb 26, 2016, Fri, 10:42):
>>>>>>>
>>>>>>> Hi Gyula,
>>>>>>>
>>>>>>> could it be that you compiled against a different Scala version
>>>>>>> than the one you're using for running the job? This usually happens when
>>>>>>> you compile against 2.10 and let it run with version 2.11.
>>>>>>>
>>>>>>> Cheers,
>>>>>>> Till
>>>>>>>
>>>>>>> On Fri, Feb 26, 2016 at 10:09 AM, Gyula Fóra <gyula.f...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> Hey,
>>>>>>>>
>>>>>>>> For one of our jobs we ran into this issue. It's probably some
>>>>>>>> dependency issue, but we can't figure it out, as a very similar setup works
>>>>>>>> without issues for a different program.
>>>>>>>>
>>>>>>>> java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>> at kafka.consumer.FetchRequestAndResponseMetrics.<init>(FetchRequestAndResponseStats.scala:32)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStats.<init>(FetchRequestAndResponseStats.scala:46)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
>>>>>>>> at kafka.utils.Pool.getAndMaybePut(Pool.scala:61)
>>>>>>>> at kafka.consumer.FetchRequestAndResponseStatsRegistry$.getFetchRequestAndResponseStats(FetchRequestAndResponseStats.scala:63)
>>>>>>>> at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
>>>>>>>> at kafka.javaapi.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:34)
>>>>>>>> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:518)
>>>>>>>> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
>>>>>>>> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:193)
>>>>>>>> at com.king.deduplo.source.EventSource.readRawInput(EventSource.java:46)
>>>>>>>> at com.king.deduplo.DeduploProgram.main(DeduploProgram.java:33)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>>>> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
>>>>>>>> at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
>>>>>>>> at org.apache.flink.client.program.Client.runBlocking(Client.java:248)
>>>>>>>> at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:866)
>>>>>>>> at org.apache.flink.client.CliFrontend.run(CliFrontend.java:333)
>>>>>>>> at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1189)
>>>>>>>> at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1239)
>>>>>>>>
>>>>>>>> Any insights?
>>>>>>>>
>>>>>>>> Cheers,
>>>>>>>> Gyula
