I don't understand.

I've tried to run it from the IDE. Could you explain how you run it?
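For reference, outside the IDE a pipeline like this is usually launched with something along the lines of the command below. The main class is only a placeholder, and the runner's name and options depend on the Beam snapshot in use (the Spark runner has been called both SparkPipelineRunner and SparkRunner at different points), so treat this as a sketch rather than a known-good command:

  mvn compile exec:java \
    -Dexec.mainClass=com.example.MyPipeline \
    -Dexec.args="--runner=SparkPipelineRunner --sparkMaster=local[2]"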

2016-06-03 0:08 GMT+02:00 Amit Sela <[email protected]>:

> Are you testing under the test package?
>
> On Fri, Jun 3, 2016, 01:06 Pawel Szczur <[email protected]> wrote:
>
>> I've just tested it; it throws an exception with the nightly Beam:
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/spark/api/java/JavaSparkContext
>>
>>
>> 2016-06-01 14:23 GMT+02:00 Ismaël Mejía <[email protected]>:
>>
>>> You can find the pom.xml for my beam-playground project, which runs on
>>> the Spark runner. Note that it uses the daily snapshots, but I will switch
>>> to the released jars once they are on Maven Central:
>>>
>>> https://github.com/iemejia/beam-playground/blob/master/pom.xml
>>>
>>> Regards,
>>> Ismaël
>>>
>>> On Wed, Jun 1, 2016 at 3:01 AM, Pawel Szczur <[email protected]>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> A Beam pipeline can easily be configured to run against the Direct,
>>>> Dataflow and Flink runners, but I couldn't get Spark to work.
>>>>
>>>> Here's a repo I've prepared to reproduce the problem; it may also serve
>>>> as a starting point:
>>>> https://github.com/orian/cogroup-wrong-grouping
>>>>
>>>> Could someone modify it or share a working example (outside of the Beam
>>>> repo)?
>>>>
>>>> Cheers, Pawel
>>>>
>>>
>>>
>>
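The NoClassDefFoundError quoted above means the Spark core classes are not on the runtime classpath when the pipeline starts. As a rough sketch, the pom.xml usually needs something like the block below in addition to the Beam SDK; the artifact ids and versions here are guesses for the snapshot builds of that period, so the beam-playground pom.xml linked above is the reference to trust:

  <!-- Beam Spark runner; version is a guess, match it to your Beam snapshot -->
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-spark</artifactId>
    <version>0.1.0-incubating-SNAPSHOT</version>
  </dependency>
  <!-- Spark itself is often scoped "provided" by the runner, so it may need
       to be declared explicitly when running from an IDE or plain java -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.1</version>
  </dependency>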
