Hi Antony.
Spark uses serializers to serialize data; however, this clashes with Beam's
concept of coders, so we should be using coders instead of Spark's
serializer (specifically, in our configuration, Kryo is used as Spark's
serializer).
From your stack trace it seems that Kryo is being used for something
related to Combine.globally().
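The serializer-versus-coder distinction can be shown with a stdlib-only sketch; to be clear, `MiniCoder`, `CoderSketch`, and `UTF8` below are invented names for illustration, not Beam API (Beam's real interface is `org.apache.beam.sdk.coders.Coder`). The point is that a coder makes the byte encoding of an element type explicit, rather than deferring to the runner's generic serializer (Kryo, in the Spark configuration discussed above):

```java
import java.nio.charset.StandardCharsets;

// Hypothetical stand-in for Beam's Coder: an explicit, self-contained
// mapping between an element type and its byte encoding.
interface MiniCoder<T> {
    byte[] encode(T value);
    T decode(byte[] bytes);
}

public class CoderSketch {
    // An explicit, deterministic UTF-8 encoding for String elements,
    // analogous in spirit to Beam's StringUtf8Coder. Nothing here depends
    // on the runner's serializer configuration.
    static final MiniCoder<String> UTF8 = new MiniCoder<String>() {
        public byte[] encode(String value) {
            return value.getBytes(StandardCharsets.UTF_8);
        }
        public String decode(byte[] bytes) {
            return new String(bytes, StandardCharsets.UTF_8);
        }
    };

    public static void main(String[] args) {
        byte[] wire = UTF8.encode("beam");
        System.out.println(UTF8.decode(wire)); // prints "beam"
    }
}
```

In real Beam code the equivalent step is attaching a coder to the `PCollection` (e.g. via `setCoder(...)`), so element encoding never falls through to Kryo.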
> Attached is my demo code showing the error.
>
> Thanks,
> a.
>
>
> On Friday, 24 March 2017, 10:19, Aviem Zur <aviem...@gmail.com> wrote:
>
>
> Hi Antony.
>
> Spark uses serializers to serialize data, however this clashes with Beam's
> concept of coders, so we should be using coders instead of Spark's
> serializer.

> Did you see the file I
> sent shortly after the first one? That second one had the Row class
> included (using just "implements Serializable").
>
> Thanks,
> a.
>
>
> On Friday, 24 March 2017, 13:36, Aviem Zur <aviem...@gmail.com> wrote:
>
>
> Hi Antony,
>
>
Hi Antony.
You are correct, PTransform#expand cannot throw checked exceptions. What
you need to do is wrap your checked exception in a runtime exception.
For example:
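A minimal sketch of the wrapping pattern; `Transform`, `lookup`, and the other names are hypothetical stand-ins rather than Beam API, but the overridden method deliberately declares no checked exceptions, just like `PTransform#expand`:

```java
import java.io.IOException;

public class ExpandWrapper {
    // Stand-in for a base class whose overridable method declares no
    // checked exceptions, like PTransform#expand.
    abstract static class Transform {
        abstract String expand(String input); // no "throws" clause allowed
    }

    // A helper that throws a checked exception, e.g. reading configuration.
    static String lookup(String key) throws IOException {
        if (key.isEmpty()) {
            throw new IOException("empty key");
        }
        return key.toUpperCase();
    }

    static final Transform WRAPPING = new Transform() {
        @Override
        String expand(String input) {
            try {
                return lookup(input);
            } catch (IOException e) {
                // The override cannot rethrow IOException from this
                // signature, so wrap it in an unchecked exception.
                throw new RuntimeException(e);
            }
        }
    };

    public static void main(String[] args) {
        System.out.println(WRAPPING.expand("beam")); // prints "BEAM"
    }
}
```

The original cause stays available via `getCause()` when the pipeline surfaces the failure.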
Hi Jayaraman,
Thanks for reaching out.
We run Beam with the Spark runner daily on a YARN cluster.
It appears from many of the logs you sent that there is hanging when
connecting to certain servers on certain ports; could this be a network
issue or an issue with your Spark setup?
Could you please
thank you.
>
> <dependency>
>   <groupId>org.apache.beam</groupId>
>   <artifactId>beam-sdks-java-core</artifactId>
>   <version>0.6.0</version>
> </dependency>
>
> <dependency>
>   <groupId>org.apache.beam</groupId>
>   <artifactId>beam-runners-direct-java</artifactId>
>   <version>0.6.0</version>
>   <scope>runtime</scope>
> </dependency>
>
> <dependency>
>   <groupId>org.apache.beam</groupId>
>   <artifactId>beam-sdks-java-io-kafka</artifactId>
>   <version>0.6.0</version>
> </dependency>
>
> org.apache.beam
> beam-runner
> Done.
> >
> > On Fri, Apr 28, 2017 at 3:32 PM, Andrew Psaltis <
> psaltis.and...@gmail.com>
> > wrote:
> >
> > > Please add me as well. Thanks,
> > >
> > > On Fri, Apr 28, 2017 at 7:59 AM, Anuj Kumar <anujs...@gmail.com>
>
>
> <dependency>
>   <groupId>org.apache.beam</groupId>
>   <artifactId>beam-runners-direct-java</artifactId>
>   <version>0.7.0-SNAPSHOT</version>
>   <scope>runtime</scope>
> </dependency>
>
> <dependency>
>   <groupId>org.apache.beam</groupId>
>   <artifactId>beam-sdks-java-io-kafka</artifactId>
>   <version>0.7.0-SNAPSHOT</version>
> </dependency>
>
> <dependency>
>   <groupId>org.apache.beam</groupId>
>   <artifactId>beam-runners-spark</artifactId>
>   <version>0.7.0-SNAPSHOT</version>
> </dependency>
>
Done
On Sun, Jun 25, 2017 at 4:51 PM Aleksandr wrote:
> Hello,
> Can someone please add me to the slack channel?
>
> Best regards
> Aleksandr Gortujev.
>
>
Hi Antony,
Unbounded views are not currently supported in the Spark runner. The
following ticket tracks the progress for adding support for this:
https://issues.apache.org/jira/browse/BEAM-2112
On Tue, May 30, 2017 at 7:11 PM Jean-Baptiste Onofré
wrote:
> Hi Antony,
>
>
Invitation sent
On Thu, Sep 14, 2017 at 2:14 PM Saiguang Che wrote:
> Hi,
>
> Could you please add me to the Slack channel? Thanks!
>
> Saiguang
>
duct(i, "Product #" + i));
>>>>>> }
>>>>>> }));
>>>>>>
>>>>>> pipeline.run();
>>>>>> }
>>>>>>
>>>>>>
>>>>>> Then it is just a matter of having
Invitation sent.
On Mon, Aug 28, 2017 at 3:31 PM Sobhan Badiozamany <
sobhan.badiozam...@leovegas.com> wrote:
> Hi,
>
> Could you please add me to the slack channel?
>
> Thanks,
> Sobi
>
Nice!
On Mon, Dec 18, 2017 at 12:51 PM Jean-Baptiste Onofré
wrote:
> Hi all,
>
> We are pleased to announce that Spark 2.x support in Spark runner has been
> merged this morning. It supports Spark 2.2.1.
>
> In the same PR, we did update to Scala 2.11, including Flink
Invitation sent.
On Mon, Feb 5, 2018 at 1:43 PM Jose Ignacio Honrado Benítez <
jihonra...@gmail.com> wrote:
> Hello,
>
> I would like to get invited to the Apache Beam Slack.
>
> Thanks in advance!
>
> Cheers
>
Invitation sent
On Tue, Feb 13, 2018 at 5:40 PM Sanjeev Nutan wrote:
> Hi fellow Beamers, please could I have an invite to the slack channel?
> Thanks.
>
Invitation sent
On Tue, Feb 13, 2018 at 5:43 PM ramesh krishnan muthusamy <
ramkrish1...@gmail.com> wrote:
> Hi
>
> Please could I have an invite to the slack channel?
>
> -Ramesh
>