Hello Trevor,
Kenn is correct that Beam creates no-op Flink sinks. So if a sink isn't
being created, it's possibly a bug in Beam.
Is this a batch or streaming pipeline? Which Flink version are you using?
Kyle
On Thu, Jun 10, 2021 at 3:19 PM Kenneth Knowles wrote:
Beam doesn't use Flink's sink API. I recall from a very long time ago that
we attached a no-op sink to each PCollection to avoid this error. +Kyle
Weaver might know something about how this applies to
Python on Flink.
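For context, the no-op sink described above is conceptually just a transform that consumes every element and emits nothing, so Flink has no dangling, sink-less operator. Below is a standalone sketch of that idea in plain Python (hypothetical class and method names; this is not the runner's actual code):

```python
class NoopSink:
    """Sketch of a no-op sink: consume every element, emit nothing.

    Conceptually, the Flink runner attaches something with this shape to
    each PCollection so Flink does not complain about an unterminated
    operator chain.
    """

    def __init__(self):
        self.consumed = 0  # count inputs, purely for illustration

    def process(self, element):
        self.consumed += 1
        return []  # produce no downstream output


# Feed a few elements through the sink: everything is consumed,
# nothing is emitted.
sink = NoopSink()
outputs = [out for e in ["a", "b", "c"] for out in sink.process(e)]
```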
Kenn
On Wed, Jun 9, 2021 at 4:41 PM Trevor Kramer wrote:
Hello all,
I have posted this on the Flink list as well, but I am posting it here
because we write our jobs in Beam, run them with the Flink runner, and I
don't fully understand the interactions between Beam Coders and Flink
TypeSerializers.
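As a rough illustration of one side of that interaction: a Beam coder's job is to turn each element into bytes and back, and on the Flink runner those bytes are what Flink's serialization machinery ultimately carries. The sketch below shows only that encode/decode contract in plain Python, with a made-up JSON-based coder (no Beam dependency; class and method names are illustrative):

```python
import json


class JsonCoder:
    """Illustrative coder: the encode/decode contract a Beam coder follows.

    encode() maps an element to bytes; decode() inverts it. A coder must
    round-trip every element exactly, because the runner may serialize
    and deserialize elements at any step boundary.
    """

    def encode(self, value):
        return json.dumps(value).encode("utf-8")

    def decode(self, encoded):
        return json.loads(encoded.decode("utf-8"))


coder = JsonCoder()
element = {"user": "example", "count": 3}
round_tripped = coder.decode(coder.encode(element))
```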
We are seeing the following error when restarting the job from a
Hi Tao,
The "limited Spark options" that you mentioned are Beam's application
arguments; if you run your job via spark-submit, you should still be able
to configure the Spark application with the normal spark-submit
"--conf key=value" CLI option.
Does that not work for you?
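As a hypothetical illustration (the jar name, option values, and pipeline options below are made up), Spark-level settings travel via spark-submit's --conf flags, while Beam's own pipeline options follow the application jar:

```shell
# Spark configuration goes through spark-submit itself ...
spark-submit \
  --conf spark.executor.memory=4g \
  --conf spark.executor.instances=10 \
  my-beam-job.jar \
  --runner=SparkRunner \
  --inputFile=/data/input.txt   # ... Beam pipeline options come after the jar
```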
—
Alexey
Good afternoon,
I am using Python 3.7.9 and the Python Beam SDK 2.29.0, under Darwin.
I attempted to run the coders.py cookbook example [1] in a stand-alone
way, but I could not get it to work. When I run it through the
coders_test.py file [2], it works. However, if I try to use it to read from a