Fully agree.

It's what we discussed in the "refactoring Flink runner" PR ;)

Please let me know if I can help there.

Regards
JB

On 10/17/2016 03:10 PM, Aljoscha Krettek wrote:
IMHO, now might be a good time to remove the custom Flink examples that
started out as copies of the Dataflow SDK examples.

On Mon, 17 Oct 2016 at 14:46 Maximilian Michels <m...@apache.org> wrote:

    Hi Trevor,

    The examples used to work, but they don't anymore due to SDK
    changes. Unfortunately, they are not automatically tested. We should
    change that.

    You can make the examples work if you use a bounded source, e.g. set
    `.withMaxNumRecords(100)` on the socket source.
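
(A minimal sketch of what Max suggests, assuming the UnboundedSocketSource
used by the Flink runner examples; the package, hostname, port, and
constructor arguments are illustrative assumptions, not authoritative:)

    import org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSocketSource;
    import org.apache.beam.sdk.io.Read;
    import org.apache.beam.sdk.values.PCollection;

    // Wrap the unbounded socket source in a bounded read: after 100 records
    // the read finishes, so the resulting PCollection is bounded and the
    // downstream Write no longer rejects it. `p` is the Pipeline.
    PCollection<String> lines = p.apply(
        Read.from(new UnboundedSocketSource<>("localhost", 9999, '\n', 3))
            .withMaxNumRecords(100));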

    Cheers,
    Max

    On Sun, Oct 16, 2016 at 10:43 PM, Trevor Grant
    <trevor.d.gr...@gmail.com> wrote:
    > So to be clear-
    >
    > The following currently merged examples aren't supposed to be runnable
    >
    > https://github.com/apache/incubator-beam/blob/f62d04e22679ee2ac19e3ae37dec487d953d51c1/runners/flink/examples/src/main/java/org/apache/beam/runners/flink/examples/streaming/WindowedWordCount.java
    >
    > https://github.com/apache/incubator-beam/blob/f62d04e22679ee2ac19e3ae37dec487d953d51c1/runners/flink/examples/src/main/java/org/apache/beam/runners/flink/examples/streaming/JoinExamples.java
    >
    > Because those throw the same error re: Write / Bounded collections.
    >
    > tg
    >
    > Trevor Grant
    > Data Scientist
    > https://github.com/rawkintrevo
    > http://stackexchange.com/users/3002022/rawkintrevo
    > http://trevorgrant.org
    >
    > "Fortunate is he, who is able to know the causes of things."  -Virgil
    >
    >
    > On Fri, Oct 14, 2016 at 8:09 PM, Kenneth Knowles <k...@google.com> wrote:
    >>
    >> Hi Trevor,
    >>
    >> The problem is that "Write" is an old name that should be changed to
    >> "BoundedWrite" (actually it is even more specific). In fact, it
    >> re-windows into the global window and removes all triggering, so it
    >> is suitable only for bounded PCollections, where this will ensure all
    >> the data arrives at once. Hence the error message.
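
(To make the re-windowing concrete, a rough sketch based only on Kenn's
description above, not on the actual Write source; `input` stands for any
PCollection<String>:)

    import org.joda.time.Duration;
    import org.apache.beam.sdk.transforms.windowing.DefaultTrigger;
    import org.apache.beam.sdk.transforms.windowing.GlobalWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;

    // Per the description: collapse all elements into the global window and
    // drop any custom triggering. The default trigger only fires when the
    // input is complete, which never happens for an unbounded PCollection.
    input.apply(Window.<String>into(new GlobalWindows())
        .triggering(DefaultTrigger.of())
        .discardingFiredPanes()
        .withAllowedLateness(Duration.ZERO));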
    >>
    >> For a streaming write, I'd recommend just performing the write in a
    >> ParDo(DoFn), probably with some prep to get the data ready for
    >> idempotent writing. If you look around the codebase you'll see some
    >> examples of this. Also note that Write isn't a primitive but just a
    >> certain pattern of how you can use ParDo and side inputs.
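
(A minimal sketch of the ParDo-style streaming write Kenn suggests;
writeRecordIdempotently is a hypothetical helper standing in for your real
sink client:)

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    // A per-element write works on unbounded input because nothing here
    // needs to see the whole collection before writing.
    class StreamingWriteFn extends DoFn<String, Void> {
      @ProcessElement
      public void processElement(ProcessContext c) {
        // Hypothetical helper: it must be idempotent, since a runner may
        // retry a bundle and deliver the same element more than once.
        writeRecordIdempotently(c.element());
      }
    }

    // Usage: windowedWords.apply(ParDo.of(new StreamingWriteFn()));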
    >>
    >> Kenn
    >>
    >>
    >> On Fri, Oct 14, 2016, 17:16 Trevor Grant <trevor.d.gr...@gmail.com> wrote:
    >>>
    >>> Hi,
    >>>
    >>> Trying to write a Twitter example on the Flink streaming runner-
    >>>
    >>>
    >>> https://github.com/rawkintrevo/incubator-beam/blob/flink-twitter-example/runners/flink/examples/src/main/java/org/apache/beam/runners/flink/examples/streaming/TwitterWindowedWordCountExamples.java
    >>>
    >>> I'm getting an error:
    >>> Caused by: java.lang.IllegalArgumentException: Write can only be
    >>> applied to a Bounded PCollection
    >>>
    >>> I was nearly copying and pasting from the other word count example;
    >>> everything looks good in IntelliJ. I can't figure out for the life
    >>> of me what I'm doing wrong here.
    >>>
    >>> I am running the job against a Flink cluster (uploaded via the
    >>> web UI), if that is informative...
    >>>
    >>> Thanks! I plan on adding the example back in once it is done (I
    >>> still have to parse the tweets).
    >>>
    >>> tg
    >>>
    >>>
    >>> Trevor Grant
    >>> Data Scientist
    >>> https://github.com/rawkintrevo
    >>> http://stackexchange.com/users/3002022/rawkintrevo
    >>> http://trevorgrant.org
    >>>
    >>> "Fortunate is he, who is able to know the causes of things."
    -Virgil
    >>>
    >


--
Jean-Baptiste Onofré
jbono...@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com
