The FlinkRunner on the release-2.3.0 branch explicitly specifies Flink
version 1.4.0.
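
If memory serves, that pin is a property along these lines in the Flink
runner's pom on the release branch (exact file and property name are worth
double-checking, so treat this as a sketch):

  <properties>
    <!-- Flink version the 2.3.0 FlinkRunner is built and tested against -->
    <flink.version>1.4.0</flink.version>
  </properties>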

I think your initialization action is the culprit: it installs Flink 1.3.1,
which does not match the version the runner was built against.

On Wed, Feb 14, 2018 at 2:00 PM, Eugene Kirpichov <kirpic...@google.com>
wrote:

> Tentative -1.
>
> I tried to validate the quickstart on Flink on a Dataproc cluster and ran
> into a couple of issues.
>
> Here's the script I followed:
> $ curl https://raw.githubusercontent.com/GoogleCloudPlatform/dataproc-initialization-actions/master/flink/flink.sh -o flink.sh
> $ gsutil cp flink.sh gs://my_bucket/flink.sh
> $ gcloud dataproc clusters create kirpichov-flink-230-rc3 \
>     --initialization-actions gs://my_bucket/flink.sh \
>     --initialization-action-timeout 5m \
>     --scopes=https://www.googleapis.com/auth/cloud-platform
> $ mvn package -Pflink-runner
> $ gcloud compute scp --zone "us-central1-a" \
>     target/word-count-beam-bundled-0.1.jar kirpichov-flink-230-rc3-m:~/
> $ gcloud compute ssh --zone "us-central1-a" "kirpichov-flink-230-rc3-m"
> $ sudo su -
> $ cd /home/kirpichov
> $ yarn application -list
> # (get the application id)
> $ HADOOP_CONF_DIR=/etc/hadoop/conf /usr/lib/flink/bin/flink run \
>     -m yarn-cluster -yid (application id) \
>     -c org.apache.beam.examples.WordCount word-count-beam-bundled-0.1.jar \
>     --runner=FlinkRunner \
>     "--inputFile=gs://apache-beam-samples/shakespeare/*" \
>     "--output=gs://my_bucket/word-count-output"
>
> I ran into the following issues:
> 1) java.lang.NoSuchMethodError:
>    com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
> This is, I suppose, because Flink bundles a version of protobuf that
> clashes with ours.
> I resolved this by adding a relocation to the maven-shade-plugin
> configuration in the generated wordcount pom.xml:
>                <relocations>
>                  <relocation>
>                    <pattern>com.google.protobuf</pattern>
>                    <shadedPattern>org.example.shaded.com.google.protobuf</shadedPattern>
>                  </relocation>
>                </relocations>
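> 
> For context, a rough sketch of where that block sits in the quickstart
> pom's maven-shade-plugin section (plugin version, filters and the other
> settings of the generated pom are elided, so treat this as an illustration
> rather than the exact contents):
> 
>   <plugin>
>     <groupId>org.apache.maven.plugins</groupId>
>     <artifactId>maven-shade-plugin</artifactId>
>     <executions>
>       <execution>
>         <phase>package</phase>
>         <goals>
>           <goal>shade</goal>
>         </goals>
>         <configuration>
>           <!-- Relocate protobuf so the bundled classes do not clash with
>                the protobuf version that ships with Flink -->
>           <relocations>
>             <relocation>
>               <pattern>com.google.protobuf</pattern>
>               <shadedPattern>org.example.shaded.com.google.protobuf</shadedPattern>
>             </relocation>
>           </relocations>
>         </configuration>
>       </execution>
>     </executions>
>   </plugin>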
>
> 2) After this (and repeating the instructions starting with "mvn package"),
> I got a different error:
> Caused by: com.typesafe.config.ConfigException$UnresolvedSubstitution:
> reference.conf: 804: Could not resolve substitution to a value:
> ${akka.stream.materializer}
> (full stack: https://gist.github.com/jkff/9fb07e470f81b558e38b1bccfe2e4a99)
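> 
> For the record, one workaround I have seen suggested for this class of
> Akka config errors (untested in this run) is to have the shade plugin
> concatenate the reference.conf files contributed by the various Akka
> modules instead of keeping only one of them, roughly:
> 
>   <transformers>
>     <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
>       <!-- Append every reference.conf on the classpath so substitutions
>            like ${akka.stream.materializer} can resolve -->
>       <resource>reference.conf</resource>
>     </transformer>
>   </transformers>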
>
> The Flink version on the Dataproc cluster is 1.3.1. I'm not sure how to
> proceed. Has anyone else been able to validate the quickstart on Flink on
> YARN?
>
> On Wed, Feb 14, 2018 at 12:24 PM Valentyn Tymofieiev <valen...@google.com>
> wrote:
>
>> +1, Validated Python - Mobile game walkthrough, Python - Quickstart
>> (Streaming Alpha).
>>
>>
>> On Wed, Feb 14, 2018 at 10:42 AM, Alan Myrvold <amyrv...@google.com>
>> wrote:
>>
>>> +1 Validated java quickstarts for direct, dataflow, apex, flink, and
>>> spark.
>>>
>>> On Wed, Feb 14, 2018 at 9:21 AM, Lukasz Cwik <lc...@google.com> wrote:
>>>
>>>> +1 (binding)
>>>> Validated several quickstarts, including the scenario where I
>>>> originally reported a regression with Spark.
>>>>
>>>> On Wed, Feb 14, 2018 at 5:34 AM, Ismaël Mejía <ieme...@gmail.com>
>>>> wrote:
>>>>
>>>>> +1 (binding)
>>>>>
>>>>> Validated SHAs + tag vs the source.zip file.
>>>>> Ran mvn clean install -Prelease, OK.
>>>>> Validated that the 3 regressions reported for RC1 were fixed.
>>>>> Ran Nexmark on the Direct/Flink runners in local mode; no regressions now.
>>>>> Installed the Python version in a virtualenv and ran local wordcount
>>>>> successfully.
>>>>> Checked that the hadoop-input-format artifact is in the extended
>>>>> staging area.
>>>>>
>>>>> On Tue, Feb 13, 2018 at 5:41 PM, Jean-Baptiste Onofré <j...@nanthrax.net>
>>>>> wrote:
>>>>> > +1 (binding)
>>>>> >
>>>>> > Tested the Spark runner (with the wordcount example and Beam samples)
>>>>> > Tested the performance of the direct runner
>>>>> >
>>>>> > I just updated the spreadsheet.
>>>>> >
>>>>> > Regards
>>>>> > JB
>>>>> >
>>>>> > On 02/11/2018 06:33 AM, Jean-Baptiste Onofré wrote:
>>>>> >> Hi everyone,
>>>>> >>
>>>>> >> Please review and vote on release candidate #3 for version 2.3.0, as
>>>>> >> follows:
>>>>> >>
>>>>> >> [ ] +1, Approve the release
>>>>> >> [ ] -1, Do not approve the release (please provide specific comments)
>>>>> >>
>>>>> >>
>>>>> >> The complete staging area is available for your review, which includes:
>>>>> >> * JIRA release notes [1],
>>>>> >> * the official Apache source release to be deployed to dist.apache.org
>>>>> >>   [2], which is signed with the key with fingerprint C8282E76 [3],
>>>>> >> * all artifacts to be deployed to the Maven Central Repository [4],
>>>>> >> * source code tag "v2.3.0-RC3" [5],
>>>>> >> * website pull request listing the release and publishing the API
>>>>> >>   reference manual [6],
>>>>> >> * Java artifacts were built with Maven 3.3.9 and Oracle JDK 1.8.0_111,
>>>>> >> * Python artifacts are deployed along with the source release to
>>>>> >>   dist.apache.org [2].
>>>>> >>
>>>>> >> The vote will be open for at least 72 hours. It is adopted by majority
>>>>> >> approval, with at least 3 PMC affirmative votes.
>>>>> >>
>>>>> >> Thanks,
>>>>> >> JB
>>>>> >>
>>>>> >> [1] https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12319527&version=12341608
>>>>> >> [2] https://dist.apache.org/repos/dist/dev/beam/2.3.0/
>>>>> >> [3] https://dist.apache.org/repos/dist/release/beam/KEYS
>>>>> >> [4] https://repository.apache.org/content/repositories/orgapachebeam-1028/
>>>>> >> [5] https://github.com/apache/beam/tree/v2.3.0-RC3
>>>>> >> [6] https://github.com/apache/beam-site/pull/381
>>>>> >>
>>>>> >
>>>>> > --
>>>>> > Jean-Baptiste Onofré
>>>>> > jbono...@apache.org
>>>>> > http://blog.nanthrax.net
>>>>> > Talend - http://www.talend.com
>>>>>
>>>>
>>>>
>>>
>>
