See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/10/display/redirect?page=changes>

Changes:

[git] Remove optionality and add sensible defaults to PubsubIO builders.

[jkai] [BEAM-8331] rewrite the calcite JDBC urls

[boyuanz] Update verify_release_build script to run python tests with dev version.

[robertwb] Supporting infrastructure for dataframes on beam.

[robertwb] Basic deferred data frame implementation.

[robertwb] yapf, py2

[filiperegadas] Add BigQuery useAvroLogicalTypes option

[filiperegadas] fixup! Add BigQuery useAvroLogicalTypes option

[jvilcek] [BEAM-9360] Fix equivalence check for FieldType

[github] typings and docs for expressions.py

[chamikara] Logs BQ insert failures

[iemejia] [BEAM-9384] Add SchemaRegistry.getSchemaCoder to get SchemaCoders for

[lcwik] [BEAM-9397] Pass all but output receiver parameters to start

[kcweaver] [BEAM-9401] bind Flink MiniCluster to localhost

[sunjincheng121] [BEAM-9288] Not bundle conscrypt in gRPC vendor

[mxm] [BEAM-9345] Fix source of test flakiness in FlinkSubmissionTest

[kamil.wasilewski] Add integration test for AnnotateImage transform

[github] Add integration test for AnnotateText transform (#10977)

[chadrik] [BEAM-9405] Fix post-commit error about create_job_service

[chadrik] more typing fixes

[chadrik] Fix typing issue with python 3.5.2

[chadrik] fixes

[chadrik] Address more issues discovered after rebase

[chadrik] Improve the idiom used for conditional imports

[chadrik] Fix more issues

[chadrik] Update to latest mypy version

[amaliujia] Moving to 2.21.0-SNAPSHOT on master branch.

[github] [BEAM-8487] Handle nested forward references (#10932)

[github] [BEAM-9287] Add Postcommit tests for dataflow runner v2  (#10998)

[chadrik] [BEAM-7746] Runtime change to timestamp/duration equality

[github] Adds DisplayData for StateSpecs used by stateful ParDos

[iemejia] Fix non correctly formatted class in sdks/java/core

[iemejia] [BEAM-9342] Update bytebuddy to version 1.10.8

[aromanenko.dev] [BEAM-8925] Tika version update to 1.23

[12602502+Ardagan] [BEAM-8327] Override Gradle cache for community metrics prober

[ehudm] Reduce warnings in pytest runs.

[heejong] [BEAM-9415] fix postcommit xvr tests

[github] Merge pull request #10968 from [BEAM-9381] Adding display data to

[github] [BEAM-8335] Add PCollection to DataFrame logic for InteractiveRunner.

[robertwb] Remove excessive logging.

[github] [BEAM-2939] Java UnboundedSource SDF wrapper (#10897)

[iemejia] [website] Update link to environment_type (SDK harness configuration)

[iemejia] Fix typo on python code

[kamil.wasilewski] Fix: skip test if GCP dependencies are not installed

[fernandodiaz] [BEAM-9424] Allow grouping by LogicalType

[github] Revert "[BEAM-8335] Add PCollection to DataFrame logic for

[echauchot] Add metrics export to documentation on the website.

[github] [BEAM-8382] Add rate limit policy to KinesisIO.Read (#9765)

[lcwik] [BEAM-9288] Bump version number vendored gRPC build.

[chadrik] [BEAM-9274] Support running yapf in a git pre-commit hook

[rohde.samuel] [BEAM-8335] Add PCollection to Dataframe logic for InteractiveRunner.

[github] [BEAM-8575] Modified trigger test to work for different runners.

[github] [BEAM-9413] fix beam_PostCommit_Py_ValCon (#11023)

[rohde.samuel] ReverseTestStream Implementation

[github] Update lostluck's info on the Go SDK roadmap

[suztomo] Google-cloud-bigquery 1.108.0

[github] [BEAM-9432] Move expansion service into its own project. (#11035)

[ehudm] [BEAM-3713] Remove nosetests from tox.ini

[github] Merge pull request #11025: [BEAM-6428] Improve select performance with

[github] Switch contact email to apache.org.

[github] [BEAM-6374] Emit PCollection metrics from GoSDK (#10942)

[amaliujia] [BEAM-9288] Not bundle conscrypt in gRPC vendor in META-INF/

[kcweaver] [BEAM-9448] Fix log message for job server cache.

[github] Update container image tags used by Dataflow runner for Beam master

[github] [BEAM-8328] Disable community metrics integration test in 'test' task

[iemejia] [BEAM-9450] Update www.apache.org/dist/ links to downloads.apache.org

[iemejia] [BEAM-9450] Convert links available via https to use https

[github] Add integration test for AnnotateVideoWithContext transform (#10986)

[lcwik] [BEAM-9452] Update classgraph to latest version to resolve windows

[hktang] [BEAM-9453] Changed new string creation to use StandardCharsets.UTF_8

[chuck.yang] Use Avro format for file loads to BigQuery

[jkai] [Hotfix] fix rabbitmq spotless check

[kcweaver] Downgrade cache log level from warn->info.

[github] Revert "[BEAM-6374] Emit PCollection metrics from GoSDK (#10942)"

[github] Merge pull request #11032 from [BEAM-8335] Display rather than logging

[github] Fix a bug in performance test for reading data from BigQuery (#11062)

[suztomo] grpc 1.27.2 and gax 1.54.0

[suztomo] bigquerystorage 0.125.0-beta

[apilloud] [BEAM-9463] Bump ZetaSQL to 2020.03.1

[lcwik] [BEAM-2939, BEAM-9458] Add deduplication transform for SplittableDoFns

[lcwik] [BEAM-9464] Fix WithKeys to respect parameterized types

[ankurgoenka] [BEAM-9465] Fire repeatedly in reshuffle

[lcwik] [BEAM-2939, BEAM-9458] Use deduplication transform for UnboundedSources

[echauchot] Fix wrong generated code comment.

[github] [BEAM-9396] Fix Docker image name in CoGBK test for Python on Flink

[lcwik] [BEAM-9288] Update to use vendored gRPC without shaded conscrypt

[github] [BEAM-9319] Clean up start topic in TestPubsubSignal (#11072)

[lcwik] [BEAM-2939] Follow-up on comment in pr/11065

[lcwik] [BEAM-9473] Dont copy over META-INF index/checksum/signing files during

[apilloud] [BEAM-9411] Enable BigQuery DIRECT_READ by default in SQL

[hannahjiang] update CHANGE.md for 2.20

[lcwik] [BEAM-9475] Fix typos and shore up expectations on type

[rohde.samuel] [BEAM-8335] TestStreamService integration with DirectRunner

[github] [BEAM-7926] Update Data Visualization (#11020)

[ankurgoenka] [BEAM-9402] Remove options overwrite

[chadrik] Add pre-commit hook for pylint

[github] Additional new Python Katas (#11078)

[github] [BEAM-9478] Update samza runner page to reflect post 1.0 changes

[suztomo] grpc-google-cloud-pubsub-v1 1.85.1

[pabloem] Updating BigQuery client APIs

[github] [BEAM-9481] Exclude signature files from expansion service test

[github] Install typing package only for Python < 3.5.3 (#10821)

[heejong] [BEAM-9056] Staging artifacts from environment

[sunjincheng121] [BEAM-9295] Add Flink 1.10 build target and Make FlinkRunner compatible

[ankurgoenka] [BEAM-9485] Raise error when transform urn is not implemented

[12602502+Ardagan] [BEAM-9431] Remove ReadFromPubSub/Read-out0-ElementCount from the

[github] Update Python roadmap for 2.7 eol

[mxm] [BEAM-9474] Improve robustness of BundleFactory and ProcessEnvironment

[github] [BEAM-7815] update MemoryReporter comments about using guppy3 (#11073)

[rohde.samuel] [BEAM-8335] Modify the StreamingCache to subclass the CacheManager

[sunjincheng121] [BEAM-9298] Drop support for Flink 1.7

[github] Fixing apache_beam.io.gcp.bigquery_test:PubSubBigQueryIT. at head

[mxm] [BEAM-9490] Guard referencing for environment expiration via a lock

[github] Verify schema early in ToJson and JsonToRow (#11105)

[lcwik] [BEAM-9481] fix indentation

[github] Merge pull request #11103 from [BEAM-9494] Reifying outputs from BQ file

[github] [BEAM-8335] Implemented Capture Size limitation (#11050)

[github] [BEAM-9294] Move RowJsonException out of RowJsonSerializer (#11102)

[github] Merge pull request #11046: [BEAM-9442] Properly handle nullable fields

[ankurgoenka] [BEAM-9287] disable validates runner test which uses teststreams for

[sunjincheng121] [BEAM-9299-PR] Upgrade Flink Runner 1.8x to 1.8.3 and 1.9x to 1.9.2

[lcwik] [BEAM-2939] Implement interfaces and concrete watermark estimators

[ankurgoenka] [BEAM-9499] Sickbay test_multi_triggered_gbk_side_input for streaming

[robertwb] Minor cleanup, lint.

[robertwb] [BEAM-9433] Create expansion service artifact for common Java IOs.

[thw] [BEAM-9490] Use the lock that belongs to the cache when bundle load

[github] Update Dataflow py container version (#11120)

[github] [BEAM-7923] Streaming support and pipeline pruning when instrumenting a

[github] Update default value in Java snippet

[ankurgoenka] [BEAM-9504] Sickbay streaming test for batch VR

[rohde.samuel] [BEAM-8335] Final PR to merge the InteractiveBeam feature branch

[github] [BEAM-9477] RowCoder should be hashable and picklable (#11088)

[apilloud] [BEAM-8057] Reject Infinite or NaN literals at parse time

[robertwb] Log in a daemon thread.

[thw] [BEAM-8815] Skip removal of manifest when no artifacts were staged.

[github] [BEAM-9346] Improve the efficiency of TFRecordIO (#11122)

[kawaigin] [BEAM-8335] Refactor IPythonLogHandler

[apilloud] [BEAM-8070] Preserve type for empty array

[github] Merge pull request #10991 [BEAM-3301] Refactor DoFn validation & allow

[github] Update dataflow py container ver to 20200317 (#11145)


------------------------------------------
[...truncated 41.54 KB...]
ac3e2c206c49: Layer already exists
3663b7fed4c9: Layer already exists
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
c7f27a4eb870: Layer already exists
e70dfb4c3a48: Layer already exists
1c76bd0dc325: Layer already exists
c3881ea6fdcf: Pushed
c0f158bb7e27: Pushed
80a789adf151: Pushed
db164c127812: Pushed
latest: digest: sha256:2da04d75aee454dae2e5e58b5ae470cf8c7301c3b397b3ed34366229805a4d44 size: 3470
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-repository-root=gcr.io/apache-beam-testing/beam_portability -Pdocker-tag=latest :runners:flink:1.10:job-server-container:docker
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.10:copyResourcesOverrides NO-SOURCE
> Task :runners:flink:1.10:job-server:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:flink:1.10:job-server-container:copyLicenses
> Task :runners:flink:1.10:job-server-container:dockerClean UP-TO-DATE
> Task :runners:flink:1.10:copySourceOverrides
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :runners:flink:1.10:copyTestResourcesOverrides NO-SOURCE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :runners:flink:1.10:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :runners:flink:1.10:job-server:shadowJar
> Task :runners:flink:1.10:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.10:job-server-container:dockerPrepare
> Task :runners:flink:1.10:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 57s
61 actionable tasks: 17 executed, 6 from cache, 38 up-to-date

Publishing build scan...
https://gradle.com/s/x4b4lh3n5rbfg

[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe 
/tmp/jenkins8998135900036901681.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe 
/tmp/jenkins4942192846555960994.sh
+ docker tag 
gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server 
gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe 
/tmp/jenkins5272199491344354398.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe 
/tmp/jenkins2493098167671188565.sh
+ docker push 
gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
The push refers to repository 
[gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server]
cc5b8f6da91b: Preparing
09e4376309cb: Preparing
9249809e65e4: Preparing
6b84f79b6d95: Preparing
3c9a565ae0aa: Preparing
ac3e2c206c49: Preparing
3663b7fed4c9: Preparing
832f129ebea4: Preparing
6670e930ed33: Preparing
c7f27a4eb870: Preparing
e70dfb4c3a48: Preparing
1c76bd0dc325: Preparing
6670e930ed33: Waiting
e70dfb4c3a48: Waiting
1c76bd0dc325: Waiting
ac3e2c206c49: Waiting
3663b7fed4c9: Waiting
832f129ebea4: Waiting
cc5b8f6da91b: Pushed
09e4376309cb: Pushed
ac3e2c206c49: Layer already exists
3663b7fed4c9: Layer already exists
832f129ebea4: Layer already exists
6670e930ed33: Layer already exists
9249809e65e4: Pushed
e70dfb4c3a48: Layer already exists
c7f27a4eb870: Layer already exists
1c76bd0dc325: Layer already exists
3c9a565ae0aa: Pushed
6b84f79b6d95: Pushed
latest: digest: sha256:d5e9223d88d8120b61f4bd59acd787e226bb738624e267241ba71d2b351295ef size: 2841
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-java-portable-flink-streaming-10
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-streaming-10
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe 
/tmp/jenkins3214212142738852914.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe 
/tmp/jenkins4749361340216384720.sh
+ cd 
<https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-java-portable-flink-streaming-10-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ TASK_MANAGER_MEM=10240
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh 
init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
+ create_cluster
+ local 
metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ 
metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest ]]
+ 
metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest
+ [[ -n 
gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ 
metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create 
beam-loadtests-java-portable-flink-streaming-10 --region=global --num-workers=6 
--initialization-actions 
gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh
 --metadata 
flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.0/flink-1.10.0-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest,
 --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation 
[projects/apache-beam-testing/regions/global/operations/6860445a-030a-33a0-9077-8e2b73d3c02e].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 
1TB or larger to ensure consistently high I/O performance. See 
https://cloud.google.com/compute/docs/disks/performance for information on disk 
I/O performance.
...........
WARNING: Cluster beam-loadtests-java-portable-flink-streaming-10 failed to create. Beginning automated resource cleanup process.
done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/6860445a-030a-33a0-9077-8e2b73d3c02e] failed: Initialization action timed out.
Failed action 'gs://beam-flink-cluster/init-actions/flink.sh', see output in:
gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d5abfe1c-6208-464d-a2fc-36bd9314a4ff/beam-loadtests-java-portable-flink-streaming-10-m/dataproc-initialization-script-2_output.
Build step 'Execute shell' marked build as failure
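
The failed init action's output referenced above lives in GCS; assuming read access to the apache-beam-testing buckets, it can be fetched with gsutil, for example:

  gsutil cat gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d5abfe1c-6208-464d-a2fc-36bd9314a4ff/beam-loadtests-java-portable-flink-streaming-10-m/dataproc-initialization-script-2_output

That log should show why 'gs://beam-flink-cluster/init-actions/flink.sh' timed out during cluster creation.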
