See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/2/display/redirect?page=changes>

Changes:

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes from

[mzobii.baig] Beam-2535 : Replace timeStamp with outputTimeStamp

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] Beam-2535 : Pass outputTimestamp param in onTimer method

[mzobii.baig] Beam-2535 : Minor changes

[rehman.muradali] [BEAM-2535] : Add Commit State in ParDoEvaluator

[rehman.muradali] [BEAM-2535] : Add outputTimestamp in compare method, Revert

[mzobii.baig] Beam-2535 : Modifying default minimum target and GC time

[rehman.muradali] BEAM-2535 : Removal of extra lines

[mzobii.baig] Beam-2535 : Proposed changes

[mzobii.baig] Beam-2535 : Added original PR watermark hold functionality.

[rehman.muradali] [BEAM-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Variable renaming and added output timestamp in

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] [Beam-2535] Modify test case

[mzobii.baig] [Beam-2535] Added comments

[mzobii.baig] [Beam-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Set Processing Time with outputTimestamp

[mzobii.baig] [Beam-2535] Minor renaming

[rehman.muradali] [BEAM-2535] Revert Processing Time, Addition of OutputTimestamp

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Modify AggregateProjectMergeRule to have a condition

[ehudm] [BEAM-8269] Convert from_callable type hints to Beam types

[kirillkozlov] SpotlessApply

[kirillkozlov] Test for a query with a predicate

[kirillkozlov] A list of visited nodes should be unique per onMatch invocation

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Make sure all nodes are explored

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Max

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[kirillkozlov] Add a new Jenkins job for SQL perf tests

[kirillkozlov] Test boilerplate

[rehman.muradali] Adding OutputTimestamp in Timer Object

[rehman.muradali] Apply Spotless and checkstyle

[kirillkozlov] Table proxy to add TimeMonitor after the IO

[kirillkozlov] Tests for direct_read w/o push-down and default methods

[mzobii.baig] [Beam-2535] Added watermark functionality for the dataflow runner

[kenn] Use more informative assertions in some py tests

[mzobii.baig] [Beam-2535] Used boolean instead of boxed type

[kirillkozlov] Cleanup

[dcavazos] [BEAM-7390] Add code snippet for Sum

[mzobii.baig] [Beam-2535] Modify required watermark hold functionality

[kirillkozlov] Monitor total number of fields read from an IO

[ehudm] Fix _get_args for typing.Tuple in Py3.5.2

[kcweaver] Add FlinkMiniClusterEntryPoint for testing the uber jar submission

[kcweaver] [BEAM-8512] Add integration tests for flink_runner.py.

[kcweaver] Build mini cluster jar excluding unwanted classes.

[kcweaver] Rename to testFlinkUberJarPyX.Y

[kcweaver] Increase timeout on beam_PostCommit_PortableJar_Flink.

[kamil.wasilewski] [BEAM-1440] Provide functions for waiting for BQ job and exporting

[kamil.wasilewski] [BEAM-1440] Create _BigQuerySource that implements iobase.BoundedSource

[kamil.wasilewski] [BEAM-1440] Reorganised BigQuery read IT tests

[kamil.wasilewski] [BEAM-1440] Create postCommitIT jobs running on Flink Runner

[kamil.wasilewski] [BEAM-1440] Convert strings to bytes on Python 3 if field type is BYTES

[kamil.wasilewski] [BEAM-1440]: Support RECORD fields in coder

[kamil.wasilewski] [BEAM-1440] Remove json files after reading

[kamil.wasilewski] [BEAM-1440] Marked classes as private

[kamil.wasilewski] [BEAM-1440] Do not force to create temp dataset when using dry run

[echauchot] [BEAM-5192] Migrate ElasticsearchIO to v7

[echauchot] [BEAM-5192] Minor change of ESIO public configuration API:

[robinyqiu] BeamZetaSqlCalcRel prototype

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[kamil.wasilewski] [BEAM-8671] Add Python 3.7 support for LoadTestBuilder

[kamil.wasilewski] [BEAM-8671] Add ParDo test running on Python 3.7

[ehudm] Fix cleanPython race with :clean

[robinyqiu] Fix bug in SingleRowScanConverter

[robinyqiu] Use BeamBigQuerySqlDialect

[boyuanz] [BEAM-8536] Migrate using requested_execution_time to

[pabloem] Initialize logging configuration in Pipeline object

[daniel.o.programmer] [BEAM-7970] Touch-up on Go protobuf generation instructions.

[kamil.wasilewski] [BEAM-8979] Remove mypy-protobuf dependency

[echauchot] [BEAM-5192] Fix missing ifs for ES7 specificities.

[echauchot] [BEAM-5192] Remove unneeded transitive dependencies, upgrade ES and

[echauchot] [BEAM-5192] Disable MockHttpTransport plugin to enable http dialog to

[mikhail] Update release docs

[relax] Merge pull request #10311: [BEAM-8810] Detect stuck commits in

[kcweaver] Import freezegun for Python time testing.

[kcweaver] Allow message stream to yield duplicates.

[mikhail] Blogpost stub

[kcweaver] [BEAM-8891] Create and submit Spark portable jar in Python.

[kcweaver] [BEAM-8296] containerize spark job server

[robinyqiu] Address comments

[github] [GoSDK] Make data channel splits idempotent (#10406)

[pabloem] Initialize logging configuration in PipelineOptions object.

[rehman.muradali] EarliestTimestamp Fix for outputTimestamp

[lukasz.gajowy] [BEAM-5495] Make PipelineResourcesDetectorAbstractFactory an inner

[lukasz.gajowy] [BEAM-5495] Change detect() return type to List

[lukasz.gajowy] [BEAM-5495] Minor docs and test fixes

[mxm] [BEAM-8959] Invert metrics flag in Flink Runner

[lukasz.gajowy] [BEAM-5495] Re-add test verifying order of resources detection

[echauchot] [BEAM-5192] Fix util class, elasticsearch changed their json output of

[mikhail] Add blogpost file

[tysonjh] CachingShuffleBatchReader use bytes to limit size.

[mikhail] Add blogpost highlights

[github] Update release guide for cherry picks (#10399)

[heejong] [BEAM-8902] parameterize input type of Java external transform

[lukasz.gajowy] [BEAM-5495] Prevent nested jar scanning (jarfiles in jarfiles)

[ehudm] Dicts are not valid DoFn.process return values

[github] Update release notes version to correct one.

[valentyn] Sickbay VR tests that don't pass

[chamikara] Makes environment ID a top level attribute of PTransform.

[angoenka] [BEAM-8944] Change to use single thread in py sdk bundle progress report

[aaltay] [BEAM-8335] Background caching job (#10405)

[ehudm] Light cleanup of opcodes.py

[chamikara] Setting environment ID for ParDo and Combine transforms

[pawel.pasterz] [BEAM-8978] Publish table size of data written during HadoopFormatIOIT

[echauchot] [BEAM-5192] Set a custom json serializer for document metadata to be

[echauchot] [BEAM-5192] Remove testWritePartialUpdateWithErrors because triggering

[sunjincheng121] [BEAM-7949] Add time-based cache threshold support in the data service

[mxm] [BEAM-8996] Auto-generate pipeline options documentation for FlinkRunner

[mxm] Regenerate Flink options table with the latest master

[mxm] [BEAM-8996] Improvements to the Flink runner page

[ehudm] Upgrade parameterized version to 0.7.0+

[kawaigin] [BEAM-8977] Resolve test flakiness

[dpcollins] Modify PubsubClient to use the proto message throughout.

[lukecwik] [BEAM-9004] Migrate org.mockito.Matchers#anyString to

[suztomo] GenericJsonAssert

[suztomo] Refactoring with assertEqualsAsJson

[chamikara] Fixes Go formatting.

[robertwb] [BEAM-8335] Add a TestStreamService Python Implementation (#10120)

[lukecwik] Minor cleanup of tests using TestStream. (#10188)

[pabloem] [BEAM-2572] Python SDK S3 Filesystem (#9955)

[github] [BEAM-8974] Wait for log messages to be processed before checking them.

[sunjincheng121] [BEAM-7949] Introduce PeriodicThread for time-based cache threshold

[github] Merge pull request #10356: [BEAM-7274] Infer a Beam Schema from a

[echauchot] [BEAM-9019] Improve Encoders: replace as much as possible of catalyst

[lukecwik] [BEAM-8623] Add status_endpoint field to provision api ProvisionInfo

[github] [BEAM-8999] Respect timestamp combiners in PGBKCVOperation. (#10425)

[github] Update dataflow container images to beam-master-20191220 (#10448)

[zyichi] [BEAM-8824] Add support to allow specify window allowed_lateness in

[chamikara] Adds documentation for environment_id fields.

[lukecwik] [BEAM-8846] Update documentation about stream observers and factories,

[apilloud] [BEAM-9023] Upgrade to ZetaSQL 2019.12.1

[lukecwik] [BEAM-7951] Allow runner to configure customization WindowedValue coder.

[chamikara] Sets missing environmentId in several locations.

[bhulette] [BEAM-8988] RangeTracker for _CustomBigQuerySource (#10412)

[ehudm] Set TMPDIR for tox environments

[robinyqiu] Address comments

[robinyqiu] Address comments

[kcweaver] Refactor shared uber jar generation code into common subclass.

[sunjincheng121] [BEAM-8935] Fail fast if sdk harness startup failed.

[echauchot] [BEAM-5192] use <= and >= in version specific code instead of == to be

[relax] Merge pull request #10444: [BEAM-9010] Proper TableRow size calculation

[github] Merge pull request #10449: [BEAM-7274] Implement the Protobuf schema

[sunjincheng121] [BEAM-9030] Bump grpc to 1.26.0

[github] Python example parameters fix

[bhulette] [BEAM-9026] Clean up RuntimeValueProvider.runtime_options (#10457)

[kirillkozlov] Fix BytesValue unparsing

[kirillkozlov] Fix floating point literals

[kirillkozlov] Fix string literals

[kirillkozlov] Add null check for SqlTypeFamily

[kirillkozlov] ZetaSqlCalcRule should be disabled by default

[kirillkozlov] spotless

[ehudm] [BEAM-9025] Update Dataflow Java container worker

[heejong] [BEAM-9034] Update environment_id for ExternalTransform in Python SDK

[sunjincheng121] [BEAM-9030] Update the dependencies to make sure the dependency linkage

[mxm] [BEAM-8962] Report Flink metric accumulator only when pipeline ends

[github] Revert "[BEAM-8932]  Modify PubsubClient to use the proto message

[ehudm] junitxml_report: Add failure tag support

[github] Catch __module__ is None.

[relax] Merge pull request #10422: [BEAM-2535] TimerData signature update

[rehman.muradali] Rebase TimerData PR

[udim] [BEAM-9012] Change __init__ hints so they work with pytype (#10466)

[github] [BEAM-9039] Fix race on reading channel readErr. (#10456)

[lcwik] [BEAM-5605] Increase precision of fraction used during splitting.

[github] [BEAM-8487] Convert forward references to Any (#9888)

[lukecwik] [BEAM-9020] LengthPrefixUnknownCodersTest to avoid relying on

[lukecwik] [BEAM-7951] Improve the docs for beam_runner_api.proto and

[sunjincheng121] [BEAM-9006] Improve ProcessManager for shutdown hook handling.

[kamil.wasilewski] [BEAM-8671] Fix Python 3.7 ParDo test job name

[github] [BEAM-5600] Add unimplemented split API to Runner side SDF libraries.

[github] [BEAM-5605] Fix type used to describe channel splits to match type used

[github] [BEAM-5605] Ensure that split calls are routed to the active bundle

[suztomo] protobuf 3.11.1

[jeff] BEAM-8745 More fine-grained controls for the size of a BigQuery Load job

[kcweaver] Make Spark REST URL a separate pipeline option.

[kirillkozlov] Address comments

[aaltay] [BEAM-8335] On Unbounded Source change (#10442)

[aaltay] [BEAM-9013] TestStream fix for DataflowRunner (#10445)

[angoenka] [BEAM-8575] Refactor test_do_fn_with_windowing_in_finish_bundle to work

[sunjincheng121] [BEAM-9055] Unify the config names of Fn Data API across languages.

[rehman.muradali] onTimer/setTimer signature updates

[sunjincheng121] fixup

[davidsabater] [BEAM-9053] Improve error message when unable to get the correct

[mxm] [BEAM-8577] Initialize FileSystems during Coder deserialization in

[github] Update _posts_2019-12-16-beam-2.17.0.md

[github] Cleanup formatting.

[suztomo] google_auth_version 0.19.0

[github] Update release date.

[lcwik] [BEAM-9059] Migrate PTransformTranslation to use string constants

[iemejia] [BEAM-5546] Update commons-codec to version 1.14

[iemejia] [BEAM-8701] Remove unused commons-io_1x dependency

[iemejia] [BEAM-8701] Update commons-io to version 2.6

[iemejia] [BEAM-5544] Update cassandra-all dependency to version 3.11.5

[iemejia] [BEAM-8749] Update cassandra-driver-mapping to version 3.8.0

[mxm] Rename FlinkClassloading to Workarounds

[mxm] [BEAM-9060] Restore stdout/stderr in case Flink's

[echauchot] Fix link in javadoc to accumulators

[github] Restrict the upper bound for pyhamcrest, since new version does not work

[apilloud] [BEAM-9027] [SQL] Fix ZetaSQL Byte Literals

[github] [BEAM-9058] Fix line-too-long exclusion regex and re-enable

[altay] Readability/Lint fixes

[hannahjiang] BEAM-8780 reuse RC images instead of recreate images

[iemejia] [BEAM-8716] Update commons-csv to version 1.7

[iemejia] [BEAM-9041] Add missing equals methods for GenericRecord <-> Row

[iemejia] [BEAM-9042] Fix RowToGenericRecordFn Avro schema serialization

[iemejia] [BEAM-9042] Update SchemaCoder doc with info about functions requiring

[iemejia] [BEAM-9042] Test serializability and equality of Row<->GenericRecord

[tvalentyn] [BEAM-9062] Improve assertion error for equal_to (#10504)

[iemejia] [BEAM-8717] Update commons-lang3 to version 3.9

[iemejia] [BEAM-8717] Make non core modules use only the repackaged commons-lang3

[chamikara] [BEAM-8960]: Add an option for user to opt out of using insert id for

[ehudm] Small fixes to verify_release_build.sh

[kirillkozlov] Metric name should not be constant

[36090911+boyuanzz] [BEAM-8932] [BEAM-9036] Revert reverted commit to use PubsubMessage as

[sunjincheng121] fixup

[github] Update ParDoTest.java

[rehman.muradali] Apply spotless

[rehman.muradali] Compilation Fix PardoTest

[rehman.muradali] Reverting outputTimestamp validation

[rehman.muradali] CheckStyle Fix

[rehman.muradali] Adding Category to exclude Flink Runner

[jkai] [BEAM-8496] remove SDF translators in flink streaming transform

[github] Fix blogpost typo (#10532)

[kcweaver] [BEAM-9070] tests use absolute paths for job server jars

[12602502+Ardagan] Fix headings in downloads.md

[github] Add # pytype: skip-file before first import statement in each py file

[apilloud] [BEAM-9027] Unparse DOY/DOW/WEEK Enums properly for ZetaSQL

[33895511+aromanenko-dev] [BEAM-8953] Extend ParquetIO read builders for AvroParquetReader

[brad.g.west] [BEAM-9078] Pass total_size to storage.Upload

[hannahjiang] BEAM-7861 add direct_running_mode option

[github] [BEAM-9075] Disable JoinCommuteRule for ZetaSQL planner (#10542)

[bhulette] [BEAM-9075] add a test case. (#10545)

[12602502+Ardagan] [BEAM-8821] Document Python SDK 2.17.0 deps (#10212)

[kirillkozlov] Missing commit

[hannahjiang] [BEAM-7861] rephrase direct_running_mode option checking

[kcweaver] [BEAM-8337] Hard-code Flink versions.

[echauchot] [BEAM-9019] Remove BeamCoderWrapper to avoid extra object allocation and

[lukecwik] [BEAM-8624] Implement Worker Status FnService in Dataflow runner

[github] [BEAM-5605] Add support for executing pair with restriction, split

[kcweaver] fix indentation

[kcweaver] Update release guide

[lostluck] [BEAM-9080] Support KVs in the Go SDK's Partition

[github] Rephrasing lull logging to avoid alarming users (#10446)

[robertwb] [BEAM-8575] Added counter tests for CombineFn (#10190)

[github] [BEAM-8490] Fix instance_to_type for empty containers (#9894)

[apilloud] [BEAM-8630] Use column numbers for BeamZetaSqlCalRel

[apilloud] [BEAM-9027] Backport BigQuerySqlDialect fixes

[robertwb] [BEAM-8575] Test hot-key fanout with accumulation modes. (#10159)

[github] [BEAM-9059] Use string constants in PTransformTranslation instead of

[iemejia] [BEAM-8956] Begin unifying contributor instructions into a single

[pawel.pasterz] [BEAM-7115] Fix metrics being incorrectly gathered

[mxm] Remove incorrectly tagged test annotation from test case

[mxm] [BEAM-6008] Propagate errors during pipeline execution in Java's

[github] Tighten language and remove distracting link

[pabloem] [BEAM-7390] Add code snippet for Top (#10179)

[bhulette] [BEAM-8993] [SQL] MongoDB predicate push down. (#10417)

[lukecwik] [BEAM-8740] Remove unused dependency from Spark runner (#10564)

[robertwb] [BEAM-6587] Remove hacks due to missing common string coder.

[kirillkozlov] Update data source for SQL performance tests

[github] [BEAM-5605] Add support for channel splitting to the gRPC read "source"

[github] [BEAM-5605] Add support for additional parameters to SplittableDofn

[chadrik] [BEAM-7746] Address changes in code since annotations were introduced

[chadrik] [BEAM-7746]  Typing fixes that require runtime code changes

[chadrik] [BEAM-7746] Avoid creating attributes dynamically, so that they can be

[chadrik] [BEAM-7746] Bugfix: coder id is expected to be str in python3

[chadrik] [BEAM-7746] Explicitly unpack tuple to avoid inferring unbounded tuple

[chadrik] [BEAM-7746] Generate files with protobuf urns as part of gen_protos

[chadrik] [BEAM-7746] Move name and coder to base StateSpec class

[chadrik] [BEAM-7746] Remove reference to missing attribute in

[chadrik] [BEAM-7746] Non-Optional arguments cannot default to None

[chadrik] [BEAM-7746] Avoid reusing variables with different data types

[chadrik] [BEAM-7746] Add StateHandler abstract base class

[chadrik] [BEAM-7746] Add TODO about fixing assignment to

[chadrik] [BEAM-7746] Fix functions that were defined twice

[chadrik] [BEAM-7746] Fix tests that have the same name

[iemejia] [BEAM-9040] Add skipQueries option to skip queries in a Nexmark suite

[iemejia] [BEAM-9040] Add Spark Structured Streaming Runner to Nexmark PostCommit

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and

[chamikara] Sets the correct coder when clustering is enabled for the

[robertwb] Always initialize output processor on construction.

[github] [Go SDK Doc] Update Dead Container Link (#10585)

[github] Merge pull request #10582 for [INFRA-19670] Add .asf.yaml for Github


------------------------------------------
[...truncated 72.32 KB...]
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.4 KiB.
 
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/java_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=17
+ gcloud dataproc clusters create streaming-2 --region=global --num-workers=17 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/e34295eb-ece1-3713-82e9-49349c987993].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/streaming-2]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m '--command=yarn application -list'
++ grep streaming-2
Warning: Permanently added 'compute.120247318236805896' (ECDSA) to the list of known hosts.
20/01/15 12:33:39 INFO client.RMProxy: Connecting to ResourceManager at streaming-2-m/10.128.1.115:8032
+ read line
+ echo application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
++ echo application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
++ sed 's/ .*//'
+ application_ids[$i]=application_1579091538647_0001
++ echo application_1579091538647_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777
++ sed 's/.*streaming-2/streaming-2/'
++ sed 's/ .*//'
+ application_masters[$i]=streaming-2-w-14.c.apache-beam-testing.internal:33777
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=streaming-2-w-14.c.apache-beam-testing.internal:33777
+ echo 'Using Yarn Application master: streaming-2-w-14.c.apache-beam-testing.internal:33777'
Using Yarn Application master: streaming-2-w-14.c.apache-beam-testing.internal:33777
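The sed pipeline above pulls two fields out of one row of `yarn application -list` output: the application id and the application-master host:port. A hedged Python 3 sketch of the same extraction (the sample row is copied from this log; the splitting approach is an illustration, not the script's actual code):

```python
# One row of `yarn application -list` output, as captured in this log.
row = ("application_1579091538647_0001 flink-dataproc Apache Flink yarn default "
       "RUNNING UNDEFINED 100% http://streaming-2-w-14.c.apache-beam-testing.internal:33777")

# First whitespace-separated field -> application id (sed 's/ .*//').
app_id = row.split()[0]
# Tail of the tracking URL -> application-master host:port
# (sed 's/.*streaming-2/streaming-2/' followed by 's/ .*//').
master = row.rsplit("http://", 1)[1]

print(app_id)   # application_1579091538647_0001
print(master)   # streaming-2-w-14.c.apache-beam-testing.internal:33777
```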
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=streaming-2-w-14.c.apache-beam-testing.internal:33777 --artifacts-dir=gs://beam-flink-cluster/streaming-2'
c3cdee9cac30aa5d640768cff906facb6f64eb65323984be7b6b958c045f8179
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@streaming-2-m '--command=curl -s "http://streaming-2-w-14.c.apache-beam-testing.internal:33777/jobmanager/config";'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1579091538647_0001"},{"key":"jobmanager.rpc.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-87e0051e-f012-447d-92c3-30911aa38c5d"},{"key":"jobmanager.rpc.port","value":"45747"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1579091538647_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo streaming-2-w-14.c.apache-beam-testing.internal:33777
++ cut -d : -f1
+ local yarn_application_master_host=streaming-2-w-14.c.apache-beam-testing.internal
++ python -c 'import sys, json; print [ e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0]'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1579091538647_0001"},{"key":"jobmanager.rpc.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-87e0051e-f012-447d-92c3-30911aa38c5d"},{"key":"jobmanager.rpc.port","value":"45747"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1579091538647_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"16"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"streaming-2-w-14.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=45747
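The inline `python -c` above uses a Python 2 print statement to pick `jobmanager.rpc.port` out of the JSON that Flink's `/jobmanager/config` endpoint returns as a list of key/value pairs. A Python 3 sketch of the same lookup (the helper name is illustrative; the sample config is abridged from this log):

```python
import json

def flink_config_value(config_json, key):
    """Return the value for `key` from Flink's /jobmanager/config JSON,
    which is a list of {"key": ..., "value": ...} objects."""
    return [e["value"] for e in json.loads(config_json) if e["key"] == key][0]

# Abridged copy of the config the script fetched via curl.
sample = '[{"key":"web.port","value":"0"},{"key":"jobmanager.rpc.port","value":"45747"}]'
print(flink_config_value(sample, "jobmanager.rpc.port"))  # 45747
```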
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m -- -L 8081:streaming-2-w-14.c.apache-beam-testing.internal:33777 -L 45747:streaming-2-w-14.c.apache-beam-testing.internal:45747 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m -- -L 8081:streaming-2-w-14.c.apache-beam-testing.internal:33777 -L 45747:streaming-2-w-14.c.apache-beam-testing.internal:45747 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@streaming-2-m -- -L 8081:streaming-2-w-14.c.apache-beam-testing.internal:33777 -L 45747:streaming-2-w-14.c.apache-beam-testing.internal:45747 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Streaming] $ /bin/bash -xe /tmp/jenkins5845669632463696402.sh
+ echo src Load test: fanout 4 times with 2GB 10-byte records total on Flink in Portable mode src
src Load test: fanout 4 times with 2GB 10-byte records total on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Streaming/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_streaming_Combine_4 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_streaming_Combine_4 --sourceOptions={"numRecords":5000000,"keySizeBytes":10,"valueSizeBytes":90} --fanout=4 --iterations=1 --topCount=20 --sdkWorkerParallelism=16 --perKeyCombiner=TOP_LARGEST --streaming=true --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
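The "2GB 10-byte records total" figure echoed earlier follows from the `--sourceOptions` and `--fanout` flags in the Gradle invocation above; a quick check of the arithmetic:

```python
# Values taken from the --sourceOptions / --fanout flags in this log.
num_records = 5_000_000
record_bytes = 10 + 90          # keySizeBytes + valueSizeBytes
fanout = 4

# Each of the 4 fanout branches processes the full 500 MB input.
total_gb = num_records * record_bytes * fanout / 10**9
print(total_gb)  # 2.0
```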
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :runners:local-java:jar
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
        at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
        at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:226)
        at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:473)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
        at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1596)
        at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1485)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.loadTest(CombineLoadTest.java:134)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:96)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)

> Task :sdks:java:testing:load-tests:run FAILED
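The IllegalStateException above is Beam's standard guard: `Combine.perKey` expands to a `GroupByKey`, which cannot be applied to an unbounded PCollection in the GlobalWindow without a trigger, so the streaming input must first pass through a `Window.into` (or `Window.triggering`) transform. A minimal sketch of the suggested fix, assuming a hypothetical unbounded `input` PCollection (the names `input`, `windowed`, and `sums` are illustrative, not from the load test itself):

```java
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// `input` is a hypothetical unbounded PCollection<KV<String, Long>>.
// Window it before combining so the GroupByKey inside Combine.perKey is legal:
PCollection<KV<String, Long>> windowed =
    input.apply(Window.<KV<String, Long>>into(
        FixedWindows.of(Duration.standardMinutes(1))));

// Combine.perKey now groups and sums per one-minute window instead of
// waiting forever on the (never-closing) GlobalWindow.
PCollection<KV<String, Long>> sums =
    windowed.apply(Combine.perKey(Sum.ofLongs()));
```

Alternatively, a trigger can be attached to the GlobalWindow via `Window.triggering(...)` with an accumulation mode and allowed lateness, if global aggregation is intended.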

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/373uiefrcaoik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
