See
<https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/7/display/redirect?page=changes>
Changes:
[dpcollins] Move external PubsubIO hooks outside of PubsubIO.
[github] [BEAM-9188] CassandraIO split performance improvement - cache size of
[robertwb] Only cache first page of paginated state.
[robertwb] Perform bundle-level caching if no cache token is given.
[robertwb] [BEAM-8298] Support side input cache tokens.
[radoslaws] spotless fixes
[robertwb] fix continuation token iter
[robertwb] lint for side input tokens
[github] Rename "word" to "line" for better readability
[github] Rename "words" to "line" also in docs
[radoslaws] comments and tests
[suztomo] bigtable-client-core 1.13.0 and exclusion and gax
[robinyqiu] Cleanup ZetaSQLQueryPlanner and ExpressionConverter code
[suztomo] Controlling grpc-grpclb and grpc-core
[robertwb] Fix state cache test.
[robertwb] TODO about two-level caching.
[robertwb] CachingStateHandler unit test.
[github] "Upgrade" google-cloud-spanner version to 1.13.0
[github] Removing none instead of bare return
[michal.walenia] [BEAM-9226] Set max age of 3h for Dataproc Flink clusters
[je.ik] [BEAM-8550] @RequiresTimeSortedInput: working with legacy flink and
[kamil.wasilewski] Generate 100kB records in GroupByKey Load test 3
[robertwb] [BEAM-9227] Defer bounded source size estimation to the workers.
[chadrik] [BEAM-8271] Properly encode/decode StateGetRequest/Response
[github] [BEAM-8042] [ZetaSQL] Fix aggregate column reference (#10649)
[robertwb] test lint
[robertwb] Fix extending non-list.
[robertwb] Fix some missing (but unused) output_processor constructor arguments.
[chadrik] [BEAM-7746] Avoid errors about Unsupported operand types for >= ("int"
[robertwb] Fix flink counters test.
[github] [BEAM-8590] Support unsubscripted native types (#10042)
[github] Revert "[BEAM-9226] Set max age of 3h for Dataproc Flink clusters"
[radoslaws] spottless
[mxm] [BEAM-9132] Avoid logging misleading error messages during pipeline
[github] [BEAM-8889] Cleanup Beam to GCS connector interfacing code so it uses
[heejong] [BEAM-7961] Add tests for all runner native transforms and some widely
[github] [BEAM-9233] Support -buildmode=pie -ldflags=-w with unregistered Go
[github] [BEAM-9167] Metrics extraction refactoring. (#10716)
[kenn] Clarify exceptions in SQL modules
[github] Update Beam Python container release
[github] No longer reporting Lulls as errors in the worker.
[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as
[iemejia] [BEAM-9236] Remove unneeded schema related class FieldValueSetterFactory
[iemejia] [BEAM-9236] Remove unused schema related class FieldValueGetterFactory
[iemejia] [BEAM-6857] Recategorize UsesTimerMap tests to ValidatesRunner
[hsuryawirawan] Update Beam Katas Java to use Beam version 2.18.0
[kamil.wasilewski] Remove some tests in Python GBK on Flink suite
[hsuryawirawan] Update Beam Katas Python to use Beam version 2.18.0
[kamil.wasilewski] [BEAM-9234] Avoid using unreleased versions of PerfKitBenchmarker
[github] Adding new source tests for Py BQ source (#10732)
[suztomo] Introducing google-http-client.version
[github] [BEAM-8280][BEAM-8629] Make IOTypeHints immutable (#10735)
[heejong] [BEAM-9230] Enable CrossLanguageValidateRunner test for Spark runner
[suztomo] Property google-api-client
[ehudm] [BEAM-8095] Remove no_xdist for test
[zyichi] Remove managing late data not supported by python sdk note
[echauchot] Embed audio podcasts players to webpage instead of links that play the
[iemejia] [BEAM-9236] Mark missing Schema based classes and methods as
[yoshiki.obata] [BEAM-9163] update sphinx_rtd_theme to newest
[iemejia] [BEAM-7310] Add support of Confluent Schema Registry for KafkaIO
[altay] Add CHANGES.md file
[robinyqiu] Support all ZetaSQL TIMESTAMP functions
[github] [BEAM-4150] Remove fallback case for coder not specified within
[github] [BEAM-9009] Add pytest-timeout plugin, set timeout (#10437)
[github] [BEAM-3221] Expand/clarify timestamp comments within
[boyuanz] Add new release 2.19.0 to beam website.
[boyuanz] Update beam 2.19.0 release blog
[ehudm] Convert repo.spring.io to use https + 1 other
[ehudm] [BEAM-9251] Fix :sdks:java:io:kafka:updateOfflineRepository
[gleb] Fix AvroIO javadoc for deprecated methods
[github] [BEAM-5605] Migrate splittable DoFn methods to use "new" DoFn style
[github] [BEAM-6703] Make Dataflow ValidatesRunner test use Java 11 in test
[daniel.o.programmer] [BEAM-3301] Small cleanup to FullValue code.
[apilloud] [BEAM-8630] Add logical types, make public
[github] [BEAM-9037] Instant and duration as logical type (#10486)
[github] [BEAM-2645] Define the display data model type
[kamil.wasilewski] [BEAM-9175] Add yapf autoformatter
[kamil.wasilewski] [BEAM-9175] Yapf everywhere!
[kamil.wasilewski] [BEAM-9175] Fix pylint issues
[kamil.wasilewski] [BEAM-9175] Add pre-commit Jenkins job
[kamil.wasilewski] [BEAM-9175] Disable bad-continuation check in pylint
[amyrvold] [BEAM-9261] Add LICENSE and NOTICE to Docker images
[github] [BEAM-8951] Stop using nose in load tests (#10435)
[robertwb] [BEAM-7746] Cleanup historical DnFnRunner-as-Receiver cruft.
[robertwb] [BEAM-8976] Initalize logging configuration at a couple of other entry
[chadrik] [BEAM-7746] Add typing for try_split
[zyichi] Fix race exception in python worker status thread dump
[iemejia] [BEAM-9264] Upgrade Spark to version 2.4.5
[hsuryawirawan] Update Beam Katas Java to use Beam version 2.19.0
[hsuryawirawan] Update Beam Katas Python to use Beam version 2.19.0
[hsuryawirawan] Update Beam Katas Python on Stepik
[hsuryawirawan] Update Built-in IOs task type to theory
[hsuryawirawan] Update Beam Katas Java on Stepik
[kamil.wasilewski] Fix method name in Combine and coGBK tests
[github] [BEAM-3453] Use project specified in pipeline_options when creating
[robertwb] [BEAM-9266] Remove unused fields from provisioning API.
[github] [BEAM-9262] Clean-up endpoints.proto to a stable state (#10789)
[lcwik] [BEAM-3595] Migrate to "v1" URNs for standard window fns.
[daniel.o.programmer] [BEAM-3301] (Go SDK) Adding restriction plumbing to graph construction.
[robertwb] Remove one more reference to provision resources.
[github] Merge pull request #10766: [BEAM-4461] Add Selected.flattenedSchema
[robertwb] Reject unsupported WindowFns and Window types.
[github] Merge pull request #10804: [BEAM-2535] Fix timer map
[github] Merge pull request #10627:[BEAM-2535] Support outputTimestamp and
[iemejia] [BEAM-7092] Fix invalid import of Guava coming from transitive Spark dep
[alex] [BEAM-9241] Fix inconsistent proto nullability
[kamil.wasilewski] Move imports and variables out of global namespace
[iemejia] [BEAM-9281] Update commons-csv to version 1.8
[iemejia] [website] Update Java 11 and Spark roadmap
[apilloud] [BEAM-8630] Validate prepared expression on expand
[github] [BEAM-9268] SpannerIO: Add more documentation and warnings for unknown
[iemejia] [BEAM-9231] Add Experimental(Kind.PORTABILITY) and tag related classes
[iemejia] [BEAM-9231] Tag SplittableDoFn related classes/methods as Experimental
[iemejia] [BEAM-9231] Make Experimental annotations homogeneous in
[iemejia] [BEAM-9231] Untag Experimental/Internal classes not needed to write
[iemejia] [BEAM-9231] Tag beam-sdks-java-core internal classes as Internal
[iemejia] [BEAM-9231] Tag DoFn.OnTimerContext as Experimental(Kind.TIMERS)
[iemejia] [BEAM-9231] Tag Experimental/Internal packages in beam-sdks-java-core
[iemejia] [BEAM-9231] Tag Experimental/Internal packages in IOs and extensions
[iemejia] [BEAM-9231] Tag public but internal IOs and extensions classes as
[yoshiki.obata] [BEAM-7198] rename ToStringCoder to ToBytesCoder for proper
[iemejia] [BEAM-9160] Update AWS SDK to support Pod Level Identity
[yoshiki.obata] [BEAM-7198] add comment
[ankurgoenka] [BEAM-9290] Support runner_harness_container_image in released python
[boyuanz] Move ThreadsafeRestrictionTracker and RestrictionTrackerView out from
[github] Remove tables and refer to dependency locations in code (#10745)
[ehudm] fix lint
[valentyn] Cleanup MappingProxy reducer since dill supports it natively now.
[suztomo] beam-linkage-check.sh
[iemejia] Enable probot autolabeler action to label github pull requests
[iemejia] Remove prefixes in autolabeler configuration to improve readability
[iemejia] [BEAM-9160] Removed WebIdentityTokenCredentialsProvider explicit json
[suztomo] copyright
[yoshiki.obata] [BEAM-7198] fixup: reformatted with yapf
[github] [BEAM-3221] Clarify documentation for StandardTransforms.Primitives,
[aromanenko.dev] [BEAM-9292] Provide an ability to specify additional maven repositories
[aromanenko.dev] [BEAM-9292] KafkaIO: add io.confluent repository to published POM
[github] [BEAM-8201] Add other endpoint fields to provision API. (#10839)
[github] [BEAM-9269] Add commit deadline for Spanner writes. (#10752)
[github] [AVRO-2737] Exclude a buggy avro version from requirements spec.
[iemejia] Refine labels/categories for PR autolabeling
[github] Update roadmap page for python 3 support
[iemejia] [BEAM-9160] Removed WebIdentityTokenCredentialsProvider explicit json
[iemejia] Remove unused ReduceFnRunnerHelper class
[iemejia] Do not set options.filesToStage in case of spark local execution in
[iemejia] Do not set options.filesToStage in case of spark local execution in
[github] [BEAM-6522] [BEAM-7455] Unskip Avro IO tests that are now passing.
[github] [BEAM-5605] Convert all BoundedSources to SplittableDoFns when using
[github] [BEAM-8758] Google-cloud-spanner upgrade to 1.49.1 (#10765)
[github] Ensuring appropriate write_disposition and create_disposition for jobs
[github] [BEAM-3545] Return metrics as MonitoringInfos (#10777)
[github] Modify the TestStreamFileRecord to use TestStreamPayload events.
[iemejia] [BEAM-9280] Update commons-compress to version 1.20
------------------------------------------
[...truncated 74.14 KB...]
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-7 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/a36ba782-85d9-3580-b69e-dd4915b47c83].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
..................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-7]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-7
Warning: Permanently added 'compute.6917959991117425346' (ECDSA) to the list of known hosts.
20/02/14 12:38:26 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-7-m/10.128.0.38:8032
+ read line
+ echo application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ echo application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ sed 's/ .*//'
+ application_ids[$i]=application_1581683849865_0001
++ echo application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ sed 's/.*beam-loadtests-java-portable-flink-batch-7/beam-loadtests-java-portable-flink-batch-7/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
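The `get_leader` trace above pulls the YARN application id and the application master's host:port out of one `yarn application -list` line with `sed`. A minimal standalone sketch of that parsing, run against the sample line captured in this log (the URL-stripping `sed` expression here is a simplified stand-in for the cluster-name-anchored one the script uses):

```shell
#!/bin/sh
# One output line of `yarn application -list`, as captured in the log above.
line='application_1581683849865_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161'

# The application id is the first whitespace-separated field.
application_id=$(echo "$line" | sed 's/ .*//')

# The application master is the host:port of the trailing tracking URL.
application_master=$(echo "$line" | sed 's|.*http://||')

echo "$application_id"
echo "$application_master"
```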
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-7'
a99791781c123f8b0716bf373e7df3c608ede5fc98f2b98b7246a7c432b71ee7
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-7-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581683849865_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-901b7595-b07c-4fa8-917c-2a8f3e5c5f4e"},{"key":"jobmanager.rpc.port","value":"41357"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581683849865_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581683849865_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-901b7595-b07c-4fa8-917c-2a8f3e5c5f4e"},{"key":"jobmanager.rpc.port","value":"41357"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581683849865_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=41357
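The `python -c` one-liner above looks up `jobmanager.rpc.port` in the JobManager's `/jobmanager/config` dump, which is a JSON list of `{"key": ..., "value": ...}` objects. The same lookup as a standalone sketch against a trimmed sample of the config shown in the log (using `python3` here, where the script's environment provides `python`):

```shell
#!/bin/sh
# Trimmed sample of the /jobmanager/config response captured above.
config='[{"key":"web.port","value":"0"},{"key":"jobmanager.rpc.port","value":"41357"}]'

# Same filter as the traced one-liner: pick the value whose key matches.
port=$(echo "$config" | python3 -c 'import sys, json; print([e["value"] for e in json.load(sys.stdin) if e["key"] == "jobmanager.rpc.port"][0])')

echo "$port"
```

The script uses the extracted port to open the matching SSH tunnel (`-L 41357:...:41357`) in the `tunnel_command` further down.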
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m -- -L 8081:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 -L 41357:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:41357 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m -- -L 8081:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 -L 41357:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:41357 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-7-m -- -L 8081:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:40161 -L 41357:beam-loadtests-java-portable-flink-batch-7-w-3.c.apache-beam-testing.internal:41357 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6759950651280737447.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution: java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
    at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
    at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
    at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
    at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution: java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
    at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
    ... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution: java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
    at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
    at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
    at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
> Task :sdks:java:testing:load-tests:run FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1m 23s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date
Publishing build scan...
https://gradle.com/s/637yskmzdqivk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]