See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/9/display/redirect?page=changes>
Changes:

[amogh.tiwari] lzo-addons
[amogh.tiwari] 3rd dec 2019, 12:43AM
[amogh.tiwari] PR corrections
[amogh.tiwari] PR javaPreCommit update
[amogh.tiwari] PR changes: added testLzopSpilttale()
[amogh.tiwari] updated gradle for supporting optional dependency of lzo- 2:39 AM IST
[iemejia] [BEAM-9162] Upgrade Jackson to version 2.10.2
[veblush] Upgrades gcsio to 2.0.0
[jonathan.warburton09] [BEAM-8916] Rename external_test_it so that it is picked up by pytest
[huangry] Create validation runner test for metrics (limited to user counter in
[millsd] Update Dataflow monitoring URL
[ankurgoenka] [BEAM-9287] Add Python Streaming Validates runner tests for Unified
[robertwb] Add capabilities and requirements to beam protos.
[github] Change static Map fields in ReflectUtils to be concurrent
[iemejia] [BEAM-8561] Add ThriftIO to support IO for Thrift files
[github] [BEAM-9258] Integrate Google Cloud Data loss prevention functionality
[github] [BEAM-9291] Upload graph option in dataflow's python sdk (#10829)
[amogh.tiwari] update 19/02/2020 2:32 AM added static class, removed wrappers, updated
[chlarsen] Removed compile time generation of test Thrift class.
[github] [BEAM-1080] Skip tests that required GCP credentials
[github] Exclude tests that are not passing under currect Avro IO requirements.
[lcwik] [BEAM-5605] Honor the bounded source timestamps timestamp.
[chlarsen] Added ThriftIO to list of supported I/O on website and to change log.
[github] [BEAM-7246] Added Google Spanner Write Transform (#10712)
[github] Apply suggestions from code review
[github] [BEAM-1833] Fixes BEAM-1833
[bhulette] Don't exclude UsesUnboundedPCollections in Dataflow VR tests
[heejong] [BEAM-9335] update hard-coded coder id when translating Java external
[huangry] Fixups.
[github] [BEAM-9146] Integrate GCP Video Intelligence functionality for Python
[iemejia] Mark Test categories as internal and improve categorization
[github] Add DataCatalogPipelineOptionsRegistrar (#10896)
[github] Allow unknown non-merging WindowFns of know window type. (#10875)
[iemejia] [BEAM-9326] Make JsonToRow transform input <String> instead of <?
[github] [BEAM-8575] Removed MAX_TIMESTAMP from testing data (#10835)
[github] Update python sdk container to beam-master-20200219 (#10903)
[heejong] [BEAM-9338] add postcommit XVR spark badges
[github] [BEAM-3545] Fix race condition w/plan metrics. (#10906)
[robertwb] Update go beam runner generated protos.
[heejong] [BEAM-9341] postcommit xvr flink fix
[github] Update
[github] Update
[github] Update
[github] Update
[github] Update
[github] Update
[github] Update
[github] Update
[shubham.srivastava] finishing touch 20/02/2020 6:43PM
[github] [BEAM-9085] Fix performance regression in SyntheticSource (#10885)
[github] Update google-cloud-videointelligence dependency
[robertwb] Add standard protocol capabilities to protos.
[github] [BEAM-8280] no_annotations decorator (#10904)
[kcweaver] [BEAM-9225] Fix Flink uberjar job termination bug.
[kcweaver] Reuse get_state method.
[chamikara] Updates DataflowRunner to support multiple SDK environments.
[github] [BEAM-8280] Enable and improve IOTypeHints debug_str traceback (#10894)
[github] [BEAM-9343] Upgrade ZetaSQL to 2020.02.1 (#10918)
[robertwb] [BEAM-9339] Declare capabilities for Go SDK.
[lcwik] [BEAM-5605] Eagerly close the BoundedReader once we have read everything
[github] [BEAM-9229] Adding dependency information to Environment proto (#10733)
[lcwik] [BEAM-9349] Update joda-time version
[lcwik] fixup! Fix SpotBugs failure
[kcweaver] [BEAM-9022] publish Spark job server Docker image
[drubinstein] Bump google cloud bigquery to 1.24.0
[github] Revert "[BEAM-9085] Fix performance regression in SyntheticSource
[github] [BEAM-8537] Provide WatermarkEstimator to track watermark (#10375)
[github] Make sure calling try_claim(0) more than once also trows exception.
[robertwb] [BEAM-9339] Declare capabilities for Python SDK.
[robertwb] Add some standard requirement URNs to the protos.
[kcweaver] [BEAM-9356] reduce Flink test logs to warn
[github] [BEAM-9063] migrate docker images to apache (#10612)
[github] [BEAM-9252] Exclude jboss's Main and module-info.java (#10930)
[boyuanz] Clean up and add type-hints to SDF API
[robertwb] [BEAM-9340] Populate requirements for Python DoFn properties.
[hannahjiang] fix postcommit failure
[robertwb] [BEAM-8019] Branch on having multiple environments.
[github] [BEAM-9359] Switch to Data Catalog client (#10917)
[github] [BEAM-9344] Add support for bundle finalization execution to the Beam
[iemejia] [BEAM-9342] Upgrade vendored bytebuddy to version 1.10.8
[chadrik] Create a class to encapsulate the work required to submit a pipeline to
[iemejia] Add Dataflow Java11 ValidatesRunner badge to the PR template
[github] Merge pull request #10944: [BEAM-7274] optimize oneOf handling
[github] [BEAM-8280] Fix IOTypeHints origin traceback on partials (#10927)
[relax] Support null fields in rows with ByteBuddy generated code.
[robertwb] Allow metrics update to be tolerant to uninitalized metric containers.
[github] [GoSDK] Fix race condition in statemgr & test (#10941)
[rohde.samuel] Move TestStream implementation to replacement transform
[github] [BEAM-9347] Don't overwrite default runner harness for unified worker
[boyuanz] Update docstring of ManualWatermarkEstimator.set_watermark()
[kcweaver] [BEAM-9373] Spark/Flink tests fix string concat
[boyuanz] Address comments
[boyuanz] Address comments again
[github] [BEAM-9228] Support further partition for FnApi ListBuffer (#10847)
[github] [BEAM-7926] Data-centric Interactive Part3 (#10731)
[boyuanz] Use NoOpWatermarkEstimator in sdf_direct_runner
[chamikara] Updates Dataflow client
[github] [BEAM-9240]: Check for Nullability in typesEqual() method of FieldType
[amogh.tiwari] 25/02/2020 updated imports Amogh Tiwari & Shubham Srivastava
[iemejia] [BEAM-8616] Make hadoop-client a provided dependency on ParquetIO
[mxm] [BEAM-9345] Remove workaround to restore stdout/stderr during JobGraph
[iemejia] [BEAM-9364] Refactor KafkaIO to use DeserializerProviders
[mxm] [BEAM-9345] Add end-to-end Flink job submission test
[iemejia] [BEAM-9352] Align version of transitive jackson dependencies with Beam
[michal.walenia] [BEAM-9258] Add integration test for Cloud DLP
[iemejia] [BEAM-9329] Support request of schemas by version on KafkaIO + CSR
[lcwik] [BEAM-9252] Update to vendored gRPC without problematic
[github] Update
[github] Update
[github] Update
[lcwik] [BEAM-2822, BEAM-2939, BEAM-6189, BEAM-4374] Enable passing completed
[crites] Changes TestStreamTranscriptTest to only emit two elements so that its
[alex] [BEAM-7274] Add DynamicMessage Schema support
[github] [BEAM-9322] Fix tag output names within Dataflow to be consistent with
[iemejia] [BEAM-9342] Exclude module-info.class from vendored Byte Buddy 1.10.8
[iemejia] Add KafkaIO support for Confluent Schema Registry to the CHANGEs file
[github] [BEAM-9247] Integrate GCP Vision API functionality (#10959)
[github] Fix kotlin warnings (#10976)
[github] Update python sdk container version to beam-master-20200225 (#10965)
[github] [BEAM-9248] Integrate Google Cloud Natural Language functionality for
[iemejia] Refine access level for `sdks/java/extensions/protobuf`
[github] [BEAM-9355] Basic support for NewType (#10928)
[github] [BEAM-8979] reintroduce mypy-protobuf stub generation (#10734)
[github] [BEAM-8335] Background Caching job (#10899)
[github] [BEAM-8458] Add option to set temp dataset in BigQueryIO.Read (#9852)
[iemejia] Make logger naming consistent with Apache Beam LOG standard
[kcweaver] [BEAM-9300] convert struct literal in ZetaSQL
[github] fix breakage (#10934)
[github] Merge pull request #10901 from [BEAM-8965] Remove duplicate sideinputs
[pabloem] Fix formatting
[github] [BEAM-8618] Tear down unused DoFns periodically in Python SDK harness.
[alex] [BEAM-9394] DynamicMessage handling of empty map violates schema
[github] Merge pull request #10854: State timers documentation
[lcwik] [BEAM-5524] Fix minor issue in style guide.
[github] [BEAM-8201] Pass all other endpoints through provisioning service.
[suztomo] Linkage Checker 1.1.4
[robinyqiu] Bump Dataflow Java worker container version
[kcweaver] Test schema does not need to be nullable.
[github] [BEAM-9396] Match Docker image names in Jenkins jobs with those
[github] [BEAM-9392] Fix Multi TestStream assertion errors (#10982)

------------------------------------------
[...truncated 74.41 KB...]
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster.
Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-java-portable-flink-batch-9 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-7.0/flink-shaded-hadoop-2-uber-2.8.3-7.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/e9b25126-0b16-3418-aef0-0fb28347ec5a].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-9]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-9
Warning: Permanently added 'compute.4417651939462133680' (ECDSA) to the list of known hosts.
20/02/28 12:39:30 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-9-m/10.128.0.195:8032
+ read line
+ echo application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ echo application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ sed 's/ .*//'
+ application_ids[$i]=application_1582893511539_0001
++ echo application_1582893511539_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ sed 's/.*beam-loadtests-java-portable-flink-batch-9/beam-loadtests-java-portable-flink-batch-9/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
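The `get_leader` step above discovers the Flink application master by piping `yarn application -list` through `grep` and `sed`. As a readability aid, here is a hypothetical Python re-implementation of that parsing (the function name is invented; the sample line is the one captured in this log):

```python
def parse_yarn_application_line(line, cluster_name):
    # First whitespace-delimited field is the YARN application id
    # (mirrors `sed 's/ .*//'`).
    application_id = line.split()[0]
    # Keep everything from the cluster name onward, then take the first
    # field, yielding host:port of the application master (mirrors
    # `sed 's/.*CLUSTER/CLUSTER/' | sed 's/ .*//'`).
    application_master = line[line.index(cluster_name):].split()[0]
    return application_id, application_master

# Sample line as printed by `yarn application -list` in this build:
line = ("application_1582893511539_0001 flink-dataproc Apache Flink yarn "
        "default RUNNING UNDEFINED 100% "
        "http://beam-loadtests-java-portable-flink-batch-9-w-0"
        ".c.apache-beam-testing.internal:39561")
app_id, master = parse_yarn_application_line(
    line, "beam-loadtests-java-portable-flink-batch-9")
```

For this line, `app_id` is the YARN application id and `master` is the `host:port` the rest of the script exports as `YARN_APPLICATION_MASTER`.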
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-9'
5e4161313bb76c5a3dd5dd297efe838acc63be99eec0534037d611402dbfe72e
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-9-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561/jobmanager/config"'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1582893511539_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-d4fcbc31-cbe1-41b0-a7fd-83782187721b"},{"key":"jobmanager.rpc.port","value":"44311"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1582893511539_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1582893511539_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-d4fcbc31-cbe1-41b0-a7fd-83782187721b"},{"key":"jobmanager.rpc.port","value":"44311"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1582893511539_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=44311
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m -- -L 8081:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 -L 44311:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:44311 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m -- -L 8081:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 -L 44311:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:44311 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-9-m -- -L 8081:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:39561 -L 44311:beam-loadtests-java-portable-flink-batch-9-w-0.c.apache-beam-testing.internal:44311 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2942965095152362924.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
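The quoting-heavy `python -c` one-liner above pulls `jobmanager.rpc.port` out of the Flink jobmanager's `/config` response, which is a JSON list of `{"key": ..., "value": ...}` pairs. An equivalent, more readable sketch (using a trimmed copy of the response captured in this trace; the real payload has more keys):

```python
import json

# Trimmed copy of the /jobmanager/config response seen in the trace above.
job_server_config = (
    '[{"key":"jobmanager.rpc.port","value":"44311"},'
    '{"key":"jobmanager.rpc.address","value":'
    '"beam-loadtests-java-portable-flink-batch-9-w-0'
    '.c.apache-beam-testing.internal"}]')

def flink_config_value(config_json, key):
    # Return the value of the first entry whose "key" matches,
    # exactly like the inline `python -c` list comprehension.
    return next(e["value"] for e in json.loads(config_json) if e["key"] == key)

jobmanager_rpc_port = flink_config_value(job_server_config, "jobmanager.rpc.port")
```

For this response the lookup yields `"44311"`, the port the script then forwards over the SSH tunnel.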
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/beam_java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
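As a sanity check on the job title ("2GB of 10B records"), the `--sourceOptions` JSON in the invocation above fully determines the synthetic dataset size: each record is key plus value bytes, times the record count. A quick sketch of that arithmetic:

```python
import json

# --sourceOptions value from the gradlew invocation above.
source_options = json.loads(
    '{"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9}')

# Each synthetic record carries keySizeBytes + valueSizeBytes of payload.
record_size_bytes = (source_options["keySizeBytes"]
                     + source_options["valueSizeBytes"])
total_bytes = source_options["numRecords"] * record_size_bytes
# 200,000,000 records x 10 B/record = 2,000,000,000 B, i.e. the advertised
# "2GB of 10B records".
```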
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution: java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1" payload: "\[email protected]/apache-beam-testing/beam_portability/beam_java_sdk:latest" ]
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution: java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1" payload: "\[email protected]/apache-beam-testing/beam_portability/beam_java_sdk:latest" ]
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
	... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution: java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1" payload: "\[email protected]/apache-beam-testing/beam_portability/beam_java_sdk:latest" ]
	at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
	at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 21s
61 actionable tasks: 8 executed, 7 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/cfmvfhrzd6ius

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
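Editor's note on the failure: per the IllegalArgumentException in the log, the submitted pipeline had a root Read transform bound to a Docker SDK environment, while the portable runner's GreedyPipelineFuser only accepts environment-free, runner-implemented impulse/read primitives as roots. The following is a toy sketch of that invariant as stated by the error message, not Beam's actual code; the function name and shape are invented for illustration:

```python
# URNs the error message names as acceptable runner-implemented roots.
RUNNER_ROOT_URNS = {"beam:transform:impulse:v1", "beam:transform:read:v1"}

def check_root(name, urn, environment_urn=None):
    # A root is acceptable only if it is one of the primitive URNs above
    # AND carries no SDK environment (i.e. the runner executes it itself).
    if urn not in RUNNER_ROOT_URNS or environment_urn is not None:
        raise ValueError(
            "GreedyPipelineFuser requires all root nodes to be "
            "runner-implemented primitives, but transform %s executes in "
            "environment %s" % (name, environment_urn))

# A runner-implemented Impulse root passes the check.
check_root("Impulse", "beam:transform:impulse:v1")

# A Read root bound to a Docker SDK environment, as in this build, fails.
try:
    check_root("Read input", "beam:transform:read:v1", "beam:env:docker:v1")
    rejected = False
except ValueError:
    rejected = True
```

Under this reading, the fix would involve the pipeline translation producing a runner-executed root (or an Impulse-based expansion) rather than an SDK-environment-bound Read.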
