See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/332/display/redirect?page=changes>

Changes:

[shehzaad] [BEAM-10961] enable strict dependency checking for

[samuelw] [BEAM-11910] Increase the bag page limit for continuation pages

[shehzaad] [BEAM-10961] remove dependencies blocks containing, which only contained

[Kenneth Knowles] Attach portable proto to DataflowPipelineJob

[Kenneth Knowles] Unsickbay metrics tests in runner v2

[Kenneth Knowles] Set Dataflow container version correctly in set_version.sh

[Kenneth Knowles] Set Go SDK version in set_version.sh

[Kenneth Knowles] Explain set_version.sh in comments more clearly

[Fokko Driesprong] [BEAM-11926] Improve error when missing Beam schema for BigqueryIO

[Kenneth Knowles] Add knowledge to `git add` to set_version.sh

[Boyuan Zhang] Add more comments to describe PubSubReadPayload and PubSubWritePayload.

[noreply] [BEAM-1251] Use Python 3 semantics in Cython-compiled modules. (#14198)

[noreply] [BEAM-11797] Fixed the flaky test (#14220)

[Ismaël Mejía] [BEAM-9282] Move structured streaming runner into Spark 2 specific

[Ismaël Mejía] [BEAM-9282] Separate modules for Spark 2/3

[Ismaël Mejía] [BEAM-9282] Separate modules for Spark 2/3 job-server

[Ismaël Mejía] [BEAM-9282] Separate modules for Spark 2/3 job-server container

[Ismaël Mejía] [BEAM-7092] Run PostCommit tests for Spark 3 module too

[Ismaël Mejía] [BEAM-7092] Update tests invocation for Spark 2 module

[Ismaël Mejía] [BEAM-9283] Add Spark 3 test jobs to the CI (Java 11)

[Ismaël Mejía] [BEAM-11654] Publish Spark 2 and 3 specific Job-Server containers

[Ismaël Mejía] [BEAM-7092] Add paranamer 2.8 license to container (Spark 3 / Avro)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 153876fda2023cac78b5a111e6ce16dc38895635 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 153876fda2023cac78b5a111e6ce16dc38895635 # timeout=10
Commit message: "Merge pull request #14216: [BEAM-7093] Support Spark 3 in Spark runner"
 > git rev-list --no-walk fffb85a35df6ae3bdb2934c077856f6b27559aa7 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch] $ /bin/bash -xe /tmp/jenkins9096153494819283226.sh
+ echo '*** Load test: ParDo 2GB 100 byte records 10 times ***'
*** Load test: ParDo 2GB 100 byte records 10 times ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.ParDoLoadTest -Prunner=:runners:spark -PwithDataflowWorkerJar=false '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_SparkStructuredStreaming_batch_ParDo_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_sparkstructuredstreaming_batch_ParDo_1 --influxMeasurement=java_batch_pardo_1 --publishToInfluxDB=true --sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90} --iterations=10 --numberOfCounters=1 --numberOfCounterOperations=0 --streaming=false --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --runner=SparkStructuredStreamingRunner' --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
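For context on the invocation above: `-Prunner=:runners:spark` is a Gradle project property that tells the load-test harness which runner module to put on its runtime classpath, and `gradleRun` (named in the failure below) is the configuration that dependency lands in. A minimal sketch of how such a property is typically wired in a `build.gradle`; this is illustrative only, not Beam's actual build script, and the fallback path is an assumption:

```groovy
// Illustrative sketch, not Beam's actual build code: selecting the runner
// module for the load-test run via a -Prunner project property.
def runnerProject = project.findProperty('runner') ?: ':runners:direct-java'  // fallback is hypothetical

dependencies {
    // The selected runner is only needed at run time, so it is attached to a
    // dedicated configuration ('gradleRun' per the log) rather than 'implementation'.
    gradleRun project(path: runnerProject, configuration: 'default')
}
```

With this shape of wiring, the command line's `-Prunner=:runners:spark` resolves the project at that path and pulls in its `default` configuration, which is what the resolution error below is about.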
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* What went wrong:
Could not determine the dependencies of task ':sdks:java:testing:load-tests:run'.
> Could not resolve all task dependencies for configuration ':sdks:java:testing:load-tests:gradleRun'.
   > Could not resolve project :runners:spark.
     Required by:
         project :sdks:java:testing:load-tests
      > Project :sdks:java:testing:load-tests declares a dependency from configuration 'gradleRun' to configuration 'default' which is not declared in the descriptor for project :runners:spark.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15s

Publishing build scan...
https://gradle.com/s/t2457zmt2pilw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
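Reading the failure: the job still passes `-Prunner=:runners:spark`, but this build picked up the BEAM-9282 commits that split the Spark runner into Spark 2/3 sub-modules. A cross-project dependency of the form `project(path: ..., configuration: 'default')` fails at resolution time when the target project no longer exposes a `default` configuration, which is what happens when a path like `:runners:spark` becomes an empty parent of per-version sub-modules. A hedged before/after sketch; the concrete sub-module paths are an assumption based on the Spark 2/3 split, not taken from this log:

```groovy
// Illustrative only, not Beam's actual build code.

// Before the split, this resolved against the monolithic Spark runner;
// after the split, :runners:spark has no 'default' configuration and
// resolution fails exactly as in the log above:
//   gradleRun project(path: ':runners:spark', configuration: 'default')

// The likely fix is for the Jenkins job (or the property's default) to name
// a concrete sub-module; exact paths depend on the post-split layout:
//   gradleRun project(path: ':runners:spark:2', configuration: 'default')
```

In other words, the build script is fine; the job's `-Prunner` value is stale relative to the new module layout, so the first build after such a split fails until the seed job or property is updated.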

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
