See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/528/display/redirect?page=changes>

Changes:

[kileysok] Update windmill state map to resolve *IfAbsent methods immediately

[samuelw] [BEAM-12942] Validate pubsub messages before they are published in

[clairem] [BEAM-12628] make useReflectApi default to true

[noreply] [BEAM-12856] Change hard-coded limits for reading from a UnboundedReader

[piotr.szczepanik] [BEAM-12356] Fixed last non-cached usage of DatasetService in BigQuery

[noreply] [BEAM-12977] Translates Reshuffle in Portable Mode with Samza native

[noreply] [BEAM-12982] Help users debug which JvmInitializer is running and when.


------------------------------------------
[...truncated 657 B...]
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6beeafff496f69499cdb14bd58c6ac2d9e84d116 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6beeafff496f69499cdb14bd58c6ac2d9e84d116 # timeout=10
Commit message: "Merge pull request #15621: [BEAM-12356] Fixed last non-cached usage of DatasetService in BigQuery WriteTables"
 > git rev-list --no-walk 0111cff88025f0dc783a0890078b769139c8ae36 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch] $ /bin/bash -xe /tmp/jenkins9024016474051387760.sh
+ echo '*** Load test: ParDo 2GB 100 byte records 10 times ***'
*** Load test: ParDo 2GB 100 byte records 10 times ***
[Gradle] - Launching build.
[src] $ 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src/gradlew>
 -PloadTest.mainClass=org.apache.beam.sdk.loadtests.ParDoLoadTest 
-Prunner=:runners:spark:2 '-PloadTest.args=--project=apache-beam-testing 
--appName=load_tests_Java_SparkStructuredStreaming_batch_ParDo_1 
--tempLocation=gs://temp-storage-for-perf-tests/loadtests 
--publishToBigQuery=true --bigQueryDataset=load_test 
--bigQueryTable=java_sparkstructuredstreaming_batch_ParDo_1 
--influxMeasurement=java_batch_pardo_1 --publishToInfluxDB=true 
--sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90} 
--iterations=10 --numberOfCounters=1 --numberOfCounterOperations=0 
--streaming=false --influxDatabase=beam_test_metrics 
--influxHost=http://10.128.0.96:8086 --runner=SparkStructuredStreamingRunner' 
--continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses 
:sdks:java:testing:load-tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:createCheckerFrameworkManifest
> Task :runners:java-job-service:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:spark:2:copyResourcesOverrides NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:java-job-service:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:spark:2:copySourceOverrides
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:spark:2:copyTestResourcesOverrides NO-SOURCE
> Task :runners:spark:2:createCheckerFrameworkManifest
> Task :runners:spark:2:processResources
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto FROM-CACHE
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:spark:2:compileJava FROM-CACHE
> Task :runners:spark:2:classes
> Task :runners:spark:2:jar

> Task :sdks:java:testing:load-tests:run
21/09/30 12:27:56 WARN org.apache.beam.sdk.Pipeline: The following transforms 
do not have stable unique names: ParDo(TimeMonitor)
21/09/30 12:27:56 INFO 
org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner:
 *** SparkStructuredStreamingRunner is based on spark structured streaming 
framework and is no more 
 based on RDD/DStream API. See
 
https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
 It is still experimental, its coverage of the Beam model is partial. ***
21/09/30 12:27:56 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
21/09/30 12:27:58 INFO 
org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator: 
Instantiated metrics accumulator: MetricQueryResults()
21/09/30 12:27:58 INFO 
org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator:
 Instantiated aggregators accumulator: 
Load test results for test (ID): 103f9151-cc51-452d-b0ae-06cdf6397f7b and timestamp: 2021-09-30T12:27:55.877000000Z:

                                      Metric:    Value:
         sparkstructuredstreaming_runtime_sec    51.038
   sparkstructuredstreaming_total_bytes_count     2.0E9
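As a sanity check (an editorial addition, not part of the original log), the `--sourceOptions` passed to the load test above account exactly for the 2.0E9 `total_bytes_count` reported: 20,000,000 synthetic records, each with a 10-byte key and a 90-byte value. A minimal Python sketch of that arithmetic:

```python
import json

# The sourceOptions JSON string from the load-test invocation above.
source_options = json.loads(
    '{"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90}'
)

# Each synthetic record contributes key bytes plus value bytes.
record_bytes = source_options["keySizeBytes"] + source_options["valueSizeBytes"]
total_bytes = source_options["numRecords"] * record_bytes

print(record_bytes)  # 100 bytes per record, matching "100 byte records"
print(total_bytes)   # 2000000000, i.e. the 2.0E9 total_bytes_count / "2GB"
```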

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 28s
84 actionable tasks: 54 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/rbgyt76mcwxfe

Build step 'Invoke Gradle script' changed build result to SUCCESS
[beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch] $ /bin/bash -xe /tmp/jenkins7562761311656128262.sh
+ echo '*** Load test: ParDo 2GB 100 byte records 200  times ***'
*** Load test: ParDo 2GB 100 byte records 200  times ***
[Gradle] - Launching build.
[src] $ 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src/gradlew>
 -PloadTest.mainClass=org.apache.beam.sdk.loadtests.ParDoLoadTest 
-Prunner=:runners:spark:2 '-PloadTest.args=--project=apache-beam-testing 
--appName=load_tests_Java_SparkStructuredStreaming_batch_ParDo_2 
--tempLocation=gs://temp-storage-for-perf-tests/loadtests 
--publishToBigQuery=true --bigQueryDataset=load_test 
--bigQueryTable=java_sparkstructuredstreaming_batch_ParDo_2 
--influxMeasurement=java_batch_pardo_2 --publishToInfluxDB=true 
--sourceOptions={"numRecords":20000000,"keySizeBytes":10,"valueSizeBytes":90} 
--iterations=200 --numberOfCounters=1 --numberOfCounterOperations=0 
--streaming=false --influxDatabase=beam_test_metrics 
--influxHost=http://10.128.0.96:8086 --runner=SparkStructuredStreamingRunner' 
--continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses 
:sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=40a599db-1ef5-49c3-9b67-2ee9399d99a4, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 1001
  log file: /home/jenkins/.gradle/daemon/6.9.1/daemon-1001.out.log
----- Last 20 lines from daemon log file - daemon-1001.out.log -----
2021-09-30T12:28:57.205+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.DefaultDaemonConnection] thread 244: 
Received non-IO message from client: 
Build{id=40a599db-1ef5-49c3-9b67-2ee9399d99a4, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src>}
2021-09-30T12:28:57.206+0000 [INFO] 
[org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler] Received 
command: Build{id=40a599db-1ef5-49c3-9b67-2ee9399d99a4, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src>}.
2021-09-30T12:28:57.206+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler] Starting 
executing command: Build{id=40a599db-1ef5-49c3-9b67-2ee9399d99a4, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src>}
 with connection: socket connection from /127.0.0.1:43343 to /127.0.0.1:55960.
2021-09-30T12:28:57.206+0000 [ERROR] 
[org.gradle.launcher.daemon.server.DaemonStateCoordinator] Command execution: 
started DaemonCommandExecution[command = 
Build{id=40a599db-1ef5-49c3-9b67-2ee9399d99a4, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src>},
 connection = DefaultDaemonConnection: socket connection from /127.0.0.1:43343 
to /127.0.0.1:55960] after 0.0 minutes of idle
2021-09-30T12:28:57.207+0000 [INFO] 
[org.gradle.launcher.daemon.server.DaemonRegistryUpdater] Marking the daemon as 
busy, address: [32e715c5-02db-46e7-a49d-ffd4e1c3ba4c port:43343, 
addresses:[localhost/127.0.0.1]]
2021-09-30T12:28:57.207+0000 [DEBUG] 
[org.gradle.launcher.daemon.registry.PersistentDaemonRegistry] Marking busy by 
address: [32e715c5-02db-46e7-a49d-ffd4e1c3ba4c port:43343, 
addresses:[localhost/127.0.0.1]]
2021-09-30T12:28:57.207+0000 [DEBUG] 
[org.gradle.cache.internal.DefaultFileLockManager] Waiting to acquire exclusive 
lock on daemon addresses registry.
2021-09-30T12:28:57.207+0000 [DEBUG] 
[org.gradle.cache.internal.DefaultFileLockManager] Lock acquired on daemon 
addresses registry.
2021-09-30T12:28:57.208+0000 [DEBUG] 
[org.gradle.cache.internal.DefaultFileLockManager] Releasing lock on daemon 
addresses registry.
2021-09-30T12:28:57.208+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.DaemonStateCoordinator] resetting idle timer
2021-09-30T12:28:57.208+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.DaemonStateCoordinator] daemon is running. 
Sleeping until state changes.
2021-09-30T12:28:57.208+0000 [INFO] 
[org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy] Daemon is 
about to start building Build{id=40a599db-1ef5-49c3-9b67-2ee9399d99a4, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_SparkStructuredStreaming_Batch/ws/src>}.
 Dispatching build started information...
2021-09-30T12:28:57.209+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.SynchronizedDispatchConnection] thread 27: 
dispatching org.gradle.launcher.daemon.protocol.BuildStarted@3b751847
2021-09-30T12:28:57.209+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment] Configuring 
env variables: [PATH, INFLUXDB_USER, RUN_DISPLAY_URL, HUDSON_HOME, 
RUN_CHANGES_DISPLAY_URL, JOB_URL, HUDSON_COOKIE, MAIL, JENKINS_SERVER_COOKIE, 
LOGNAME, PWD, RUN_TESTS_DISPLAY_URL, JENKINS_URL, SHELL, BUILD_TAG, 
ROOT_BUILD_CAUSE, BUILD_CAUSE_TIMERTRIGGER, OLDPWD, GIT_CHECKOUT_DIR, 
JENKINS_HOME, sha1, NODE_NAME, BUILD_DISPLAY_NAME, JOB_DISPLAY_URL, GIT_BRANCH, 
SETUPTOOLS_USE_DISTUTILS, SHLVL, WORKSPACE_TMP, GIT_PREVIOUS_COMMIT, 
INFLUXDB_USER_PASSWORD, JAVA_HOME, BUILD_ID, LANG, XDG_SESSION_ID, JOB_NAME, 
SPARK_LOCAL_IP, BUILD_CAUSE, NODE_LABELS, HUDSON_URL, WORKSPACE, 
ROOT_BUILD_CAUSE_TIMERTRIGGER, _, GIT_COMMIT, CI, EXECUTOR_NUMBER, 
HUDSON_SERVER_COOKIE, SSH_CLIENT, JOB_BASE_NAME, USER, SSH_CONNECTION, 
BUILD_NUMBER, BUILD_URL, RUN_ARTIFACTS_DISPLAY_URL, GIT_URL, XDG_RUNTIME_DIR, 
HOME]
2021-09-30T12:28:57.210+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.exec.LogToClient] About to start relaying 
all logs to the client via the connection.
2021-09-30T12:28:57.210+0000 [INFO] 
[org.gradle.launcher.daemon.server.exec.LogToClient] The client will now 
receive all logging from the daemon (pid: 1001). The daemon log file: 
/home/jenkins/.gradle/daemon/6.9.1/daemon-1001.out.log
2021-09-30T12:28:57.212+0000 [INFO] 
[org.gradle.launcher.daemon.server.exec.LogAndCheckHealth] Starting 2nd build 
in daemon [uptime: 7 mins 4.522 secs, performance: 100%, GC rate: 0.24/s, heap 
usage: 1% of 2.6 GiB]
2021-09-30T12:28:57.213+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.exec.ExecuteBuild] The daemon has started 
executing the build.
2021-09-30T12:28:57.213+0000 [DEBUG] 
[org.gradle.launcher.daemon.server.exec.ExecuteBuild] Executing build with 
daemon context: 
DefaultDaemonContext[uid=304bbccf-1721-472f-aa78-0477d885fae1,javaHome=/usr/lib/jvm/java-8-openjdk-amd64,daemonRegistryDir=/home/jenkins/.gradle/daemon,pid=1001,idleTimeout=10800000,priority=NORMAL,daemonOpts=-Xmx4g,-Dfile.encoding=UTF-8,-Duser.country=US,-Duser.language=en,-Duser.variant]
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
> Task :buildSrc:compileGroovy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
