See
<https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1162/display/redirect?page=changes>
Changes:
[Pablo Estrada] Updating BigQuery client for Python
[Andrew Pilloud] [BEAM-11165] ZetaSQL Calc only convert referenced columns
[Robin Qiu] Support read/write ZetaSQL DATETIME/NUMERIC types from/to BigQuery
[Robin Qiu] Address comments
[Kenneth Knowles] Suppress nullness errors in new files since last round of
suppressions
[Kenneth Knowles] Fix position of @Nullable annotations since last round
[Kenneth Knowles] Exclude nonexistent org.checkerframework:jdk8 from all
configurations
[Kenneth Knowles] Fix nullness error in Kotlin WriteOneFilePerWindow
[Kenneth Knowles] Allow checkerframework on API surfaces
[Kenneth Knowles] Enable checkerframework globally
[je.ik] [BEAM-11191] fix ClassCastException when clearing watermark state
[noreply] [BEAM-3736] Let users know that CombineFn.setup and teardown are not
[noreply] [BEAM-11151] Adds the ToString well-known transform URN (#13214)
[noreply] Merge pull request #13164 from Refactoring BigQuery Read utilities
into
[Robert Burke] Moving to 2.27.0-SNAPSHOT on master branch.
[Andrew Pilloud] [BEAM-11165] Use the ZetaSQL Streaming API synchronously
[noreply] [BEAM-11159] Use GCP pubsub client for TestPubsub (#13273)
------------------------------------------
[...truncated 3.78 MB...]
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 44 ms on localhost (executor
driver) (1/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 45 ms on localhost (executor
driver) (2/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 45 ms on localhost (executor
driver) (3/4)
Nov 06, 2020 11:05:56 AM
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/5ac80302-6cb3-41ba-8407-e12a67381145,
shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3,
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0,
onTimeIndex=0}} to final location
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00000-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/24b867c2-b8b5-4ee2-b3d4-41b95169dfb8,
shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3,
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0,
onTimeIndex=0}} to final location
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00001-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/f47c807a-28f8-4b57-856d-2af8f82e2519,
shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3,
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0,
onTimeIndex=0}} to final location
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00002-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/37babf3d-f394-4103-b3fe-bed925835cbe,
shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@59023e3,
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0,
onTimeIndex=0}} to final location
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/counts-00003-of-00004
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/5ac80302-6cb3-41ba-8407-e12a67381145
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/37babf3d-f394-4103-b3fe-bed925835cbe
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/f47c807a-28f8-4b57-856d-2af8f82e2519
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/24b867c2-b8b5-4ee2-b3d4-41b95169dfb8
Nov 06, 2020 11:05:56 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
WARNING: Failed to match temporary files under:
[/tmp/groovy-generated-7264449248025570994-tmpdir/word-count-beam/.temp-beam-352271a3-9096-4c8d-a85c-1652a5693f3f/].
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 114 ms on localhost (executor
driver) (4/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191)
finished in 0.127 s
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output
MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no
missing parents
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB,
free 13.5 GB)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3
KB, free 13.5 GB)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:45073 (size: 7.3 KB,
free: 13.5 GB)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output
MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15
tasks are for partitions Vector(0, 1, 2, 3))
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver,
partition 0, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver,
partition 1, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver,
partition 2, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver,
partition 3, NODE_LOCAL, 7938 bytes)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor
driver) (1/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 16 ms on localhost (executor
driver) (2/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 17 ms on localhost (executor
driver) (3/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 18 ms on localhost (executor
driver) (4/4)
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.030 s
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.208097 s
Nov 06, 2020 11:05:56 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 06, 2020 11:05:56 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@2ea562da{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 06, 2020 11:05:56 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
> Task :runners:direct-java:runMobileGamingJavaDirect
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java
(default-cli) on project word-count-beam: An exception occurred while executing
the Java class. java.lang.NoSuchMethodError:
com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder;
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
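The NoSuchMethodError above points at a Guava version conflict on the quickstart
classpath rather than at the generated word-count code: the missing descriptor is
the java.time.Duration overload of CacheBuilder.expireAfterWrite (added around
Guava 25.0), so code compiled against a newer Guava fails at runtime when an older
guava jar wins dependency resolution. A minimal Java sketch of the call shape
involved (hypothetical class and variable names, not Beam or quickstart code):

import java.time.Duration;
import java.util.concurrent.TimeUnit;

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

// Hypothetical repro sketch: the Duration overload below is the method named in
// the NoSuchMethodError; it resolves at compile time against a recent Guava but
// is absent at runtime if an older Guava ends up on the classpath.
public class GuavaDurationOverloadSketch {
  public static void main(String[] args) {
    // Call shape that fails with NoSuchMethodError on older Guava:
    Cache<String, String> cache =
        CacheBuilder.newBuilder()
            .expireAfterWrite(Duration.ofMinutes(10))
            .build();

    // Equivalent overload that exists in much older Guava versions as well:
    Cache<String, String> fallback =
        CacheBuilder.newBuilder()
            .expireAfterWrite(10, TimeUnit.MINUTES)
            .build();

    cache.put("word", "count");
    fallback.put("word", "count");
    System.out.println(cache.getIfPresent("word") + " / " + fallback.getIfPresent("word"));
  }
}

The same mismatch would also explain the DataflowRunner#fromOptions
InvocationTargetException reported below, which wraps the identical
expireAfterWrite descriptor.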
> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java
(default-cli) on project word-count-beam: An exception occurred while executing
the Java class. Failed to construct instance from factory method
DataflowRunner#fromOptions(interface
org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException:
com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder;
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java
(default-cli) on project word-count-beam: An exception occurred while executing
the Java class. Failed to construct instance from factory method
DataflowRunner#fromOptions(interface
org.apache.beam.sdk.options.PipelineOptions): InvocationTargetException:
com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder;
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> Task :runners:direct-java:runMobileGamingJavaDirect FAILED
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java
(default-cli) on project word-count-beam: An exception occurred while executing
the Java class. java.lang.NoSuchMethodError:
com.google.common.cache.CacheBuilder.expireAfterWrite(Ljava/time/Duration;)Lcom/google/common/cache/CacheBuilder;
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command
> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED
[ERROR] Failed command
> Task :runners:twister2:runQuickstartJavaTwister2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/tmp/groovy-generated-8920322553541514689-tmpdir/.m2/repository/org/slf4j/slf4j-jdk14/1.7.30/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/tmp/groovy-generated-8920322553541514689-tmpdir/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Nov 06, 2020 11:10:16 AM org.apache.beam.runners.twister2.Twister2Runner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 376 files. Enable logging at DEBUG level to see which
files will be staged.
Nov 06, 2020 11:10:17 AM org.apache.beam.runners.twister2.Twister2Runner run
INFO: Translating pipeline to Twister2 program.
Nov 06, 2020 11:10:28 AM org.apache.beam.runners.twister2.Twister2Runner run
WARNING: Twister2 Local Mode currently only supports single worker
Nov 06, 2020 11:10:28 AM edu.iu.dsc.tws.rsched.core.ResourceAllocator loadConfig
INFO: Loaded configuration with twister2_home: /tmp and configuration:
/tmp/conf/ and cluster: standalone
grep Foundation counts*
Foundation: 1
Verified Foundation: 1
[SUCCESS]
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 12s
8 actionable tasks: 7 executed, 1 from cache
Publishing build scan...
https://gradle.com/s/gdic46w3qilq4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure