See <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/420/display/redirect>
Changes:

------------------------------------------
[...truncated 22.49 KB...]
 ---> Using cache
 ---> eb2846f69899
Successfully built eb2846f69899
Successfully tagged golicenses-java:latest

> Task :sdks:java:container:downloadCloudProfilerAgent
version.txt
NOTICES
profiler_java_agent.so

> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto FROM-CACHE
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :sdks:java:container:goPrepare UP-TO-DATE
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE

> Task :release:go-licenses:java:dockerRun
+ rm -rf '/output/*'
+ export GO111MODULE=off
+ GO111MODULE=off
+ go get github.com/apache/beam/sdks/java/container

> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:container:goBuild

> Task :sdks:java:container:java8:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java8:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java8:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.3.2/userguide/validation_problems.html#implicit_dependency for more details about this problem.

> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:core:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:core:compileTestJava FROM-CACHE
> Task :sdks:java:core:testClasses
> Task :sdks:java:container:generateLicenseReport
> Task :sdks:java:core:shadowTestJar FROM-CACHE
> Task :runners:spark:2:compileJava FROM-CACHE
> Task :runners:spark:2:classes
> Task :runners:spark:2:jar
> Task :runners:core-java:compileTestJava FROM-CACHE
> Task :runners:core-java:testClasses UP-TO-DATE
> Task :runners:core-java:testJar
> Task :runners:core-construction-java:compileTestJava FROM-CACHE
> Task :runners:core-construction-java:testClasses UP-TO-DATE
> Task :runners:core-construction-java:testJar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :sdks:java:container:java8:copyDockerfileDependencies

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/reports/dependency-license> to <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Collecting pip
  Using cached pip-22.0.4-py3-none-any.whl (2.1 MB)
Collecting setuptools
  Using cached setuptools-62.1.0-py3-none-any.whl (1.1 MB)
Collecting wheel
  Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Installing collected packages: pip, setuptools, wheel
  Attempting uninstall: pip
    Found existing installation: pip 20.0.2
    Uninstalling pip-20.0.2:
      Successfully uninstalled pip-20.0.2
  Attempting uninstall: setuptools
    Found existing installation: setuptools 44.0.0
    Uninstalling setuptools-44.0.0:
      Successfully uninstalled setuptools-44.0.0
Successfully installed pip-22.0.4 setuptools-62.1.0 wheel-0.37.1
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.11.1-py3-none-any.whl (128 kB)
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.4.1-cp38-cp38-manylinux1_x86_64.whl (662 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.3.2-py3-none-any.whl (37 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, pyyaml, tenacity, beautifulsoup4
Successfully installed beautifulsoup4-4.11.1 pyyaml-5.4.1 six-1.16.0 soupsieve-2.3.2 tenacity-5.1.5
Executing python <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/reports/dependency-license/index.json> --output_dir=<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/target/java_third_party_licenses> --dep_url_yaml=<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> --manual_license_path=<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/license_scripts/manual_licenses>
INFO:root:Pulling license for 225 dependencies using 16 threads.

> Task :runners:spark:2:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:2:testClasses

> Task :sdks:java:container:pullLicenses
INFO:root:Replaced local file URL with file://<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/license_scripts/manual_licenses/xz/COPYING> for xz-1.5

> Task :runners:spark:2:testJar

> Task :sdks:java:container:pullLicenses
INFO:root:pull_licenses_java.py succeed. It took 10.198848 seconds with 16 threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/sdks/java/container/build/target/third_party_licenses>.
Finished license_scripts.sh

> Task :sdks:java:container:java8:copyJavaThirdPartyLicenses
> Task :runners:portability:java:compileJava
> Task :runners:portability:java:classes
> Task :runners:portability:java:jar
> Task :runners:spark:3:compileJava

> Task :runners:portability:java:compileTestJava
Note: <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/runners/portability/java/src/test/java/org/apache/beam/runners/portability/CloseableResourceTest.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :runners:portability:java:testClasses
> Task :runners:portability:java:testJar

> Task :release:go-licenses:java:dockerRun
package golang.org/x/net/trace: unrecognized import path "golang.org/x/net/trace": reading https://golang.org/x/net/trace?go-get=1: 500 Internal Server Error
package golang.org/x/sys/unix: unrecognized import path "golang.org/x/sys/unix": reading https://golang.org/x/sys/unix?go-get=1: 500 Internal Server Error

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch

> Task :runners:spark:3:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:spark:3:classes

> Task :release:go-licenses:java:dockerRun
package golang.org/x/net/http2: unrecognized import path "golang.org/x/net/http2": reading https://golang.org/x/net/http2?go-get=1: 500 Internal Server Error
package golang.org/x/net/http2/hpack: unrecognized import path "golang.org/x/net/http2/hpack": reading https://golang.org/x/net/http2/hpack?go-get=1: 500 Internal Server Error

> Task :runners:spark:3:jar
> Task :runners:spark:3:compileTestJava FROM-CACHE
> Task :runners:spark:3:testClasses
> Task :runners:spark:3:testJar
> Task :runners:spark:3:job-server:validatesPortableRunnerBatch

> Task :release:go-licenses:java:dockerRun FAILED

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-251159bd-7a9f-438c-8173-2197429c0ef0

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[grpc-default-executor-3] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-8d36f1f6-6bae-4424-b406-e98ea142c123

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-a7a55a9a-6aa6-46ac-94af-0d75a83c68dc

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-b34d4c1d-d625-408c-8442-2617abaaf37a

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-17163b29-c36a-439a-8883-37d54d889e71

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[grpc-default-executor-34] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-f1852f87-7d14-4eb4-b2f3-dbd1fcc87b5f

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[grpc-default-executor-63] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-e871ae31-3435-40ca-8b9b-4f31b8a7a540

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[grpc-default-executor-70] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-c3ac3c52-b20c-42f1-aac9-f5316738bb54

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':release:go-licenses:java:dockerRun'.
> Process 'command 'docker'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness. Please consult deprecation warnings for more details.

BUILD FAILED in 1h 40m 35s

111 actionable tasks: 75 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/w5dw62wasumqq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
