See <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/21/display/redirect?page=changes>

Changes:

[dpcollins] [BEAM-13402] Simplify PubsubLiteSink

[Kyle Weaver] [BEAM-13571] Fix ClassNotFound exception in Flink tests

[Kyle Weaver] [BEAM-13498] [BEAM-13573] exclude new tests on Flink

[noreply] Exclude UsesOnWindowExpiration by category from Dataflow v2 streaming

[noreply] [BEAM-13052] Increment pubsub python version and fix breakages. (#16126)

[noreply] [BEAM-13052] Add Pub/Sub Lite xlang transforms in python (#15727)

[noreply] [BEAM-13402] Version bump Pub/Sub Lite and implement changes to ensure


------------------------------------------
[...truncated 23.68 KB...]
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :sdks:java:core:compileTestJava

> Task :runners:java-job-service:compileJava
Note: <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/runners/java-job-service/src/main/java/org/apache/beam/runners/jobsubmission/PortablePipelineJarCreator.java> uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
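For reference, the -Xlint details mentioned in these compiler notes can be surfaced build-wide by passing the flags to every compile task. A minimal sketch for a standard Gradle Java project (illustrative only, not the actual Beam build configuration):

    // build.gradle (Groovy DSL) -- illustrative sketch
    tasks.withType(JavaCompile).configureEach {
        // Show the "unchecked or unsafe operations" and deprecation details
        // that the Notes above refer to.
        options.compilerArgs += ['-Xlint:unchecked', '-Xlint:deprecation']
    }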

> Task :runners:java-job-service:classes
> Task :runners:java-job-service:jar
> Task :runners:portability:java:compileJava
> Task :runners:spark:2:compileJava
> Task :runners:spark:3:compileJava
> Task :runners:portability:java:classes
> Task :runners:portability:java:jar

> Task :runners:spark:3:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:3:classes
> Task :runners:spark:3:jar

> Task :runners:spark:2:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:2:classes
> Task :runners:spark:2:jar

> Task :sdks:java:core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:testClasses
> Task :sdks:java:core:shadowTestJar FROM-CACHE
> Task :runners:core-java:compileTestJava FROM-CACHE
> Task :runners:core-java:testClasses UP-TO-DATE
> Task :runners:core-java:testJar
> Task :runners:core-construction-java:compileTestJava FROM-CACHE
> Task :runners:core-construction-java:testClasses UP-TO-DATE
> Task :runners:core-construction-java:testJar

> Task :runners:portability:java:compileTestJava
Note: <https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/runners/portability/java/src/test/java/org/apache/beam/runners/portability/CloseableResourceTest.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :runners:portability:java:testClasses
> Task :runners:portability:java:testJar

> Task :runners:spark:2:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:3:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:3:testClasses
> Task :runners:spark:2:testClasses
> Task :runners:spark:3:testJar
> Task :runners:spark:2:testJar
> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
> Task :runners:spark:2:job-server:validatesPortableRunnerBatch

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testMultipleStatefulOrderingWithShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:280

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testSingleCallOrderingWithoutShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:193

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testSingleCallOrderingWithShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:135

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testMultipleStatefulOrderingWithoutShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:321

org.apache.beam.runners.core.metrics.MetricsPusherTest > pushesUserMetrics FAILED
    java.lang.RuntimeException at MetricsPusherTest.java:70

org.apache.beam.runners.core.metrics.MetricsPusherTest > pushesSystemMetrics FAILED
    java.lang.RuntimeException at MetricsPusherTest.java:91
[shutdown-hook-0] INFO org.apache.spark.SparkContext - Invoking stop() from shutdown hook
[dispatcher-event-loop-2] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[shutdown-hook-0] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[shutdown-hook-0] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[shutdown-hook-0] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[shutdown-hook-0] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-2c49f868-1835-4cf6-9ee6-d67c04d300bc

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testMultipleStatefulOrderingWithShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:280

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testSingleCallOrderingWithoutShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:193

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testSingleCallOrderingWithShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:135

org.apache.beam.sdk.transforms.PerKeyOrderingTest > testMultipleStatefulOrderingWithoutShuffle FAILED
    java.lang.RuntimeException at PerKeyOrderingTest.java:321

org.apache.beam.runners.core.metrics.MetricsPusherTest > pushesUserMetrics FAILED
    java.lang.RuntimeException at MetricsPusherTest.java:70

org.apache.beam.runners.core.metrics.MetricsPusherTest > pushesSystemMetrics FAILED
    java.lang.RuntimeException at MetricsPusherTest.java:91
[shutdown-hook-0] INFO org.apache.spark.SparkContext - Invoking stop() from shutdown hook
[dispatcher-event-loop-1] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[shutdown-hook-0] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[shutdown-hook-0] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[shutdown-hook-0] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[shutdown-hook-0] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-13f49547-1ef7-4ef7-a36d-94a79bb335c6

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-e0e8c5b5-211b-471a-81bd-63f5d372abf0

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-7c568cef-5b12-41a7-9fe4-6581273f2004

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch

org.apache.beam.sdk.transforms.ParDoTest$TimestampTests > testProcessElementSkew FAILED
    java.lang.RuntimeException at ParDoTest.java:2146

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch

org.apache.beam.sdk.transforms.ParDoTest$TimestampTests > testProcessElementSkew FAILED
    java.lang.RuntimeException at ParDoTest.java:2146

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[grpc-default-executor-16] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-d6eaccdb-4b5a-4208-a980-697439a14ea9

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[grpc-default-executor-18] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-87059c6e-8d0f-49e3-971f-6bd441c73955

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch
[grpc-default-executor-18] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-24743754-0118-4b85-9df3-e0a257cae906

232 tests completed, 7 failed, 1 skipped

> Task :runners:spark:2:job-server:validatesPortableRunnerBatch FAILED

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch
[grpc-default-executor-21] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[shutdown-hook-0] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-9317a5fa-50df-4559-8bf6-c38b726b3e24

232 tests completed, 7 failed, 1 skipped

> Task :runners:spark:3:job-server:validatesPortableRunnerBatch FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:2:job-server:validatesPortableRunnerBatch'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/runners/spark/2/job-server/build/reports/tests/validatesPortableRunnerBatch/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
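To reproduce a failing suite locally with the extra diagnostics Gradle suggests above, a command along these lines should work from the Beam repository root (illustrative sketch; adjust the task path and flags as needed):

    ./gradlew :runners:spark:2:job-server:validatesPortableRunnerBatch --stacktrace --info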
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesPortableRunnerBatch'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/ws/src/runners/spark/3/job-server/build/reports/tests/validatesPortableRunnerBatch/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
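For example, the same build can be rerun with deprecation details enabled (illustrative; '--warning-mode all' is a standard Gradle flag, the task path is just an example from this job):

    ./gradlew :runners:spark:2:job-server:validatesPortableRunnerBatch --warning-mode all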

Execution optimizations have been disabled for 4 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 11m 52s
80 actionable tasks: 50 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jnkd7ngajj77y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
