See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/2115/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update shortcode languages from duplicate go to typescript

[cushon] Use a ClassLoadingStrategy that is compatible with Java 17+

[noreply] [Website] update case-studies logo images #22799 (#22793)

[noreply] [Website] change media-query max-width variable to ak-breakpoint-xl

[noreply] [Website] add overflow to code tags #22888 (#22427)

[noreply] Clean up Kafka Cluster and pubsub topic in rc validation script (#23021)

[noreply] Fix assertions in the Spanner IO IT tests (#23098)

[noreply] Use existing pickle_library flag in expansion service. (#23111)


------------------------------------------
[...truncated 679.42 KB...]
        role_urn: "beam:artifact:role:staging_to:v1"
        role_payload: "\n7nashorn-B0CAb3QpJET_6VsT3hcC_89VtvhV616H8N2bHAIaqWo.jar"
      >
      dependencies: <
        type_urn: "beam:artifact:type:file:v1"
        type_payload: "\nG/tmp/artifacts/cldrdata-gclqV9lOGqQOKesG-a6TAUTGEYlyIDqpyd4M_k7gtWo.jar\022@81c96a57d94e1aa40e29eb06f9ae930144c6118972203aa9c9de0cfe4ee0b56a"
        role_urn: "beam:artifact:role:staging_to:v1"
        role_payload: "\n8cldrdata-gclqV9lOGqQOKesG-a6TAUTGEYlyIDqpyd4M_k7gtWo.jar"
      >
      dependencies: <
        type_urn: "beam:artifact:type:file:v1"
        type_payload: "\nD/tmp/artifacts/dnsns-byzoQxdXengTvA5IlbH9gSIzfhNQElRYltuMJN0vjcc.jar\022@6f2ce84317577a7813bc0e4895b1fd8122337e135012545896db8c24dd2f8dc7"
        role_urn: "beam:artifact:role:staging_to:v1"
        role_payload: "\n5dnsns-byzoQxdXengTvA5IlbH9gSIzfhNQElRYltuMJN0vjcc.jar"
      >
      dependencies: <
        type_urn: "beam:artifact:type:file:v1"
        type_payload: "\nr/tmp/artifacts/beam-sdks-java-io-expansion-service-2.43.0-SNAPSHOT-Meq-FF-uYgZuI2xyZzkYmmJpdOL4QQzuqNO_ntKwFH8.jar\022@31eabe145fae62066e236c726739189a626974e2f8410ceea8d3bf9ed2b0147f"
        role_urn: "beam:artifact:role:staging_to:v1"
        role_payload: "\ncbeam-sdks-java-io-expansion-service-2.43.0-SNAPSHOT-Meq-FF-uYgZuI2xyZzkYmmJpdOL4QQzuqNO_ntKwFH8.jar"
      >
    >
  >
  environments: <
    key: "go"
    value: <
      urn: "beam:env:docker:v1"
      payload: "\n\026apache/beam_go_sdk:dev"
      capabilities: "beam:protocol:progress_reporting:v1"
      capabilities: "beam:protocol:multi_core_bundle_processing:v1"
      capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
      capabilities: "beam:protocol:worker_status:v1"
      capabilities: "beam:protocol:monitoring_info_short_ids:v1"
      capabilities: "beam:version:sdk_base:go"
      capabilities: "beam:coder:bytes:v1"
      capabilities: "beam:coder:bool:v1"
      capabilities: "beam:coder:varint:v1"
      capabilities: "beam:coder:double:v1"
      capabilities: "beam:coder:string_utf8:v1"
      capabilities: "beam:coder:length_prefix:v1"
      capabilities: "beam:coder:kv:v1"
      capabilities: "beam:coder:iterable:v1"
      capabilities: "beam:coder:state_backed_iterable:v1"
      capabilities: "beam:coder:windowed_value:v1"
      capabilities: "beam:coder:global_window:v1"
      capabilities: "beam:coder:interval_window:v1"
      capabilities: "beam:coder:row:v1"
      capabilities: "beam:coder:nullable:v1"
      dependencies: <
        type_urn: "beam:artifact:type:file:v1"
        role_urn: "beam:artifact:role:go_worker_binary:v1"
      >
    >
  >
>
root_transform_ids: "s1"
2022/09/09 20:38:48 Cross-compiling <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/sdks/go/test/integration/io/xlang/kafka/kafka_test.go> as /tmp/worker-1-1662755928356302767
2022/09/09 20:38:50 Prepared job with id: go-testkafkaio_basicreadwrite-444_5459fd7d-d543-4cf3-9f5e-3acd6729bab7 and staging token: go-testkafkaio_basicreadwrite-444_5459fd7d-d543-4cf3-9f5e-3acd6729bab7
2022/09/09 20:38:50 Staged binary artifact with token: 
2022/09/09 20:38:50 Submitted job: go0testkafkaio0basicreadwrite0444-jenkins-0909203850-f6a0b054_7ff50945-6ee4-4313-89be-e03dabf33755
2022/09/09 20:38:50 Job state: STOPPED
2022/09/09 20:38:50 Job state: STARTING
2022/09/09 20:38:50 Job state: RUNNING
2022/09/09 20:39:01  (): org.apache.beam.sdk.Pipeline$PipelineExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalStateException: No container running for id 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067
        at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:73)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:104)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:92)
        at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:186)
        at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalStateException: No container running for id 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:451)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:436)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:303)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:38)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:202)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:142)
        at org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:81)
        at org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitions$1(JavaRDDLike.scala:153)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2(RDD.scala:863)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2$adapted(RDD.scala:863)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:131)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
        ... 3 more
Caused by: java.lang.IllegalStateException: No container running for id 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067
        at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:137)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:252)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:231)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        ... 28 more
        Suppressed: java.io.IOException: Received exit code 1 for command 'docker kill 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067'. stderr: Error response from daemon: Cannot kill container: 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067: Container 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067 is not running
                at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:237)
                at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:171)
                at org.apache.beam.runners.fnexecution.environment.DockerCommand.killContainer(DockerCommand.java:151)
                at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:161)
                ... 34 more
2022/09/09 20:39:01  (): java.lang.IllegalStateException: No container running for id 22d098584b043f18a7b0292477cd27fbe623edcf139be29b948cf28b78dba067
2022/09/09 20:39:01 Job state: FAILED
    ptest.go:108: Failed to execute job: job go0testkafkaio0basicreadwrite0444-jenkins-0909203850-f6a0b054_7ff50945-6ee4-4313-89be-e03dabf33755 failed
--- FAIL: TestKafkaIO_BasicReadWrite (15.65s)
FAIL
FAIL    github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/kafka      21.708s
FAIL

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerGoUsingJava FAILED
> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerJavaUsingPython

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$CoGroupByKeyTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$CombineGloballyTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$CombinePerKeyTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$FlattenTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$GroupByKeyTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$MultiInputOutputWithSideInputTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$PartitionTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$PythonDependenciesTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$SingleInputOutputTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.DataframeTransformTest > testDataframeSum FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.PythonMapTest > testPythonMap FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.PythonMapTest > testPythonFlatMap FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.RunInferenceTransformTest > testRunInferenceWithKVs FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.RunInferenceTransformTest > testRunInference FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

16 tests completed, 14 failed, 2 skipped

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerJavaUsingPython FAILED
> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingJava
> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingJava FAILED
> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingSql
> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:spark:3:job-server:sparkJobServerCleanup
> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerCleanup

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/sdks/go/test/build.gradle>' line: 199

* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerGoUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/runners/spark/3/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 29m 13s
263 actionable tasks: 200 executed, 51 from cache, 12 up-to-date

Publishing build scan...
https://gradle.com/s/brfc3tm3zch7k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
