See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/959/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update table text content overflow #23460

[Moritz Mack] [Spark dataset runner] Fix translation to run in the evaluation thread

[Moritz Mack] [Metrics] Add 'performance tests' tag to JMH dashboard (related to

[noreply] Bump github.com/aws/aws-sdk-go-v2/credentials in /sdks (#24318)

[noreply] Update apache beam installation in notebook (#24336)

[noreply] Adds GCP core dependency to the test expansion service (#24308)

[noreply] Update dataflow containers to coincide with objsize 0.6.1 update

[noreply] Add test configurations for deterministic outputs on Dataflow (#24325)


------------------------------------------
[...truncated 826.37 KB...]
          {
            "location": "gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/test26191/go-5-1669238350919156219/xlang/jaccess-Rlp7GfuSFOBCBGdO8af2XxrTz5LqaQqwSMAXumdbHN0.jar",
            "name": "jaccess-Rlp7GfuSFOBCBGdO8af2XxrTz5LqaQqwSMAXumdbHN0.jar"
          },
          {
            "location": "gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/test26191/go-5-1669238350919156219/xlang/localedata-Nuzln2shadVmT-PppqaNc7GHBbUyMUb9pXsJTjRiAzs.jar",
            "name": "localedata-Nuzln2shadVmT-PppqaNc7GHBbUyMUb9pXsJTjRiAzs.jar"
          },
          {
            "location": "gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/test26191/go-5-1669238350919156219/xlang/nashorn-1iQRhJvU2nRCSB4ucg6f4b69TGiwE1yq0_LGG2MU5N0.jar",
            "name": "nashorn-1iQRhJvU2nRCSB4ucg6f4b69TGiwE1yq0_LGG2MU5N0.jar"
          },
          {
            "location": "gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/test26191/go-5-1669238350919156219/xlang/cldrdata-MTK3mMuUzlWYlYkweGQNzI7dFpuLG0_8F8oxT0khQDc.jar",
            "name": "cldrdata-MTK3mMuUzlWYlYkweGQNzI7dFpuLG0_8F8oxT0khQDc.jar"
          }
        ],
        "sdkHarnessContainerImages": [
          {
            "containerImage": "us.gcr.io/apache-beam-testing/jenkins/beam_go_sdk:20221123-204910"
          },
          {
            "containerImage": "us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221123202611"
          }
        ],
        "workerHarnessContainerImage": "us.gcr.io/apache-beam-testing/jenkins/beam_go_sdk:20221123-204910"
      }
    ]
  },
  "name": "go-testxlang_flatten-125",
  "projectId": "apache-beam-testing",
  "type": "JOB_TYPE_BATCH"
}
2022/11/23 21:19:21 Submitted job: 2022-11-23_13_19_21-100758232470030708
2022/11/23 21:19:21 Console: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-23_13_19_21-100758232470030708?project=apache-beam-testing
2022/11/23 21:19:21 Logs: https://console.cloud.google.com/logs/viewer?project=apache-beam-testing&resource=dataflow_step%2Fjob_id%2F2022-11-23_13_19_21-100758232470030708
2022/11/23 21:19:21 Job state: JOB_STATE_PENDING ...
2022/11/23 21:19:51 Job still running ...
2022/11/23 21:20:21 Job still running ...
2022/11/23 21:20:51 Job still running ...
2022/11/23 21:21:21 Job still running ...
2022/11/23 21:21:51 Job still running ...
2022/11/23 21:22:21 Job still running ...
2022/11/23 21:22:51 Job still running ...
2022/11/23 21:23:21 Job still running ...
2022/11/23 21:23:51 Job still running ...
2022/11/23 21:24:22 Job 2022-11-23_13_19_21-100758232470030708 succeeded!
--- PASS: TestXLang_Flatten (311.30s)
=== RUN   TestXLang_GroupBy
--- FAIL: TestXLang_GroupBy (0.00s)
panic:  tried cross-language for beam:transforms:xlang:test:gbk against localhost:38609 and failed
        expanding external transform
        expanding transform with ExpansionRequest: 
components:{pcollections:{key:"n3"  value:{unique_name:"n3"  
coder_id:"c2@LKFAQWbvXA"  is_bounded:BOUNDED  
windowing_strategy_id:"w0@LKFAQWbvXA"}}  
windowing_strategies:{key:"w0@LKFAQWbvXA"  
value:{window_fn:{urn:"beam:window_fn:global_windows:v1"}  
merge_status:NON_MERGING  window_coder_id:"c3@LKFAQWbvXA"  trigger:{default:{}} 
 accumulation_mode:DISCARDING  output_time:END_OF_WINDOW  
closing_behavior:EMIT_IF_NONEMPTY  on_time_behavior:FIRE_IF_NONEMPTY  
environment_id:"go"}}  coders:{key:"c0@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:string_utf8:v1"}}}  coders:{key:"c1@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:varint:v1"}}}  coders:{key:"c2@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:kv:v1"}  component_coder_ids:"c0@LKFAQWbvXA"  
component_coder_ids:"c1@LKFAQWbvXA"}}  coders:{key:"c3@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:global_window:v1"}}}}  
transform:{unique_name:"External"  spec:{urn:"beam:transforms:xlang:test:gbk"}  
inputs:{key:"i0"  value:"n3"}  environment_id:"go"}  namespace:"LKFAQWbvXA"
expansion failed
        caused by:
rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:38609: connect: connection refused" [recovered]
        panic:  tried cross-language for beam:transforms:xlang:test:gbk against localhost:38609 and failed
        expanding external transform
        expanding transform with ExpansionRequest: 
components:{pcollections:{key:"n3"  value:{unique_name:"n3"  
coder_id:"c2@LKFAQWbvXA"  is_bounded:BOUNDED  
windowing_strategy_id:"w0@LKFAQWbvXA"}}  
windowing_strategies:{key:"w0@LKFAQWbvXA"  
value:{window_fn:{urn:"beam:window_fn:global_windows:v1"}  
merge_status:NON_MERGING  window_coder_id:"c3@LKFAQWbvXA"  trigger:{default:{}} 
 accumulation_mode:DISCARDING  output_time:END_OF_WINDOW  
closing_behavior:EMIT_IF_NONEMPTY  on_time_behavior:FIRE_IF_NONEMPTY  
environment_id:"go"}}  coders:{key:"c0@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:string_utf8:v1"}}}  coders:{key:"c1@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:varint:v1"}}}  coders:{key:"c2@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:kv:v1"}  component_coder_ids:"c0@LKFAQWbvXA"  
component_coder_ids:"c1@LKFAQWbvXA"}}  coders:{key:"c3@LKFAQWbvXA"  
value:{spec:{urn:"beam:coder:global_window:v1"}}}}  
transform:{unique_name:"External"  spec:{urn:"beam:transforms:xlang:test:gbk"}  
inputs:{key:"i0"  value:"n3"}  environment_id:"go"}  namespace:"LKFAQWbvXA"
expansion failed
        caused by:
rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:38609: connect: connection refused"

goroutine 1177 [running]:
testing.tRunner.func1.2({0xda6580, 0xc0002a7040})
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1396 +0x24e
testing.tRunner.func1()
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1399 +0x39f
panic({0xda6580, 0xc0002a7040})
        /home/jenkins/sdk/go1.19.3/src/runtime/panic.go:884 +0x212
github.com/apache/beam/sdks/v2/go/pkg/beam.CrossLanguage({0xc000542c40?, 0xc00094b2c0?}, {0xed0de0, 0x1e}, {0x0?, 0xc0000c8e10?, 0x1?}, {0x7fff74d4d46b, 0xf}, 0xc000a0fc08, ...)
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/pkg/beam/xlang.go>:162 +0x136
github.com/apache/beam/sdks/v2/go/examples/xlang.GroupByKey({0xc000542a40?, 0xc00094b2c0?}, {0x7fff74d4d46b, 0xf}, {0xc00097fc70?})
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/examples/xlang/transforms.go>:89 +0x3ee
github.com/apache/beam/sdks/v2/go/test/integration/xlang.TestXLang_GroupBy(0xc000362340)
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/test/integration/xlang/xlang_test.go>:217 +0x2b7
testing.tRunner(0xc000362340, 0xf19e48)
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1446 +0x10b
created by testing.(*T).Run
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1493 +0x35f
FAIL    github.com/apache/beam/sdks/v2/go/test/integration/xlang        1975.409s
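For context on the failure above: the panic is raised at pipeline-construction time, when the Go SDK sends an ExpansionRequest for beam:transforms:xlang:test:gbk to the expansion service expected at localhost:38609 and the dial is refused, so nothing is ever submitted to Dataflow. The sketch below shows the general shape of that call using the Go SDK's cross-language helpers; it is a minimal illustration, and the input data and output type are placeholders rather than the exact ones used by examples/xlang.GroupByKey.

package main

import (
    "reflect"

    "github.com/apache/beam/sdks/v2/go/pkg/beam"
    "github.com/apache/beam/sdks/v2/go/pkg/beam/core/typex"
)

func main() {
    beam.Init()
    _, s := beam.NewPipelineWithRoot()

    // Placeholder input; the real test builds a keyed PCollection for the GBK.
    in := beam.CreateList(s, []string{"a", "b", "a"})

    // Expansion happens here, at construction time: the SDK sends an
    // ExpansionRequest to expansionAddr. If nothing is listening there (as in
    // the log above, "connection refused" on 127.0.0.1:38609),
    // beam.CrossLanguage panics before any job is submitted.
    expansionAddr := "localhost:38609"
    outputs := beam.CrossLanguage(
        s,
        "beam:transforms:xlang:test:gbk", // URN the expansion service must have registered
        nil,                              // this test transform takes no configuration payload
        expansionAddr,
        beam.UnnamedInput(in),
        // Placeholder output type; the real transform emits grouped key/value pairs.
        beam.UnnamedOutput(typex.New(reflect.TypeOf(""))),
    )
    _ = outputs[beam.UnnamedOutputTag()]
}

Since TestXLang_Flatten expanded and ran successfully earlier in this log, the refused connection suggests the test expansion service was no longer reachable by the time TestXLang_GroupBy attempted its expansion, rather than a problem with the transform itself.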
=== RUN   TestBigQueryIO_BasicWriteRead
    bigquery_test.go:212: Created BigQuery table apache-beam-testing.beam_bigquery_io_test_temp.go_bqio_it_temp_1669236713902039214
--- FAIL: TestBigQueryIO_BasicWriteRead (18.74s)
panic:  tried cross-language for beam:transform:org.apache.beam:schemaio_bigquery_write:v1 against localhost:35537 and failed
        expanding external transform
        expanding transform with ExpansionRequest: 
components:{pcollections:{key:"n2"  value:{unique_name:"n2"  
coder_id:"c0@OjdiVXtHXX"  is_bounded:BOUNDED  
windowing_strategy_id:"w0@OjdiVXtHXX"}}  
windowing_strategies:{key:"w0@OjdiVXtHXX"  
value:{window_fn:{urn:"beam:window_fn:global_windows:v1"}  
merge_status:NON_MERGING  window_coder_id:"c1@OjdiVXtHXX"  trigger:{default:{}} 
 accumulation_mode:DISCARDING  output_time:END_OF_WINDOW  
closing_behavior:EMIT_IF_NONEMPTY  on_time_behavior:FIRE_IF_NONEMPTY  
environment_id:"go"}}  coders:{key:"c0@OjdiVXtHXX"  
value:{spec:{urn:"beam:coder:row:v1"  
payload:"\n\r\n\x07counter\x1a\x02\x10\x04\nZ\n\trand_data\x1aM2K\nI\n\n\n\x04flip\x1a\x02\x10\x08\n\t\n\x03num\x1a\x02\x10\x04\n\n\n\x04word\x1a\x02\x10\x07\x12$293279a3-cab2-4082-b6be-1729f646ba84\x12$40dfa07a-fadd-4e7a-ac47-7534633021ad"}}}
  coders:{key:"c1@OjdiVXtHXX"  
value:{spec:{urn:"beam:coder:global_window:v1"}}}}  
transform:{unique_name:"External"  
spec:{urn:"beam:transform:org.apache.beam:schemaio_bigquery_write:v1"  
payload:"\nX\n\x0e\n\x08location\x1a\x02\x10\x07\n\x0c\n\x06config\x1a\x02\x10\t\n\x12\n\ndataSchema\x1a\x04\x08\x01\x10\t\x12$3fc24beb-ef0b-4fd0-b491-64340b33ca1e\x12a\x03\x01\x04\x00\\\x04\x01\x06Rapache-beam-testing.beam_bigquery_io_test_temp.go_bqio_it_temp_1669236713902039214\x05Never"}
  inputs:{key:"i0"  value:"n2"}  environment_id:"go"}  namespace:"OjdiVXtHXX"
expansion failed
        caused by:
java.lang.IllegalArgumentException: Triggering frequency or number of file shards can be specified only when writing an unbounded PCollection via FILE_LOADS or STORAGE_API_WRITES, but: the collection was BOUNDED and the method was STORAGE_WRITE_API
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2868)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1932)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:548)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:482)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:360)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider$BigQuerySchemaIO$2.expand(BigQuerySchemaIOProvider.java:227)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider$BigQuerySchemaIO$2.expand(BigQuerySchemaIOProvider.java:196)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:548)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.expansion.service.ExpansionService$TransformProvider.apply(ExpansionService.java:396)
        at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:516)
        at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:596)
        at org.apache.beam.model.expansion.v1.ExpansionServiceGrpc$MethodHandlers.invoke(ExpansionServiceGrpc.java:220)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:354)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:866)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750) [recovered]
        panic:  tried cross-language for beam:transform:org.apache.beam:schemaio_bigquery_write:v1 against localhost:35537 and failed
        expanding external transform
        expanding transform with ExpansionRequest: 
components:{pcollections:{key:"n2"  value:{unique_name:"n2"  
coder_id:"c0@OjdiVXtHXX"  is_bounded:BOUNDED  
windowing_strategy_id:"w0@OjdiVXtHXX"}}  
windowing_strategies:{key:"w0@OjdiVXtHXX"  
value:{window_fn:{urn:"beam:window_fn:global_windows:v1"}  
merge_status:NON_MERGING  window_coder_id:"c1@OjdiVXtHXX"  trigger:{default:{}} 
 accumulation_mode:DISCARDING  output_time:END_OF_WINDOW  
closing_behavior:EMIT_IF_NONEMPTY  on_time_behavior:FIRE_IF_NONEMPTY  
environment_id:"go"}}  coders:{key:"c0@OjdiVXtHXX"  
value:{spec:{urn:"beam:coder:row:v1"  
payload:"\n\r\n\x07counter\x1a\x02\x10\x04\nZ\n\trand_data\x1aM2K\nI\n\n\n\x04flip\x1a\x02\x10\x08\n\t\n\x03num\x1a\x02\x10\x04\n\n\n\x04word\x1a\x02\x10\x07\x12$293279a3-cab2-4082-b6be-1729f646ba84\x12$40dfa07a-fadd-4e7a-ac47-7534633021ad"}}}
  coders:{key:"c1@OjdiVXtHXX"  
value:{spec:{urn:"beam:coder:global_window:v1"}}}}  
transform:{unique_name:"External"  
spec:{urn:"beam:transform:org.apache.beam:schemaio_bigquery_write:v1"  
payload:"\nX\n\x0e\n\x08location\x1a\x02\x10\x07\n\x0c\n\x06config\x1a\x02\x10\t\n\x12\n\ndataSchema\x1a\x04\x08\x01\x10\t\x12$3fc24beb-ef0b-4fd0-b491-64340b33ca1e\x12a\x03\x01\x04\x00\\\x04\x01\x06Rapache-beam-testing.beam_bigquery_io_test_temp.go_bqio_it_temp_1669236713902039214\x05Never"}
  inputs:{key:"i0"  value:"n2"}  environment_id:"go"}  namespace:"OjdiVXtHXX"
expansion failed
        caused by:
java.lang.IllegalArgumentException: Triggering frequency or number of file shards can be specified only when writing an unbounded PCollection via FILE_LOADS or STORAGE_API_WRITES, but: the collection was BOUNDED and the method was STORAGE_WRITE_API
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2868)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1932)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:548)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:482)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:360)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider$BigQuerySchemaIO$2.expand(BigQuerySchemaIOProvider.java:227)
        at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider$BigQuerySchemaIO$2.expand(BigQuerySchemaIOProvider.java:196)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:548)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.expansion.service.ExpansionService$TransformProvider.apply(ExpansionService.java:396)
        at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:516)
        at org.apache.beam.sdk.expansion.service.ExpansionService.expand(ExpansionService.java:596)
        at org.apache.beam.model.expansion.v1.ExpansionServiceGrpc$MethodHandlers.invoke(ExpansionServiceGrpc.java:220)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:354)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:866)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)


goroutine 50 [running]:
testing.tRunner.func1.2({0xf3b0a0, 0xc000610440})
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1396 +0x24e
testing.tRunner.func1()
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1399 +0x39f
panic({0xf3b0a0, 0xc000610440})
        /home/jenkins/sdk/go1.19.3/src/runtime/panic.go:884 +0x212
github.com/apache/beam/sdks/v2/go/pkg/beam.CrossLanguage({0xc0001e04c0?, 0xc00013c000?}, {0x10ff82b, 0x39}, {0xc0001d2180?, 0xa91b40?, 0x1?}, {0xc00011ce60, 0xf}, 0xc000563d20, ...)
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/pkg/beam/xlang.go>:162 +0x136
github.com/apache/beam/sdks/v2/go/pkg/beam/io/xlang/bigqueryio.Write({0xc0001e0320?, 0xc00013c000?}, {0xc00004e480, 0x52}, {0x1?}, {0xc000563ea8, 0x2, 0x7fb39f04b4d0?})
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/pkg/beam/io/xlang/bigqueryio/bigquery.go>:247 +0x3d7
github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/bigquery.WritePipeline({0xc00011ce60, 0xf}, {0xc00004e480, 0x52}, {0xe8b640, 0xc000044110})
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/test/integration/io/xlang/bigquery/bigquery_test.go>:125 +0x21f
github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/bigquery.TestBigQueryIO_BasicWriteRead(0xc0005d7380)
        <https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/test/integration/io/xlang/bigquery/bigquery_test.go>:215 +0x1ec
testing.tRunner(0xc0005d7380, 0x112ec28)
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1446 +0x10b
created by testing.(*T).Run
        /home/jenkins/sdk/go1.19.3/src/testing/testing.go:1493 +0x35f
FAIL    github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/bigquery    21.971s
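For context on this second failure: unlike TestXLang_GroupBy above, the expansion service at localhost:35537 was reachable. The ExpansionRequest was delivered and then rejected on the Java side, where BigQueryIO.Write.expand refuses a triggering-frequency / file-shard setting for a BOUNDED input written with STORAGE_WRITE_API, exactly as the IllegalArgumentException states. On the Go side, such a request is assembled by schema-encoding a configuration struct into the transform payload and passing it to beam.CrossLanguage. The sketch below illustrates that shape, assuming the Go SDK's beam.CrossLanguagePayload helper; the struct name and fields are inferred from the location/config/dataSchema entries visible in the payload above and are illustrative, not the bigqueryio package's actual types.

package bqsketch

import (
    "github.com/apache/beam/sdks/v2/go/pkg/beam"
)

// bqWriteConfig mirrors the three fields visible in the ExpansionRequest payload
// above (location, config, dataSchema). Field names and types here are illustrative.
type bqWriteConfig struct {
    Location   string // fully qualified table, e.g. "project.dataset.table"
    Config     string // serialized SchemaIO configuration (dispositions and similar settings)
    DataSchema []byte // encoded Beam schema of the rows being written
}

// externalBigQueryWrite sketches how a cross-language BigQuery write request is
// assembled on the Go side before the Java expansion service expands it.
func externalBigQueryWrite(s beam.Scope, rows beam.PCollection, cfg bqWriteConfig, expansionAddr string) {
    payload := beam.CrossLanguagePayload(cfg) // schema-encodes cfg into the payload row seen in the request above
    beam.CrossLanguage(
        s,
        "beam:transform:org.apache.beam:schemaio_bigquery_write:v1",
        payload,
        expansionAddr, // the Java expansion service, localhost:35537 in this run
        beam.UnnamedInput(rows),
        nil, // the request above declares no output PCollections
    )
}

The setting the precondition complains about is applied during the Java-side expansion, inside BigQuerySchemaIOProvider and BigQueryIO.Write, which is why the stack trace above is entirely Java and the Go caller fails during pipeline construction, before anything is submitted.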
=== RUN   TestDebeziumIO_BasicRead
    integration.go:309: Test TestDebeziumIO_BasicRead is currently filtered for runner dataflow
--- SKIP: TestDebeziumIO_BasicRead (0.00s)
PASS
ok      github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/debezium    3.218s
=== RUN   TestJDBCIO_BasicReadWrite
    integration.go:309: Test TestJDBCIO_BasicReadWrite is currently filtered for runner dataflow
--- SKIP: TestJDBCIO_BasicReadWrite (0.00s)
=== RUN   TestJDBCIO_PostgresReadWrite
    integration.go:309: Test TestJDBCIO_PostgresReadWrite is currently filtered for runner dataflow
--- SKIP: TestJDBCIO_PostgresReadWrite (0.00s)
PASS
ok      github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/jdbc        3.144s
=== RUN   TestKafkaIO_BasicReadWrite
    integration.go:309: Test TestKafkaIO_BasicReadWrite is currently filtered for runner dataflow
--- SKIP: TestKafkaIO_BasicReadWrite (0.00s)
PASS
ok      github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/kafka       6.199s
FAIL
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Digests:
- us.gcr.io/apache-beam-testing/jenkins/beam_go_sdk@sha256:ceedd0a34f05eb0d56b67e289446ad5cec5ea938b264ffef570d8b28bcc4d2fb
  Associated tags:
 - 20221123-204910
Tags:
- us.gcr.io/apache-beam-testing/jenkins/beam_go_sdk:20221123-204910
Deleted [us.gcr.io/apache-beam-testing/jenkins/beam_go_sdk:20221123-204910].
Deleted [us.gcr.io/apache-beam-testing/jenkins/beam_go_sdk@sha256:ceedd0a34f05eb0d56b67e289446ad5cec5ea938b264ffef570d8b28bcc4d2fb].

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerGoUsingJava FAILED
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221123202611]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:de48974c7240df17d71b9c8807b9c7723b549ba8381605bc8093e8df01b745dc]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221123202611] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:de48974c7240df17d71b9c8807b9c7723b549ba8381605bc8093e8df01b745dc])].
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:de48974c7240df17d71b9c8807b9c7723b549ba8381605bc8093e8df01b745dc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:de48974c7240df17d71b9c8807b9c7723b549ba8381605bc8093e8df01b745dc].

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221123202611]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:645774e24c5c59e53fe3ee438477b5b11aeba4efe47e739652d617ff5df8bb97]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221123202611] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:645774e24c5c59e53fe3ee438477b5b11aeba4efe47e739652d617ff5df8bb97])].
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:645774e24c5c59e53fe3ee438477b5b11aeba4efe47e739652d617ff5df8bb97
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:645774e24c5c59e53fe3ee438477b5b11aeba4efe47e739652d617ff5df8bb97].

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_GoUsingJava_Dataflow/ws/src/sdks/go/test/build.gradle'> line: 195

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerGoUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 59m 28s
215 actionable tasks: 150 executed, 53 from cache, 12 up-to-date

Publishing build scan...
https://gradle.com/s/d5lxhjunan65w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
