See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/374/display/redirect?page=changes>
Changes:
[noreply] Fix cleanUpDockerJavaImages fail (#27287)
------------------------------------------
[...truncated 563.79 KB...]
3cf8eb34c5a278dd80885833c9eb6383c9fcfcc1c0d8e33ab663aecf996e316d
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:188 Deleting table
[test-table-1687984944-93d277]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation
-------------------------------- live log call ---------------------------------
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:184 Created table
[test-table-1687984961-713f8b]
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function annotate_downstream_side_inputs at
0x7f3d1aca5d00> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function fix_side_input_pcoll_coders at 0x7f3d1aca5e40>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function pack_combiners at 0x7f3d1aca63e0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function lift_combiners at 0x7f3d1aca6480>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function expand_sdf at 0x7f3d1aca6660>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function expand_gbk at 0x7f3d1aca6700>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function sink_flattens at 0x7f3d1aca6840>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function greedily_fuse at 0x7f3d1aca68e0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function read_to_impulse at 0x7f3d1aca6980>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function impulse_to_input at 0x7f3d1aca6a20>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function sort_stages at 0x7f3d1aca6ca0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function add_impulse_to_dangling_transforms at
0x7f3d1aca6de0> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function setup_timer_mapping at 0x7f3d1aca6c00>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function populate_data_channel_coders at 0x7f3d1aca6d40>
====================
INFO apache_beam.runners.worker.statecache:statecache.py:234
Creating state cache with size 104857600
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:903
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler
object at 0x7f3ceaa41fd0> for environment
ref_Environment_default_environment_2 (beam:env:embedded_python:v1, b'')
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:510
starting control server on port 40015
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:511
starting data server on port 43651
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:512
starting state server on port 35699
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:513
starting logging server on port 34883
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:903
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f3cebdfb890> for environment external_15beam:env:docker:v1
(beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.49.0.dev')
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:764
Attempting to pull image apache/beam_java8_sdk:2.49.0.dev
Error response from daemon: manifest for apache/beam_java8_sdk:2.49.0.dev not
found: manifest unknown: manifest unknown
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:767
Unable to pull image apache/beam_java8_sdk:2.49.0.dev, defaulting to local
image if it exists
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:788
Waiting for docker to start up. Current status is running
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:792
Docker container is running. container_id =
b'742c1b1c5a6fe46a78c6dce99e78a9b7e2c5f18d068f0ea425ad75f495b035bd', worker_id
= worker_23
INFO root:worker_handlers.py:415 severity: INFO
timestamp {
seconds: 1687984971
nanos: 787000000
}
message: "Fn Harness started"
log_location: "org.apache.beam.fn.harness.FnHarness"
thread: "1"
INFO root:worker_handlers.py:415 severity: INFO
timestamp {
seconds: 1687984971
nanos: 971000000
}
message: "Entering instruction processing loop"
log_location: "org.apache.beam.fn.harness.FnHarness"
thread: "1"
INFO root:worker_handlers.py:415 severity: INFO
timestamp {
seconds: 1687984973
nanos: 344000000
}
message: "Started Bigtable service with settings
BigtableDataSettings{stubSettings=EnhancedBigtableStubSettings{projectId=apache-beam-testing,
instanceId=bt-write-xlang-1687984907-c93242, appProfileId=,
isRefreshingChannel=true, primedTableIds=[],
jwtAudienceMapping={batch-bigtable.googleapis.com=https://bigtable.googleapis.com/},
readRowsSettings=ServerStreamingCallSettings{idleTimeout=PT5M,
waitTimeout=PT5M, retryableCodes=[DEADLINE_EXCEEDED, UNAVAILABLE, ABORTED],
retrySettings=RetrySettings{totalTimeout=PT12H, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=10, jittered=true,
initialRpcTimeout=PT30M, rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT30M}},
readRowSettings=UnaryCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE, ABORTED], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=10, jittered=true, initialRpcTimeout=PT30M,
rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT30M}},
sampleRowKeysSettings=UnaryCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT20S,
rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
mutateRowSettings=UnaryCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT1M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT1M, rpcTimeoutMultiplier=1.0,
maxRpcTimeout=PT1M}},
bulkMutateRowsSettings=BigtableBatchingCallSettings{batchingCallSettings=BatchingCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT6M, rpcTimeoutMultiplier=1.0,
maxRpcTimeout=PT6M},
batchingSettings=BatchingSettings{elementCountThreshold=100,
requestByteThreshold=20971520, delayThreshold=PT1S, isEnabled=true,
flowControlSettings=FlowControlSettings{maxOutstandingElementCount=20000,
maxOutstandingRequestBytes=104857600, limitExceededBehavior=Block}}},
isLatencyBasedThrottlingEnabled=false, targetRpcLatency=null,
dynamicFlowControlSettings=DynamicFlowControlSettings{initialOutstandingElementCount=20000,
initialOutstandingRequestBytes=104857600, maxOutstandingElementCount=20000,
maxOutstandingRequestBytes=104857600, minOutstandingElementCount=20000,
minOutstandingRequestBytes=104857600, limitExceededBehavior=Block},
isServerInitiatedFlowControlEnabled=false},
bulkReadRowsSettings=BigtableBulkReadRowsCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE, ABORTED], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT20S,
rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
checkAndMutateRowSettings=UnaryCallSettings{retryableCodes=[],
retrySettings=RetrySettings{totalTimeout=PT20S, initialRetryDelay=PT0S,
retryDelayMultiplier=1.0, maxRetryDelay=PT0S, maxAttempts=0, jittered=true,
initialRpcTimeout=PT20S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
readModifyWriteRowSettings=UnaryCallSettings{retryableCodes=[],
retrySettings=RetrySettings{totalTimeout=PT20S, initialRetryDelay=PT0S,
retryDelayMultiplier=1.0, maxRetryDelay=PT0S, maxAttempts=0, jittered=true,
initialRpcTimeout=PT20S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
generateInitialChangeStreamPartitionsSettings=ServerStreamingCallSettings{idleTimeout=PT5M,
waitTimeout=PT0S, retryableCodes=[DEADLINE_EXCEEDED, UNAVAILABLE, ABORTED],
retrySettings=RetrySettings{totalTimeout=PT1H, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=10, jittered=true,
initialRpcTimeout=PT1M, rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT10M}},
readChangeStreamSettings=ServerStreamingCallSettings{idleTimeout=PT5M,
waitTimeout=PT0S, retryableCodes=[DEADLINE_EXCEEDED, UNAVAILABLE, ABORTED],
retrySettings=RetrySettings{totalTimeout=PT12H, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=10, jittered=true,
initialRpcTimeout=PT5M, rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT5M}},
pingAndWarmSettings=UnaryCallSettings{retryableCodes=[],
retrySettings=RetrySettings{totalTimeout=PT30S, initialRetryDelay=PT0S,
retryDelayMultiplier=1.0, maxRetryDelay=PT0S, maxAttempts=1, jittered=true,
initialRpcTimeout=PT30S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT30S}},
parent=EnhancedBigtableStubSettings{backgroundExecutorProvider=InstantiatingExecutorProvider{executorThreadCount=16,
threadFactory=com.google.api.gax.core.InstantiatingExecutorProvider$1@6418e267},
transportChannelProvider=com.google.api.gax.grpc.InstantiatingGrpcChannelProvider@3210286,
credentialsProvider=FixedCredentialsProvider{credentials=ComputeEngineCredentials{transportFactoryClassName=com.google.auth.oauth2.OAuth2Utils$DefaultHttpTransportFactory}},
headerProvider=FixedHeaderProvider{headers={user-agent=Apache_Beam_SDK_for_Java/2.49.0-SNAPSHOT}},
internalHeaderProvider=FixedHeaderProvider{headers={x-goog-api-client=gl-java/1.8.0_372
gapic/ gax/2.29.0 grpc/, user-agent=bigtable-java/2.23.3,
bigtable-features=}}, clock=com.google.api.core.NanoClock@6bdfae79,
endpoint=bigtable.googleapis.com:443,
mtlsEndpoint=bigtable.mtls.googleapis.com:443,
switchToMtlsEndpointAllowed=false, quotaProjectId=null,
streamWatchdogProvider=com.google.api.gax.rpc.InstantiatingWatchdogProvider@5eae81e9,
streamWatchdogCheckInterval=PT10S,
tracerFactory=com.google.api.gax.tracing.BaseApiTracerFactory@e0a83c7}}}"
instruction_id: "bundle_61"
transform_id:
"WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)"
log_location: "org.apache.beam.sdk.io.gcp.bigtable.BigtableServiceImpl"
thread: "30"
WARNING root:worker_handlers.py:415 severity: WARN
timestamp {
seconds: 1687984975
nanos: 444000000
}
message: "Reporting metrics are not supported in the current execution
environment."
log_location: "org.apache.beam.sdk.metrics.MetricsEnvironment"
thread: "1"
WARNING root:worker_handlers.py:415 severity: WARN
timestamp {
seconds: 1687984975
nanos: 448000000
}
message: "Hanged up for url: \"localhost:43651\"\n."
log_location: "org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer"
thread: "22"
742c1b1c5a6fe46a78c6dce99e78a9b7e2c5f18d068f0ea425ad75f495b035bd
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:188 Deleting table
[test-table-1687984961-713f8b]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
-------------------------------- live log call ---------------------------------
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:184 Created table
[test-table-1687984976-0a4769]
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function annotate_downstream_side_inputs at
0x7f3d1aca5d00> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function fix_side_input_pcoll_coders at 0x7f3d1aca5e40>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function pack_combiners at 0x7f3d1aca63e0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function lift_combiners at 0x7f3d1aca6480>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function expand_sdf at 0x7f3d1aca6660>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function expand_gbk at 0x7f3d1aca6700>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function sink_flattens at 0x7f3d1aca6840>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function greedily_fuse at 0x7f3d1aca68e0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function read_to_impulse at 0x7f3d1aca6980>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function impulse_to_input at 0x7f3d1aca6a20>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function sort_stages at 0x7f3d1aca6ca0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function add_impulse_to_dangling_transforms at
0x7f3d1aca6de0> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function setup_timer_mapping at 0x7f3d1aca6c00>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function populate_data_channel_coders at 0x7f3d1aca6d40>
====================
INFO apache_beam.runners.worker.statecache:statecache.py:234
Creating state cache with size 104857600
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:903
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler
object at 0x7f3ceb2395d0> for environment
ref_Environment_default_environment_2 (beam:env:embedded_python:v1, b'')
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:510
starting control server on port 46181
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:511
starting data server on port 35177
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:512
starting state server on port 38915
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:513
starting logging server on port 45747
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:903
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
object at 0x7f3ceb2da0d0> for environment external_16beam:env:docker:v1
(beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.49.0.dev')
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:764
Attempting to pull image apache/beam_java8_sdk:2.49.0.dev
Error response from daemon: manifest for apache/beam_java8_sdk:2.49.0.dev not
found: manifest unknown: manifest unknown
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:767
Unable to pull image apache/beam_java8_sdk:2.49.0.dev, defaulting to local
image if it exists
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:788
Waiting for docker to start up. Current status is running
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:792
Docker container is running. container_id =
b'6837b77bb84ea59568d5ab87bb2a3890d21086e7d74ef7dd029901ad7361d6c8', worker_id
= worker_25
INFO root:worker_handlers.py:415 severity: INFO
timestamp {
seconds: 1687984986
nanos: 569000000
}
message: "Fn Harness started"
log_location: "org.apache.beam.fn.harness.FnHarness"
thread: "1"
INFO root:worker_handlers.py:415 severity: INFO
timestamp {
seconds: 1687984986
nanos: 753000000
}
message: "Entering instruction processing loop"
log_location: "org.apache.beam.fn.harness.FnHarness"
thread: "1"
INFO root:worker_handlers.py:415 severity: INFO
timestamp {
seconds: 1687984988
nanos: 203000000
}
message: "Started Bigtable service with settings
BigtableDataSettings{stubSettings=EnhancedBigtableStubSettings{projectId=apache-beam-testing,
instanceId=bt-write-xlang-1687984907-c93242, appProfileId=,
isRefreshingChannel=true, primedTableIds=[],
jwtAudienceMapping={batch-bigtable.googleapis.com=https://bigtable.googleapis.com/},
readRowsSettings=ServerStreamingCallSettings{idleTimeout=PT5M,
waitTimeout=PT5M, retryableCodes=[ABORTED, DEADLINE_EXCEEDED, UNAVAILABLE],
retrySettings=RetrySettings{totalTimeout=PT12H, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=10, jittered=true,
initialRpcTimeout=PT30M, rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT30M}},
readRowSettings=UnaryCallSettings{retryableCodes=[ABORTED, DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=10, jittered=true, initialRpcTimeout=PT30M,
rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT30M}},
sampleRowKeysSettings=UnaryCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT20S,
rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
mutateRowSettings=UnaryCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT1M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT1M, rpcTimeoutMultiplier=1.0,
maxRpcTimeout=PT1M}},
bulkMutateRowsSettings=BigtableBatchingCallSettings{batchingCallSettings=BatchingCallSettings{retryableCodes=[DEADLINE_EXCEEDED,
UNAVAILABLE], retrySettings=RetrySettings{totalTimeout=PT10M,
initialRetryDelay=PT0.01S, retryDelayMultiplier=2.0, maxRetryDelay=PT1M,
maxAttempts=0, jittered=true, initialRpcTimeout=PT6M, rpcTimeoutMultiplier=1.0,
maxRpcTimeout=PT6M},
batchingSettings=BatchingSettings{elementCountThreshold=100,
requestByteThreshold=20971520, delayThreshold=PT1S, isEnabled=true,
flowControlSettings=FlowControlSettings{maxOutstandingElementCount=20000,
maxOutstandingRequestBytes=104857600, limitExceededBehavior=Block}}},
isLatencyBasedThrottlingEnabled=false, targetRpcLatency=null,
dynamicFlowControlSettings=DynamicFlowControlSettings{initialOutstandingElementCount=20000,
initialOutstandingRequestBytes=104857600, maxOutstandingElementCount=20000,
maxOutstandingRequestBytes=104857600, minOutstandingElementCount=20000,
minOutstandingRequestBytes=104857600, limitExceededBehavior=Block},
isServerInitiatedFlowControlEnabled=false},
bulkReadRowsSettings=BigtableBulkReadRowsCallSettings{retryableCodes=[ABORTED,
DEADLINE_EXCEEDED, UNAVAILABLE],
retrySettings=RetrySettings{totalTimeout=PT10M, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=0, jittered=true,
initialRpcTimeout=PT20S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
checkAndMutateRowSettings=UnaryCallSettings{retryableCodes=[],
retrySettings=RetrySettings{totalTimeout=PT20S, initialRetryDelay=PT0S,
retryDelayMultiplier=1.0, maxRetryDelay=PT0S, maxAttempts=0, jittered=true,
initialRpcTimeout=PT20S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
readModifyWriteRowSettings=UnaryCallSettings{retryableCodes=[],
retrySettings=RetrySettings{totalTimeout=PT20S, initialRetryDelay=PT0S,
retryDelayMultiplier=1.0, maxRetryDelay=PT0S, maxAttempts=0, jittered=true,
initialRpcTimeout=PT20S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT20S}},
generateInitialChangeStreamPartitionsSettings=ServerStreamingCallSettings{idleTimeout=PT5M,
waitTimeout=PT0S, retryableCodes=[ABORTED, DEADLINE_EXCEEDED, UNAVAILABLE],
retrySettings=RetrySettings{totalTimeout=PT1H, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=10, jittered=true,
initialRpcTimeout=PT1M, rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT10M}},
readChangeStreamSettings=ServerStreamingCallSettings{idleTimeout=PT5M,
waitTimeout=PT0S, retryableCodes=[ABORTED, DEADLINE_EXCEEDED, UNAVAILABLE],
retrySettings=RetrySettings{totalTimeout=PT12H, initialRetryDelay=PT0.01S,
retryDelayMultiplier=2.0, maxRetryDelay=PT1M, maxAttempts=10, jittered=true,
initialRpcTimeout=PT5M, rpcTimeoutMultiplier=2.0, maxRpcTimeout=PT5M}},
pingAndWarmSettings=UnaryCallSettings{retryableCodes=[],
retrySettings=RetrySettings{totalTimeout=PT30S, initialRetryDelay=PT0S,
retryDelayMultiplier=1.0, maxRetryDelay=PT0S, maxAttempts=1, jittered=true,
initialRpcTimeout=PT30S, rpcTimeoutMultiplier=1.0, maxRpcTimeout=PT30S}},
parent=EnhancedBigtableStubSettings{backgroundExecutorProvider=InstantiatingExecutorProvider{executorThreadCount=16,
threadFactory=com.google.api.gax.core.InstantiatingExecutorProvider$1@5fbd9c22},
transportChannelProvider=com.google.api.gax.grpc.InstantiatingGrpcChannelProvider@2b689b9d,
credentialsProvider=FixedCredentialsProvider{credentials=ComputeEngineCredentials{transportFactoryClassName=com.google.auth.oauth2.OAuth2Utils$DefaultHttpTransportFactory}},
headerProvider=FixedHeaderProvider{headers={user-agent=Apache_Beam_SDK_for_Java/2.49.0-SNAPSHOT}},
internalHeaderProvider=FixedHeaderProvider{headers={x-goog-api-client=gl-java/1.8.0_372
gapic/ gax/2.29.0 grpc/, user-agent=bigtable-java/2.23.3,
bigtable-features=}}, clock=com.google.api.core.NanoClock@72e035e1,
endpoint=bigtable.googleapis.com:443,
mtlsEndpoint=bigtable.mtls.googleapis.com:443,
switchToMtlsEndpointAllowed=false, quotaProjectId=null,
streamWatchdogProvider=com.google.api.gax.rpc.InstantiatingWatchdogProvider@2cf34fce,
streamWatchdogCheckInterval=PT10S,
tracerFactory=com.google.api.gax.tracing.BaseApiTracerFactory@3f863d85}}}"
instruction_id: "bundle_64"
transform_id:
"WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)"
log_location: "org.apache.beam.sdk.io.gcp.bigtable.BigtableServiceImpl"
thread: "30"
WARNING root:worker_handlers.py:415 severity: WARN
timestamp {
seconds: 1687984990
nanos: 318000000
}
message: "Reporting metrics are not supported in the current execution
environment."
log_location: "org.apache.beam.sdk.metrics.MetricsEnvironment"
thread: "1"
WARNING root:worker_handlers.py:415 severity: WARN
timestamp {
seconds: 1687984990
nanos: 321000000
}
message: "Hanged up for url: \"localhost:35177\"\n."
log_location: "org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer"
thread: "22"
6837b77bb84ea59568d5ab87bb2a3890d21086e7d74ef7dd029901ad7361d6c8
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:188 Deleting table
[test-table-1687984976-0a4769]
PASSED
------------------------------ live log teardown -------------------------------
INFO
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:196 Deleting
instance [bt-write-xlang-1687984907-c93242]
=============================== warnings summary ===============================
../../build/gradleenv/417525524/lib/python3.11/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/build/gradleenv/417525524/lib/python3.11/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib and
slated for removal in Python 3.12; see the module's documentation for
alternative uses
from imp import load_source
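The warning above flags `hdfs/config.py`'s use of `imp.load_source`. The `importlib`-based equivalent the warning points to can be sketched as follows; the helper name mirrors the old API, and the example module is illustrative, not taken from the log:

```python
import importlib.util
import os
import tempfile

def load_source(name, path):
    # importlib-based stand-in for the deprecated imp.load_source:
    # build a spec from the file location, create the module object,
    # then execute the module's code in it.
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```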
../../build/gradleenv/417525524/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/build/gradleenv/417525524/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17:
DeprecationWarning: The distutils package is deprecated and slated for removal
in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
from distutils import util
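The `distutils` deprecation above (PEP 632) has no single drop-in replacement, and the log does not show which `util` function the client uses. For the common `distutils.util.strtobool` case, a minimal local substitute looks like this (returning `bool` rather than distutils's 0/1):

```python
def strtobool(val: str) -> bool:
    # Local replacement for distutils.util.strtobool (removed along with
    # distutils in Python 3.12); accepts the same truth-value strings.
    val = val.lower()
    if val in ('y', 'yes', 't', 'true', 'on', '1'):
        return True
    if val in ('n', 'no', 'f', 'false', 'off', '0'):
        return False
    raise ValueError(f'invalid truth value {val!r}')
```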
apache_beam/typehints/pandas_type_compatibility_test.py:67
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
}).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(475, 575), name='another_index'),
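The three `pandas.Int64Index` warnings above all suggest the same fix, which the warning text itself spells out: construct a plain `pd.Index` with an explicit dtype. A minimal sketch using the index arguments shown in the test file:

```python
import pandas as pd

# Deprecated: pd.Int64Index(range(123, 223), name='an_index')
# Replacement per the FutureWarning: pd.Index with an explicit dtype.
an_index = pd.Index(range(123, 223), dtype='int64', name='an_index')
another_index = pd.Index(range(475, 575), dtype='int64', name='another_index')
```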
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2028:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2034:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/io/external/xlang_bigqueryio_it_test.py: 10 warnings
apache_beam/io/gcp/bigtableio_it_test.py: 6 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/apache_beam/transforms/external.py>:676:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self._expansion_service, pipeline.options)
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Direct/ws/src/sdks/python/pytest_gcpCrossLanguage.xml>
-
=== 13 passed, 9 skipped, 6960 deselected, 33 warnings in 292.50s (0:04:52) ====
> Task :sdks:python:test-suites:direct:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 2088521.
Skipping invalid pid: 2088522.
> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 2088429
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:direct:py38:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 6m 52s
115 actionable tasks: 79 executed, 32 from cache, 4 up-to-date
Publishing build scan...
https://ge.apache.org/s/cmjvk6putr6ri
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]