See
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/4520/display/redirect?page=changes>
Changes:
[noreply] Fixing BigQueryIO request too big corner case for streaming inserts
------------------------------------------
[...truncated 49.07 MB...]
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
assert_that/Group/CoGroupByKeyImpl/GroupByKey -> [4]assert_that/{Group, Unkey,
Match} (14/16) (3bb25f4f48d0a164724705d9538ad40d) switched from SCHEDULED to
DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.executiongraph.Execution deploy'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Deploying assert_that/Group/CoGroupByKeyImpl/GroupByKey ->
[4]assert_that/{Group, Unkey, Match} (14/16) (attempt #0) with attempt id
3bb25f4f48d0a164724705d9538ad40d to 405b2f88-078a-4e86-88cc-86816d276be4 @
localhost (dataPort=-1) with allocation id 3aab6fc78fb4ad4e92018de683ee043e'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
assert_that/Group/CoGroupByKeyImpl/GroupByKey -> [4]assert_that/{Group, Unkey,
Match} (15/16) (a18a3282a8dc5f27024f6e001e70f321) switched from SCHEDULED to
DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.executiongraph.Execution deploy'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Deploying assert_that/Group/CoGroupByKeyImpl/GroupByKey ->
[4]assert_that/{Group, Unkey, Match} (15/16) (attempt #0) with attempt id
a18a3282a8dc5f27024f6e001e70f321 to 405b2f88-078a-4e86-88cc-86816d276be4 @
localhost (dataPort=-1) with allocation id b78160bdbbdab8729a742ae897347f32'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
assert_that/Group/CoGroupByKeyImpl/GroupByKey -> [4]assert_that/{Group, Unkey,
Match} (16/16) (8bd5286019bbe178ff69f45580bbb3d9) switched from SCHEDULED to
DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.executiongraph.Execution deploy'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Deploying assert_that/Group/CoGroupByKeyImpl/GroupByKey ->
[4]assert_that/{Group, Unkey, Match} (16/16) (attempt #0) with attempt id
8bd5286019bbe178ff69f45580bbb3d9 to 405b2f88-078a-4e86-88cc-86816d276be4 @
localhost (dataPort=-1) with allocation id b5a0ea18da6fde3216a14d7f2cfc419a'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (1/16)#0 (300055108587269007d99798f4d5cdbe),
deploy into slot with allocation id 1ab62d8f3702ad877102cb5cfd5c0708.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (1/16)#0 (300055108587269007d99798f4d5cdbe)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (1/16)#0 (300055108587269007d99798f4d5cdbe)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot d2566a442d7339b61144da38ffa1c7fd.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@3d2f5d59'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:42 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (2/16)#0 (21e95a2651ed02329fd4c5ae2b50cedb),
deploy into slot with allocation id d2566a442d7339b61144da38ffa1c7fd.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot 386bcbbe0f9707f519046bbd104111db.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@14ca105d'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (3/16)#0 (cf4c9b9b674bb6e86d6d45cbad8995ce),
deploy into slot with allocation id 386bcbbe0f9707f519046bbd104111db.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Source: Impulse (1/1)#0 (7319c919f66a4aed4afe9a6b2dce3dda) switched from
DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Source: Impulse (1/1)#0 (c3ee950a5723ca13f8e589bb6908b4ce) switched from
DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Source: Impulse (1/1) (c3ee950a5723ca13f8e589bb6908b4ce) switched from
DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Source: Impulse (1/1) (7319c919f66a4aed4afe9a6b2dce3dda) switched from
DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (2/16)#0 (21e95a2651ed02329fd4c5ae2b50cedb)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (2/16)#0 (21e95a2651ed02329fd4c5ae2b50cedb)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (3/16)#0 (cf4c9b9b674bb6e86d6d45cbad8995ce)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (3/16)#0 (cf4c9b9b674bb6e86d6d45cbad8995ce)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@4ff3cb03'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (1/16)#0 (300055108587269007d99798f4d5cdbe)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (1/16) (300055108587269007d99798f4d5cdbe)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot ece0b497cb01738411c662b2cabebdf1.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@7e81cc3e'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (2/16)#0 (21e95a2651ed02329fd4c5ae2b50cedb)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (2/16) (21e95a2651ed02329fd4c5ae2b50cedb)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (4/16)#0 (7a816d1cac69aa3743cec4cb60fe7e29),
deploy into slot with allocation id ece0b497cb01738411c662b2cabebdf1.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (4/16)#0 (7a816d1cac69aa3743cec4cb60fe7e29)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (4/16)#0 (7a816d1cac69aa3743cec4cb60fe7e29)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot 28099980801d104e825c1468b0231b5c.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (5/16)#0 (a903a3707ae19bea2900967d1ac59e65),
deploy into slot with allocation id 28099980801d104e825c1468b0231b5c.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (5/16)#0 (a903a3707ae19bea2900967d1ac59e65)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot be3e7ba8c6c81fff98d672706740f6f3.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (5/16)#0 (a903a3707ae19bea2900967d1ac59e65)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (6/16)#0 (4c846b4b0e8efd25bbb2c13945e34717),
deploy into slot with allocation id be3e7ba8c6c81fff98d672706740f6f3.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.metrics.groups.TaskMetricGroup
getOrAddOperator'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122
b'WARNING: The operator name [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} exceeded the 80 characters length limit and
was truncated.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@3deaa13d'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot 1f646536fad928fd1d095e94ce5c7372.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (6/16)#0 (4c846b4b0e8efd25bbb2c13945e34717)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (6/16)#0 (4c846b4b0e8efd25bbb2c13945e34717)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (4/16)#0 (7a816d1cac69aa3743cec4cb60fe7e29)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (4/16) (7a816d1cac69aa3743cec4cb60fe7e29)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (7/16)#0 (a7f72a80638024b963d1bd97112cc162),
deploy into slot with allocation id 1f646536fad928fd1d095e94ce5c7372.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory
createBuilder'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
The AWS S3 Beam extension was included in this build, but the awsRegion flag
was not specified. If you don't plan to use S3, then ignore this message."
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (7/16)#0 (a7f72a80638024b963d1bd97112cc162)
switched from CREATED to DEPLOYING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task doRun'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Loading JAR files for task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (7/16)#0 (a7f72a80638024b963d1bd97112cc162)
[DEPLOYING].'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@7dc2df7b'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (3/16)#0 (cf4c9b9b674bb6e86d6d45cbad8995ce)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (3/16) (cf4c9b9b674bb6e86d6d45cbad8995ce)
switched from DEPLOYING to INITIALIZING.'
> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
[gw6] PASSED
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/parquetio_it_test.py::TestParquetIT::test_parquetio_it
> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.StateBackendLoader
loadFromApplicationOrConfigOrDefaultInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
No state backend has been configured, using default (HashMap)
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@1ccc9b23'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b"INFO:
Checkpoint storage is set to 'jobmanager'"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskmanager.Task transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (5/16)#0 (a903a3707ae19bea2900967d1ac59e65)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.executiongraph.Execution
transitionState'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
[3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (5/16) (a903a3707ae19bea2900967d1ac59e65)
switched from DEPLOYING to INITIALIZING.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.metrics.groups.TaskMetricGroup
getOrAddOperator'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122
b'WARNING: The operator name [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} exceeded the 80 characters length limit and
was truncated.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl
markExistingSlotActive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Activate slot 56d173aec68e18737e074b9ae10653a0.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.taskexecutor.TaskExecutor submitTask'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'INFO:
Received task [3]Read from
debezium/ExternalTransform(beam:transform:org.apache.beam:debezium_read:v1)/Create.Values/Read(CreateSource)/{ParDo(OutputSingleSource),
ParDo(BoundedSourceAsSDFWrapper)} (8/16)#0 (bb28cf5fdee6f4ba722fee7747f400ac),
deploy into slot with allocation id 56d173aec68e18737e074b9ae10653a0.'
> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
INFO apache_beam.utils.subprocess_server:subprocess_server.py:122 b'Oct 25,
2021 6:29:43 AM org.apache.flink.runtime.state.CheckpointStorageLoader
createJobManagerCheckpointStorage'
java.lang.OutOfMemoryError: GC overhead limit exceeded
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]