See
<https://builds.apache.org/job/beam_PostCommit_Python37/1864/display/redirect?page=changes>
Changes:
[sunjincheng121] [BEAM-9295] Add Flink 1.10 build target and Make FlinkRunner compatible
------------------------------------------
[...truncated 619.02 KB...]
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 712, in pull_responses
    for response in responses:
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 416, in __next__
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 706, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1583952303.360974006","description":"Error received from peer ipv4:127.0.0.1:40455","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket closed","grpc_status":14}"
>
> Task :sdks:python:test-suites:portable:py37:postCommitPy37IT
>>> RUNNING integration tests with pipeline options: --runner=FlinkRunner --project=apache-beam-testing --environment_type=LOOPBACK --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> test options: --tests=apache_beam.io.gcp.bigquery_read_it_test
running nosetests
running egg_info
https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/setuptools/dist.py:453: UserWarning: Normalizing '2.21.0.dev' to '2.21.0.dev0'
  normalized_version,
INFO:gen_protos:Skipping proto regeneration: all files up to date
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
Level 5:avro.schema:Register new name for 'org.apache.avro.file.Header'
Level 5:avro.schema:Register new name for 'org.apache.avro.file.magic'
Level 5:avro.schema:Register new name for 'org.apache.avro.file.sync'
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
DEBUG:root:Connecting using Google Application Default Credentials.
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.io.gcp.bigquery_read_it_test:Created dataset python_read_table_15839523064621 in project apache-beam-testing
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ...
https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:275: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py:1606: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
DEBUG:root:gcs_location is empty, using temp_location instead
DEBUG:root:Unhandled type_constraint: Const[RemoveJsonFiles]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fcfac48fbf8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:19 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages:
['ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5\n
read/Read/_SDFBoundedSourceWrapper/Impulse:beam:transform:impulse:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6\n
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9\n
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11\n
read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at
core.py:2643>)_12\n read/_PassThroughThenCleanup/Create/FlatMap(<lambda at
core.py:2643>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14\n
read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15\n
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Impulse_18\n
assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2643>)_19\n assert_that/Create/FlatMap(<lambda at
core.py:2643>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n
assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n
assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must
follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/ToVoidKey_23\n
assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n
assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n
assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Flatten_27\n
assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n
assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_32\n
assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Unkey_33\n
assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>',
'ref_AppliedPTransform_assert_that/Match_34\n
assert_that/Match:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: <unknown>']
ERROR
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... SKIP: This test doesn't work on these runners: ['PortableRunner', 'FlinkRunner']
INFO:apache_beam.io.gcp.bigquery_read_it_test:Deleting dataset python_read_table_15839523064621 in project apache-beam-testing
INFO:apache_beam.io.gcp.bigquery_read_it_test:Created dataset python_read_table_15839523076910 in project apache-beam-testing
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ...
https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:162: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
DEBUG:root:gcs_location is empty, using temp_location instead
DEBUG:root:Unhandled type_constraint: Const[RemoveJsonFiles]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fcfac48fbf8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:19 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages:
[...stage list identical to the one above; elided...]
ERROR
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... SKIP: This test doesn't work on these runners: ['PortableRunner', 'FlinkRunner']
INFO:apache_beam.io.gcp.bigquery_read_it_test:Deleting dataset python_read_table_15839523076910 in project apache-beam-testing
======================================================================
ERROR: test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py", line 276, in test_iobase_source
    assert_that(result, equal_to(self.get_expected_data()))
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 522, in __exit__
    self.run().wait_until_finish()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 495, in run
    self._options).run(False)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/flink_runner.py", line 47, in run_pipeline
    return super(FlinkRunner, self).run_pipeline(pipeline, options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 386, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 293, in create_job_service
    return JobServiceHandle(server.start(), options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 86, in start
    self._endpoint = self._job_server.start()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 111, in start
    cmd, endpoint = self.subprocess_cmd_and_endpoint()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 156, in subprocess_cmd_and_endpoint
    jar_path = self.local_jar(self.path_to_jar())
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/flink_runner.py", line 95, in path_to_jar
    'runners:flink:%s:job-server:shadowJar' % self._flink_version)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 149, in path_to_beam_jar
    return subprocess_server.JavaJarServer.path_to_beam_jar(gradle_target)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/subprocess_server.py", line 186, in path_to_beam_jar
    (local_path, os.path.abspath(project_root), gradle_target))
RuntimeError: https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/runners/flink/1.9/job-server/build/libs/beam-runners-flink-1.9-job-server-2.21.0-SNAPSHOT.jar not found. Please build the server with
  cd https://builds.apache.org/job/beam_PostCommit_Python37/ws/src; ./gradlew runners:flink:1.9:job-server:shadowJar
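[Editor's note: the missing jar path and the suggested Gradle target above line up because the runner derives the shaded-jar location from the target name. The sketch below is a hypothetical illustration of that mapping, not Beam's actual path_to_beam_jar implementation; the version suffix is taken from the error message above.]

```python
def gradle_target_to_jar(target: str, version: str = "2.21.0-SNAPSHOT") -> str:
    """Illustrative mapping from a Gradle target to its shadowJar output path."""
    # Drop the task name (e.g. 'shadowJar'); the remaining segments are the
    # Gradle project path, which doubles as the directory path in the repo.
    parts = target.split(":")[:-1]        # ['runners', 'flink', '1.9', 'job-server']
    project_dir = "/".join(parts)
    # The artifact name is the project path joined with '-' and prefixed 'beam-'.
    artifact = "beam-" + "-".join(parts)  # beam-runners-flink-1.9-job-server
    return f"{project_dir}/build/libs/{artifact}-{version}.jar"
```

Applied to `runners:flink:1.9:job-server:shadowJar`, this yields `runners/flink/1.9/job-server/build/libs/beam-runners-flink-1.9-job-server-2.21.0-SNAPSHOT.jar`, exactly the file the RuntimeError reports missing, which is why running the quoted `./gradlew` command is the fix.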
-------------------- >> begin captured logging << --------------------
avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
apache_beam.internal.gcp.auth: INFO: Setting socket default timeout to 60 seconds.
apache_beam.internal.gcp.auth: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.gcp.bigquery_read_it_test: INFO: Created dataset python_read_table_15839523064621 in project apache-beam-testing
root: DEBUG: gcs_location is empty, using temp_location instead
root: DEBUG: Unhandled type_constraint: Const[RemoveJsonFiles]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function lift_combiners at 0x7fcfac48fbf8> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 19 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages:
[...stage list identical to the one above; elided...]
--------------------- >> end captured logging << ---------------------
======================================================================
ERROR: test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py", line 163, in test_iobase_source
    assert_that(result, equal_to(self.TABLE_DATA))
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 522, in __exit__
    self.run().wait_until_finish()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 495, in run
    self._options).run(False)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/flink_runner.py", line 47, in run_pipeline
    return super(FlinkRunner, self).run_pipeline(pipeline, options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 386, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 293, in create_job_service
    return JobServiceHandle(server.start(), options)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 86, in start
    self._endpoint = self._job_server.start()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 111, in start
    cmd, endpoint = self.subprocess_cmd_and_endpoint()
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 156, in subprocess_cmd_and_endpoint
    jar_path = self.local_jar(self.path_to_jar())
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/flink_runner.py", line 95, in path_to_jar
    'runners:flink:%s:job-server:shadowJar' % self._flink_version)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py", line 149, in path_to_beam_jar
    return subprocess_server.JavaJarServer.path_to_beam_jar(gradle_target)
  File "https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/subprocess_server.py", line 186, in path_to_beam_jar
    (local_path, os.path.abspath(project_root), gradle_target))
RuntimeError: https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/runners/flink/1.9/job-server/build/libs/beam-runners-flink-1.9-job-server-2.21.0-SNAPSHOT.jar not found. Please build the server with
  cd https://builds.apache.org/job/beam_PostCommit_Python37/ws/src; ./gradlew runners:flink:1.9:job-server:shadowJar
-------------------- >> begin captured logging << --------------------
apache_beam.io.gcp.bigquery_read_it_test: INFO: Deleting dataset python_read_table_15839523064621 in project apache-beam-testing
apache_beam.io.gcp.bigquery_read_it_test: INFO: Created dataset python_read_table_15839523076910 in project apache-beam-testing
root: DEBUG: gcs_location is empty, using temp_location instead
root: DEBUG: Unhandled type_constraint: Const[RemoveJsonFiles]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function lift_combiners at 0x7fcfac48fbf8> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 19 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages:
[...stage list identical to the one above; elided...]
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-flink-py37.xml
----------------------------------------------------------------------
XML: https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 4 tests in 3.576s
FAILED (SKIP=2, errors=2)
> Task :sdks:python:test-suites:portable:py37:postCommitPy37IT FAILED
FAILURE: Build completed with 4 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle' line: 60
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/py37/build.gradle' line: 62
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2m 45s
85 actionable tasks: 63 executed, 22 from cache
Publishing build scan...
https://gradle.com/s/qpa7xmxrjcpfq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]