See 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/103/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7660] Create Python ParDo load test job on Flink

[mxm] [BEAM-8157] Introduce encode_nested method on Python SDK Coder

[mxm] [BEAM-8157] Ensure key encoding for state requests is consistent across

[kamil.wasilewski] [BEAM-7660] Parameter names changes

[bhulette] replace unnecessary identity calls in sql tests

[mxm] [BEAM-7962] Drop support for Flink 1.5 and 1.6

[mxm] [BEAM-7962] Update version compatibility section on Flink Runner page

[iemejia] [BEAM-8299] Upgrade Jackson to version 2.9.10

[zyichi] Remove several supported dataflow python streaming items from

[chamikara] Sets workerHarnessContainerImage as the containerImage of the

[aaltay] [BEAM-8160] Add FnApi execution mode instruction (#9628)

[michal.walenia] [BEAM-8256] Set fixed number of workers for file-based IOITs

------------------------------------------
[...truncated 98.18 KB...]
  Using cached 
https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting funcsigs<2,>=1.0.2 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Requirement already satisfied: futures<4.0.0,>=3.2.0 in 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from apache-beam==2.17.0.dev0) (3.3.0)
Collecting pyvcf<0.7.0,>=0.6.8 (from apache-beam==2.17.0.dev0)
Collecting typing<3.7.0,>=3.6.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/cc/3e/29f92b7aeda5b078c86d14f550bf85cff809042e3429ace7af6193c3bc9f/typing-3.6.6-py2-none-any.whl
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/24/54/3c6225f1ca70351338075af3a3aa3119f2f6c8175989b62eb759cc4a9e5b/pyarrow-0.14.1-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/07/5e/3e04cb66f5ced9267a854184bb09863d85d199646ea8480fee26b4313a00/google_apitools-0.5.28-py2-none-any.whl
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/45/38/49841ce316540f58cbd80b374a16a3d956ddd372792ad0f271ea18676dd9/google_cloud_pubsub-1.0.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from 
apache-beam==2.17.0.dev0)
Collecting googledatastore<7.1,>=7.0.1 (from apache-beam==2.17.0.dev0)
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/99/4f/13fb671119e65c4dce97c60e67d3fd9e6f7f809f2b307e2611f4701205cb/nose-1.3.7-py2-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d7/b1/3367ea1f372957f97a6752ec725b87886e12af1415216feec9067e31df70/numpy-1.16.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/db/83/7d4008ffc2988066ff37f6a0bb6d7b60822367dcb36ba5e39aa7801fda54/pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.1.6)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from 
oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from 
google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from 
google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from 
google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from 
google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from 
proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.17.0.dev0)
Collecting monotonic>=0.6; python_version == "2.7" (from 
tenacity<6.0,>=5.0.2->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/81/b7/cef47224900ca67078ed6e2db51342796007433ad38329558f56a15255f5/urllib3-1.25.5-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from 
google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, docopt, urllib3, 
certifi, chardet, idna, requests, hdfs, httplib2, pbr, funcsigs, mock, pymongo, 
pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, 
pytz, pyyaml, avro, pyvcf, typing, numpy, pyarrow, cachetools, monotonic, 
fasteners, google-apitools, googleapis-common-protos, google-auth, 
google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, 
google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, 
google-cloud-bigtable, proto-google-cloud-datastore-v1, googledatastore, nose, 
nose-xunitmp, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-1.9.1 cachetools-3.1.1 
certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 
fastavro-0.21.24 fasteners-0.15 funcsigs-1.0.2 future-0.17.1 
google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 
google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 
google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.0 
google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 
googledatastore-7.0.2 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 
idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.16.5 
oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 
proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.14.1 pyasn1-0.4.7 
pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 
python-dateutil-2.8.0 pytz-2019.2 pyvcf-0.6.8 pyyaml-3.13 requests-2.22.0 
rsa-4.0 tenacity-5.1.1 typing-3.6.6 urllib3-1.25.5

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:185: UserWarning: You are using Apache Beam with Python 2. New 
releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:474:
 UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testGroupByKey 
(apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ERROR

======================================================================
ERROR: testGroupByKey 
(apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 70, in tearDown
    result = self.pipeline.run()
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    state = result.wait_until_finish()
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 439, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
load_tests_Python_Flink_Batch_GBK_4_0924112020_b4c453ee-0404-4dd0-9757-03b8d5335df8
 failed in state FAILED: java.io.IOException: Connection reset by peer
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: INFO: Metrics will not be collected
root: INFO: ==================== <function lift_combiners at 0x7ff43f172b90> 
====================
root: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read/Impulse_3\n  
Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Split_4\n  
Read/Split:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Read/Reshuffle/AddRandomKeys_6\n  
Read/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n 
 Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/GroupByKey_9\n  
Read/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n
  
Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/RemoveRandomKeys_14\n  
Read/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/ReadSplits_15\n 
 Read/ReadSplits:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: 
Start_16\n  Measure time: Start:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 0_17\n  
GroupByKey 0:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 0_21\n  
Ungroup 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 0_22\n  Measure time: End 
0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_GroupByKey 1_23\n  GroupByKey 
1:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Ungroup 1_27\n  Ungroup 
1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 1_28\n  Measure time: End 
1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_GroupByKey 2_29\n  GroupByKey 
2:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Ungroup 2_33\n  Ungroup 
2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 2_34\n  Measure time: End 
2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_GroupByKey 3_35\n  GroupByKey 
3:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Ungroup 3_39\n  Ungroup 
3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 3_40\n  Measure time: End 
3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>']
root: INFO: ==================== <function expand_sdf at 0x7ff43f172c08> 
====================
root: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read/Impulse_3\n  
Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Split_4\n  
Read/Split:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Read/Reshuffle/AddRandomKeys_6\n  
Read/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n 
 Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/GroupByKey_9\n  
Read/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n
  
Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read/Reshuffle/RemoveRandomKeys_14\n  
Read/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/ReadSplits_15\n 
 Read/ReadSplits:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: 
Start_16\n  Measure time: Start:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 0_17\n  
GroupByKey 0:beam:transform:group_by_key:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 0_21\n  
Ungroup 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 0_22\n  Measure time: End 
0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_GroupByKey 1_23\n  GroupByKey 
1:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Ungroup 1_27\n  Ungroup 
1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 1_28\n  Measure time: End 
1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_GroupByKey 2_29\n  GroupByKey 
2:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Ungroup 2_33\n  Ungroup 
2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 2_34\n  Measure time: End 
2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_GroupByKey 3_35\n  GroupByKey 
3:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Ungroup 3_39\n  Ungroup 
3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'ref_AppliedPTransform_Measure time: End 3_40\n  Measure time: End 
3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>']
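
For reference, the stage names in the dumps above describe a fanout-style GBK
load test: one synthetic Read feeding four parallel GroupByKey/Ungroup/Measure
branches. A minimal sketch of that shape (illustrative only; Create stands in
for the synthetic source, and the Measure time steps are omitted):

    import apache_beam as beam

    with beam.Pipeline() as p:
        # Stand-in for the synthetic Read in the real load test.
        source = p | 'Read' >> beam.Create([(b'key', b'value')])
        for i in range(4):  # --fanout=4
            (source
             | 'GroupByKey %d' % i >> beam.GroupByKey()
             | 'Ungroup %d' % i >> beam.FlatMap(
                 lambda kv: ((kv[0], v) for v in kv[1])))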
root: DEBUG: Runner option 'job_name' was already added
root: DEBUG: Runner option 'runner' was already added
root: DEBUG: Runner option 'temp_location' was already added
root: DEBUG: Runner option 'streaming' was already added
root: DEBUG: Runner option 'dataflow_kms_key' was already added
root: DEBUG: Runner option 'enable_streaming_engine' was already added
root: DEBUG: Runner option 'project' was already added
root: DEBUG: Runner option 'zone' was already added
root: DEBUG: Runner option 'environment_cache_millis' was already added
root: DEBUG: Runner option 'files_to_stage' was already added
root: DEBUG: Runner option 'job_endpoint' was already added
root: DEBUG: Runner option 'sdk_worker_parallelism' was already added
root: DEBUG: Runner option 'experiments' was already added
root: WARNING: Discarding unparseable args: ['--publish_to_big_query=false', 
'--metrics_dataset=load_test', '--metrics_table=python_flink_batch_GBK_4', 
'--input_options={"num_records": 5000000,"key_size": 10,"value_size":90}', 
'--iterations=1', '--fanout=4']
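The warning above is benign: these flags are the load test's own parameters,
which the runner's option parser does not recognize and the test harness reads
separately. A hypothetical sketch of such parsing (not the harness's actual
code):

    import argparse, json

    parser = argparse.ArgumentParser()
    parser.add_argument('--input_options', type=json.loads)
    parser.add_argument('--iterations', type=int, default=1)
    parser.add_argument('--fanout', type=int, default=1)
    # parse_known_args ignores any runner-level flags it does not know.
    known, _ = parser.parse_known_args([
        '--input_options={"num_records": 5000000,"key_size": 10,"value_size":90}',
        '--iterations=1', '--fanout=4'])
    print(known.input_options['num_records'])  # 5000000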
root: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job 
failed. (JobID: 94febac56265a7dfd352dfe6e8fe68b3)
        at 
org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:268)
        at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:487)
        at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:475)
        at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:450)
        at 
org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:210)
        at 
org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:187)
        at 
org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
        at 
org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
        at 
org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:104)
        at 
org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:80)
        at 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:78)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution 
failed.
        at 
org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
        at 
org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265)
        ... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce 
(GroupReduce at GroupByKey 1)' , caused an error: Error obtaining the sorted 
input: Thread 'SortMerger Reading Thread' terminated due to an exception: Lost 
connection to task manager 
'beam-loadtests-python-gbk-flink-batch-103-w-7.c.apache-beam-testing.internal/10.128.2.117:43555'.
 This indicates that the remote task manager was lost.
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:479)
        at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
        ... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 
'SortMerger Reading Thread' terminated due to an exception: Lost connection to 
task manager 
'beam-loadtests-python-gbk-flink-batch-103-w-7.c.apache-beam-testing.internal/10.128.2.117:43555'.
 This indicates that the remote task manager was lost.
        at 
org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
        at 
org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1108)
        at 
org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:473)
        ... 3 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated 
due to an exception: Lost connection to task manager 
'beam-loadtests-python-gbk-flink-batch-103-w-7.c.apache-beam-testing.internal/10.128.2.117:43555'.
 This indicates that the remote task manager was lost.
        at 
org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: 
org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: 
Lost connection to task manager 
'beam-loadtests-python-gbk-flink-batch-103-w-7.c.apache-beam-testing.internal/10.128.2.117:43555'.
 This indicates that the remote task manager was lost.
        at 
org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.exceptionCaught(CreditBasedPartitionRequestClientHandler.java:160)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:256)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.ChannelInboundHandlerAdapter.exceptionCaught(ChannelInboundHandlerAdapter.java:131)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:256)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.ChannelHandlerAdapter.exceptionCaught(ChannelHandlerAdapter.java:87)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:256)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.exceptionCaught(DefaultChannelPipeline.java:1401)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireExceptionCaught(DefaultChannelPipeline.java:953)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.handleReadException(AbstractNioByteChannel.java:125)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:174)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at 
org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Connection reset by peer
        at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at 
org.apache.flink.shaded.netty4.io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
        at 
org.apache.flink.shaded.netty4.io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1108)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:345)
        at 
org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:148)
        ... 6 more

root: ERROR: java.io.IOException: Connection reset by peer
root: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 7407.583s

FAILED (errors=1)

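The traceback above originates in the harness's tearDown, which runs the
pipeline and blocks until it reaches a terminal state; with the portable
runner, wait_until_finish itself raises once the job ends in FAILED, which is
what surfaces as the test ERROR. A minimal sketch of that pattern (assumed,
not the exact load_test.py code):

    def tear_down(pipeline):
        result = pipeline.run()
        # Blocks until the job reaches a terminal state; the portable
        # runner raises RuntimeError if that state is FAILED.
        return result.wait_until_finish()
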
> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 3m 54s
3 actionable tasks: 3 executed

Publishing build scan...
https://gradle.com/s/7d2h3ld7lxuy4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
