See 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/64/display/redirect?page=changes>

Changes:

[lostluck] [Go SDK] Fix Off by One error splitting DataSource

[lostluck] Update sdks/go/pkg/beam/core/runtime/exec/datasource.go

[dcavazos] [BEAM-7389] Use includes for buttons

[ankurgoenka] Add artifacts-dir param to flink runner

[github] [BEAM-8313] Rename certain proto fields to be consistent across

------------------------------------------
[...truncated 96.21 KB...]
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/15/e3/5956c75f68906b119191ef30d9acff661b422cf918a29a03ee0c3ba774be/fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: grpcio<2,>=1.12.1 in 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from apache-beam==2.17.0.dev0) (1.23.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.17.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.17.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/00/5c/5379d5b8167a5938918d9ee147f865f6f8a64b93947d402cfdca5c1416d2/pymongo-3.9.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.17.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from apache-beam==2.17.0.dev0) (3.9.2)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.17.0.dev0)
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.17.0.dev0)
Collecting funcsigs<2,>=1.0.2 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Requirement already satisfied: futures<4.0.0,>=3.2.0 in 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from apache-beam==2.17.0.dev0) (3.3.0)
Collecting pyvcf<0.7.0,>=0.6.8 (from apache-beam==2.17.0.dev0)
Collecting typing<3.7.0,>=3.6.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/cc/3e/29f92b7aeda5b078c86d14f550bf85cff809042e3429ace7af6193c3bc9f/typing-3.6.6-py2-none-any.whl
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/24/54/3c6225f1ca70351338075af3a3aa3119f2f6c8175989b62eb759cc4a9e5b/pyarrow-0.14.1-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/07/5e/3e04cb66f5ced9267a854184bb09863d85d199646ea8480fee26b4313a00/google_apitools-0.5.28-py2-none-any.whl
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/45/38/49841ce316540f58cbd80b374a16a3d956ddd372792ad0f271ea18676dd9/google_cloud_pubsub-1.0.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from 
apache-beam==2.17.0.dev0)
Collecting googledatastore<7.1,>=7.0.1 (from apache-beam==2.17.0.dev0)
Collecting nose>=1.3.7 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/99/4f/13fb671119e65c4dce97c60e67d3fd9e6f7f809f2b307e2611f4701205cb/nose-1.3.7-py2-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.17.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d7/b1/3367ea1f372957f97a6752ec725b87886e12af1415216feec9067e31df70/numpy-1.16.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pandas<0.25,>=0.23.4 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/db/83/7d4008ffc2988066ff37f6a0bb6d7b60822367dcb36ba5e39aa7801fda54/pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.1.6)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/a1/71/8f0d444e3a74e5640a3d5d967c1c6b015da9c655f35b2d308a55d907a517/pyasn1-0.4.7-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from 
oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages>
 (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from 
google-apitools<0.5.29,>=0.5.28->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from 
google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3 (from 
google-cloud-pubsub<1.1.0,>=0.39.0->apache-beam==2.17.0.dev0)
Collecting google-resumable-media<0.5.0dev,>=0.3.1 (from 
google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from 
proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.17.0.dev0)
Collecting monotonic>=0.6; python_version == "2.7" (from 
tenacity<6.0,>=5.0.2->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e0/da/55f51ea951e1b7c63a579c09dd7db825bb730ec1fe9c0180fc77bfb31448/urllib3-1.25.6-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from 
google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.17.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, docopt, urllib3, 
certifi, chardet, idna, requests, hdfs, httplib2, pbr, funcsigs, mock, pymongo, 
pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, 
pytz, pyyaml, avro, pyvcf, typing, numpy, pyarrow, cachetools, monotonic, 
fasteners, google-apitools, googleapis-common-protos, google-auth, 
google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, 
google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, 
google-cloud-bigtable, proto-google-cloud-datastore-v1, googledatastore, nose, 
nose-xunitmp, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-1.9.1 cachetools-3.1.1 
certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 
fastavro-0.21.24 fasteners-0.15 funcsigs-1.0.2 future-0.17.1 
google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 
google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 
google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.0 
google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 
googledatastore-7.0.2 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 
idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.16.5 
oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 
proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.14.1 pyasn1-0.4.7 
pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 
python-dateutil-2.8.0 pytz-2019.2 pyvcf-0.6.8 pyyaml-3.13 requests-2.22.0 
rsa-4.0 tenacity-5.1.1 typing-3.6.6 urllib3-1.25.6

> Task :sdks:python:apache_beam:testing:load_tests:run
setup.py:185: UserWarning: You are using Apache Beam with Python 2. New 
releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:474:
 UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/__init__.py>:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
testCoGroupByKey 
(apache_beam.testing.load_tests.co_group_by_key_test.CoGroupByKeyTest) ... ERROR

======================================================================
ERROR: testCoGroupByKey 
(apache_beam.testing.load_tests.co_group_by_key_test.CoGroupByKeyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 70, in tearDown
    result = self.pipeline.run()
  File "https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    state = result.wait_until_finish()
  File "https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 439, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
load-tests-python-flink-batch-cogbk-1-0926150453_0b8da907-5e6d-41ea-8134-59f7deadeeb1
 failed in state FAILED: java.io.IOException: Received exit code 126 for 
command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm 
gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 
--logging_endpoint=localhost:42547 --artifact_endpoint=localhost:34535 
--provision_endpoint=localhost:35701 --control_endpoint=localhost:34089'. 
stderr: docker: Got permission denied while trying to connect to the Docker 
daemon socket at unix:///var/run/docker.sock: Post 
http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix 
/var/run/docker.sock: connect: permission denied.See 'docker run --help'.
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: INFO: Metrics will not be collected
root: INFO: ==================== <function lift_combiners at 0x7fc4f01b3b90> 
====================
root: DEBUG: 27 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read pc1/Impulse_3\n  Read 
pc1/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read pc1/Split_4\n  
Read pc1/Split:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/Reshuffle/AddRandomKeys_6\n  Read 
pc1/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n  Read 
pc1/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc1/Reshuffle/ReshufflePerKey/GroupByKey_9\n  Read 
pc1/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n  Read 
pc1/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc1/Reshuffle/RemoveRandomKeys_14\n  Read 
pc1/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/ReadSplits_15\n  Read pc1/ReadSplits:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Make pc1 
iterable_16\n  Make pc1 iterable:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start 
pc1_17\n  Measure time: Start pc1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Impulse_19\n  Read pc2/Impulse:beam:transform:impulse:v1\n  must follow: \n 
 downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read pc2/Split_20\n 
 Read pc2/Split:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Reshuffle/AddRandomKeys_22\n  Read 
pc2/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_24\n  Read 
pc2/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc2/Reshuffle/ReshufflePerKey/GroupByKey_25\n  Read 
pc2/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_29\n  Read 
pc2/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc2/Reshuffle/RemoveRandomKeys_30\n  Read 
pc2/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/ReadSplits_31\n  Read pc2/ReadSplits:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Make pc2 
iterable_32\n  Make pc2 iterable:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start 
pc2_33\n  Measure time: Start pc2:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_CoGroupByKey 
/pair_with_pc2_35\n  CoGroupByKey /pair_with_pc2:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_CoGroupByKey /pair_with_pc1_36\n  CoGroupByKey 
/pair_with_pc1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_CoGroupByKey 
/Flatten_37\n  CoGroupByKey /Flatten:beam:transform:flatten:v1\n  must follow: 
\n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_CoGroupByKey 
/GroupByKey_38\n  CoGroupByKey /GroupByKey:beam:transform:group_by_key:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_CoGroupByKey /Map(_merge_tagged_vals_under_key)_42\n  
CoGroupByKey /Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Consume 
Joined Collections_43\n  Consume Joined Collections:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Measure time: End_44\n  Measure time: 
End:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>']
root: INFO: ==================== <function expand_sdf at 0x7fc4f01b3c08> 
====================
root: DEBUG: 27 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read pc1/Impulse_3\n  Read 
pc1/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read pc1/Split_4\n  
Read pc1/Split:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/Reshuffle/AddRandomKeys_6\n  Read 
pc1/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n  Read 
pc1/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc1/Reshuffle/ReshufflePerKey/GroupByKey_9\n  Read 
pc1/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n  Read 
pc1/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc1/Reshuffle/RemoveRandomKeys_14\n  Read 
pc1/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc1/ReadSplits_15\n  Read pc1/ReadSplits:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Make pc1 
iterable_16\n  Make pc1 iterable:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start 
pc1_17\n  Measure time: Start pc1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Impulse_19\n  Read pc2/Impulse:beam:transform:impulse:v1\n  must follow: \n 
 downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read pc2/Split_20\n 
 Read pc2/Split:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Reshuffle/AddRandomKeys_22\n  Read 
pc2/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_24\n  Read 
pc2/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc2/Reshuffle/ReshufflePerKey/GroupByKey_25\n  Read 
pc2/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_29\n  Read 
pc2/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Read pc2/Reshuffle/RemoveRandomKeys_30\n  Read 
pc2/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read 
pc2/ReadSplits_31\n  Read pc2/ReadSplits:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Make pc2 
iterable_32\n  Make pc2 iterable:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start 
pc2_33\n  Measure time: Start pc2:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_CoGroupByKey 
/pair_with_pc2_35\n  CoGroupByKey /pair_with_pc2:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_CoGroupByKey /pair_with_pc1_36\n  CoGroupByKey 
/pair_with_pc1:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_CoGroupByKey 
/Flatten_37\n  CoGroupByKey /Flatten:beam:transform:flatten:v1\n  must follow: 
\n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_CoGroupByKey 
/GroupByKey_38\n  CoGroupByKey /GroupByKey:beam:transform:group_by_key:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_CoGroupByKey /Map(_merge_tagged_vals_under_key)_42\n  
CoGroupByKey /Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Consume 
Joined Collections_43\n  Consume Joined Collections:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Measure time: End_44\n  Measure time: 
End:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>']
root: DEBUG: Runner option 'job_name' was already added
root: DEBUG: Runner option 'runner' was already added
root: DEBUG: Runner option 'temp_location' was already added
root: DEBUG: Runner option 'streaming' was already added
root: DEBUG: Runner option 'dataflow_kms_key' was already added
root: DEBUG: Runner option 'enable_streaming_engine' was already added
root: DEBUG: Runner option 'project' was already added
root: DEBUG: Runner option 'zone' was already added
root: DEBUG: Runner option 'environment_cache_millis' was already added
root: DEBUG: Runner option 'files_to_stage' was already added
root: DEBUG: Runner option 'job_endpoint' was already added
root: DEBUG: Runner option 'sdk_worker_parallelism' was already added
root: DEBUG: Runner option 'experiments' was already added
root: WARNING: Discarding unparseable args: ['--publish_to_big_query=false', 
'--metrics_dataset=load_test', '--metrics_table=python_flink_batch_cogbk_1', 
'--input_options={"num_records": 20000000,"key_size": 10,"value_size": 
90,"num_hot_keys": 1,"hot_key_fraction": 1}', 
'--co_input_options={"num_records": 20000000,"key_size": 10,"value_size": 
90,"num_hot_keys": 1,"hot_key_fraction": 1}', '--iterations=1']
root: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job 
failed. (JobID: 7bada00811f27003423c24e5a3dd3b45)
        at 
org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:268)
        at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:487)
        at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:475)
        at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:450)
        at 
org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:210)
        at 
org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:187)
        at 
org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
        at 
org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
        at 
org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:104)
        at 
org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:80)
        at 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:78)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution 
failed.
        at 
org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
        at 
org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265)
        ... 16 more
Caused by: java.lang.Exception: The user defined 'open()' method caused an 
exception: java.io.IOException: Received exit code 126 for command 'docker run 
-d --network=host --env=DOCKER_MAC_CONTAINER=null --rm 
gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 
--logging_endpoint=localhost:42547 --artifact_endpoint=localhost:34535 
--provision_endpoint=localhost:35701 --control_endpoint=localhost:34089'. 
stderr: docker: Got permission denied while trying to connect to the Docker 
daemon socket at unix:///var/run/docker.sock: Post 
http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix 
/var/run/docker.sock: connect: permission denied.See 'docker run --help'.
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:498)
        at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
        ... 1 more
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42547 --artifact_endpoint=localhost:34535 --provision_endpoint=localhost:35701 --control_endpoint=localhost:34089'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied. See 'docker run --help'.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4966)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:212)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:203)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:186)
        at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:42)
        at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:130)
        at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
        ... 3 more
Caused by: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42547 --artifact_endpoint=localhost:34535 --provision_endpoint=localhost:35701 --control_endpoint=localhost:34089'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied. See 'docker run --help'.
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:234)
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:168)
        at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:92)
        at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:159)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:179)
        at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:163)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        ... 11 more

root: ERROR: java.io.IOException: Received exit code 126 for command 'docker run -d --network=host --env=DOCKER_MAC_CONTAINER=null --rm gcr.io/apache-beam-testing/beam_portability/python2.7_sdk:latest --id=1-1 --logging_endpoint=localhost:42547 --artifact_endpoint=localhost:34535 --provision_endpoint=localhost:35701 --control_endpoint=localhost:34089'. stderr: docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.38/containers/create: dial unix /var/run/docker.sock: connect: permission denied. See 'docker run --help'.
root: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
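For context on the root cause above: `docker run` returned exit code 126, and its stderr shows a permission-denied error on the daemon socket at `/var/run/docker.sock`, meaning the user running the Flink task manager (here, the Jenkins build agent) is not allowed to talk to the Docker daemon. A minimal sketch of mapping the common Docker CLI exit codes to their conventional meanings (the `explain_docker_exit` helper is hypothetical, not part of Beam or Docker):

```shell
#!/bin/sh
# Hypothetical helper: translate the Docker CLI's conventional exit codes
# into short explanations. Any other code is the exit status propagated
# from the command run inside the container.
explain_docker_exit() {
  case "$1" in
    125) echo "docker daemon reported an error before the container started" ;;
    126) echo "command cannot be invoked (often a permission problem)" ;;
    127) echo "command not found" ;;
    *)   echo "exit code $1 propagated from the containerized command" ;;
  esac
}

explain_docker_exit 126
```

Assuming a standard Linux Docker install, the usual remedy for the socket error in this log is to add the build user to the `docker` group (e.g. `sudo usermod -aG docker jenkins`, where `jenkins` stands in for whatever user runs the agent) and restart the agent session so the new group membership takes effect.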

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 21.529s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_coGBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46s
3 actionable tasks: 3 executed

Publishing build scan...
https://gradle.com/s/tguhiviwfq5wa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
