See 
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/8518/display/redirect>

------------------------------------------
[...truncated 279.23 KB...]
datanode_1  | 19/06/16 00:04:41 INFO DataNode.clienttrace: src: 
/192.168.0.3:50636, dest: /192.168.0.3:50010, bytes: 157283, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-1169873784_67, offset: 0, srvID: 
3d9f224e-a5e2-4856-949d-ddf8c9e3466f, blockid: 
BP-1213238884-192.168.0.2-1560643431942:blk_1073741825_1001, duration: 19123908
datanode_1  | 19/06/16 00:04:41 INFO datanode.DataNode: PacketResponder: 
BP-1213238884-192.168.0.2-1560643431942:blk_1073741825_1001, 
type=LAST_IN_PIPELINE terminating
namenode_1  | 19/06/16 00:04:41 INFO namenode.FSNamesystem: BLOCK* 
blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) 
in file /kinglear.txt
namenode_1  | 19/06/16 00:04:41 INFO namenode.EditLogFileOutputStream: Nothing 
to flush
namenode_1  | 19/06/16 00:04:42 INFO hdfs.StateChange: DIR* completeFile: 
/kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1169873784_67
test_1      | DEBUG     Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline 
using the default runner: DirectRunner.
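
The DirectRunner fallback logged above simply means the wordcount test did not pass a --runner flag. For reference, a minimal sketch of selecting a runner explicitly through Beam's standard PipelineOptions API; the flag value and the tiny pipeline are illustrative assumptions, not what this test actually runs:

    # Hedged sketch: pin the runner explicitly instead of relying on the
    # DirectRunner fallback. Flag value and pipeline contents are
    # illustrative assumptions, not what the hdfs integration test runs.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        _ = (p
             | beam.Create(['king', 'lear'])
             | beam.Map(lambda word: (word, 1)))
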
test_1      | INFO:root:==================== <function 
annotate_downstream_side_inputs at 0x7f2c8f652500> ====================
test_1      | INFO:root:==================== <function 
fix_side_input_pcoll_coders at 0x7f2c8f6525f0> ====================
test_1      | INFO:root:==================== <function lift_combiners at 
0x7f2c8f652668> ====================
test_1      | INFO:root:==================== <function expand_sdf at 
0x7f2c8f6526e0> ====================
test_1      | INFO:root:==================== <function expand_gbk at 
0x7f2c8f652758> ====================
test_1      | INFO:root:==================== <function sink_flattens at 
0x7f2c8f652848> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 
0x7f2c8f6528c0> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 
0x7f2c8f652938> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 
0x7f2c8f6529b0> ====================
test_1      | INFO:root:==================== <function 
inject_timer_pcollections at 0x7f2c8f652b18> ====================
test_1      | INFO:root:==================== <function sort_stages at 
0x7f2c8f652b90> ====================
test_1      | INFO:root:==================== <function 
window_pcollection_coders at 0x7f2c8f652c08> ====================
test_1      | INFO:root:Running 
(((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running 
(ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/06/16 00:04:44 INFO datanode.webhdfs: 192.168.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running 
(((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default 
mime_type: text/plain
datanode_1  | 19/06/16 00:04:47 INFO datanode.webhdfs: 192.168.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-587060be8fca11e9ac000242c0a80004/20166ad0-6c0f-4feb-beb8-54b9da73118c.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/06/16 00:04:47 INFO hdfs.StateChange: BLOCK* allocate 
blk_1073741826_1002, replicas=192.168.0.3:50010 for 
/beam-temp-py-wordcount-integration-587060be8fca11e9ac000242c0a80004/20166ad0-6c0f-4feb-beb8-54b9da73118c.py-wordcount-integration
datanode_1  | 19/06/16 00:04:47 INFO datanode.DataNode: Receiving 
BP-1213238884-192.168.0.2-1560643431942:blk_1073741826_1002 src: 
/192.168.0.3:50658 dest: /192.168.0.3:50010
datanode_1  | 19/06/16 00:04:47 INFO DataNode.clienttrace: src: 
/192.168.0.3:50658, dest: /192.168.0.3:50010, bytes: 48944, op: HDFS_WRITE, 
cliID: DFSClient_NONMAPREDUCE_-177037216_69, offset: 0, srvID: 
3d9f224e-a5e2-4856-949d-ddf8c9e3466f, blockid: 
BP-1213238884-192.168.0.2-1560643431942:blk_1073741826_1002, duration: 5794045
datanode_1  | 19/06/16 00:04:47 INFO datanode.DataNode: PacketResponder: 
BP-1213238884-192.168.0.2-1560643431942:blk_1073741826_1002, 
type=LAST_IN_PIPELINE terminating
namenode_1  | 19/06/16 00:04:47 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-587060be8fca11e9ac000242c0a80004/20166ad0-6c0f-4feb-beb8-54b9da73118c.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-177037216_69
test_1      | INFO:root:Running 
(write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running 
((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running 
(ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 
(skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-8518_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-8518_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-8518_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-8518_datanode_1 ... done

> Task :sdks:python:container:docker
Requirement already satisfied: crcmod<2.0,>=1.7 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (1.7)
Requirement already satisfied: dill<0.2.10,>=0.2.9 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (0.2.9)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (0.21.4)
Requirement already satisfied: future<1.0.0,>=0.16.0 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.8 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (1.15.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (2.1.0)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (0.9.2)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (2.0.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (3.6.1)
Requirement already satisfied: pydot<1.3,>=1.2.0 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (1.2.4)
Requirement already satisfied: pytz>=2018.3 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (2018.4)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (3.12)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (1.8.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (3.2.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (0.6.8)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (3.6.1)
Requirement already satisfied: pyarrow<0.14.0,>=0.11.1 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (0.11.1)
Requirement already satisfied: cachetools<4,>=3.1.0 in 
/usr/local/lib/python2.7/site-packages (from apache-beam==2.14.0.dev0) (3.1.1)
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.14.0.dev0)
  Downloading https://files.pythonhosted.org/packages/07/5e/3e04cb66f5ced9267a854184bb09863d85d199646ea8480fee26b4313a00/google_apitools-0.5.28-py2-none-any.whl (134kB)

> Task :sdks:python:hdfsIntegrationTest
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-8518_namenode_1 ... done
Aborting on container exit...

real    1m21.751s
user    0m1.198s
sys     0m0.234s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-8518 
--no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-8518_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-8518_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-8518_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-8518_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-8518_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-8518_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-8518_test_net

real    0m1.326s
user    0m0.971s
sys     0m0.204s

> Task :sdks:python:container:docker
Exception:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/cli/base_command.py", line 179, in main
    status = self.run(options, args)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/commands/install.py", line 315, in run
    resolver.resolve(requirement_set)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/resolve.py", line 131, in resolve
    self._resolve_one(requirement_set, req)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/resolve.py", line 294, in _resolve_one
    abstract_dist = self._get_abstract_dist_for(req_to_install)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/resolve.py", line 242, in _get_abstract_dist_for
    self.require_hashes
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/operations/prepare.py", line 334, in prepare_linked_requirement
    progress_bar=self.progress_bar
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/download.py", line 878, in unpack_url
    progress_bar=progress_bar
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/download.py", line 702, in unpack_http_url
    progress_bar)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/download.py", line 946, in _download_http_url
    _download_url(resp, link, content_file, hashes, progress_bar)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/download.py", line 639, in _download_url
    hashes.check_against_chunks(downloaded_chunks)
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/utils/hashes.py", line 62, in check_against_chunks
    for chunk in chunks:
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/download.py", line 607, in written_chunks
    for chunk in chunks:
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/utils/ui.py", line 159, in iter
    for x in it:
  File "/usr/local/lib/python2.7/site-packages/pip/_internal/download.py", line 596, in resp_read
    decode_content=False):
  File "/usr/local/lib/python2.7/site-packages/pip/_vendor/urllib3/response.py", line 494, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "/usr/local/lib/python2.7/site-packages/pip/_vendor/urllib3/response.py", line 459, in read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
  File "/usr/local/lib/python2.7/contextlib.py", line 35, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/local/lib/python2.7/site-packages/pip/_vendor/urllib3/response.py", line 374, in _error_catcher
    raise ReadTimeoutError(self._pool, None, 'Read timed out.')
ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
You are using pip version 19.0.3, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
The command '/bin/sh -c pip install /opt/apache/beam/tars/apache-beam.tar.gz[gcp] &&     rm -rf /root/.cache/pip' returned a non-zero code: 2
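
The traceback above is the real failure: pip's download of google_apitools-0.5.28 from files.pythonhosted.org stalled mid-transfer, urllib3 raised ReadTimeoutError, and the 'pip install' step of the Docker build therefore exited non-zero (surfaced by docker as code 2). As a hedged sketch only, not what the Beam Dockerfile does: a transient network stall like this is usually handled by retrying, e.g. with pip's own --retries/--timeout flags plus an outer retry loop; every value below is an illustrative assumption.

    # Hedged sketch: retry a flaky "pip install" before giving up.
    # The tarball path mirrors the failing Docker step; the retry count,
    # backoff, and pip flag values are illustrative assumptions.
    import subprocess
    import time

    CMD = ['pip', 'install', '--retries', '10', '--timeout', '60',
           '/opt/apache/beam/tars/apache-beam.tar.gz[gcp]']

    for attempt in range(3):
        if subprocess.call(CMD) == 0:
            break                       # install succeeded
        time.sleep(30 * (attempt + 1))  # back off before the next try
    else:
        raise RuntimeError('pip install failed after 3 attempts')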

> Task :sdks:python:container:docker FAILED

> Task :sdks:python:postCommitIT
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=build/apache-beam.tar.gz 
>>> --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar>
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3585.454s

OK (SKIP=3)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:container:docker'.
> Process 'command 'docker'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 33s
102 actionable tasks: 78 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/cpu23sfco4erq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
