See <https://ci-beam.apache.org/job/beam_PostCommit_Python2/2757/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7996] Add support for MapType and Nulls in container types for


------------------------------------------
[...truncated 17.37 MB...]
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 570, in apache_beam.runners.common.SimpleInvoker.invoke_process
    windowed_value, self.process_method(windowed_value.value))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py", line 405, in process
    self._flush_batch()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py", line 422, in _flush_batch
    throttle_delay=util.WRITE_BATCH_TARGET_LATENCY_MS // 1000)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py", line 385, in write_mutations
    self._batch.commit()
  File "/usr/local/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 274, in commit
    self._commit()
  File "/usr/local/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 250, in _commit
    self.project, mode, self._mutations, transaction=self._id
  File "/usr/local/lib/python2.7/site-packages/google/cloud/datastore_v1/gapic/datastore_client.py", line 501, in commit
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/usr/local/lib/python2.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python2.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/usr/local/lib/python2.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "/usr/local/lib/python2.7/site-packages/six.py", line 738, in raise_from
    raise value
NotFound: 404 The project apache-beam-testing does not exist or it does not contain an active Cloud Datastore or Cloud Firestore database. Please visit http://console.cloud.google.com to create a project or https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or Cloud Firestore always have an associated App Engine app and this app must not be disabled. [while running 'Write to Datastore/Write Batch to Datastore']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:42.812Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/batchworker.py", line 638, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 332, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_runner.process(o)
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 1279, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 569, in apache_beam.runners.common.SimpleInvoker.invoke_process
    self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1371, in apache_beam.runners.common._OutputProcessor.process_outputs
    self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_runner.process(o)
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 1279, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 569, in apache_beam.runners.common.SimpleInvoker.invoke_process
    self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1371, in apache_beam.runners.common._OutputProcessor.process_outputs
    self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_runner.process(o)
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 570, in apache_beam.runners.common.SimpleInvoker.invoke_process
    windowed_value, self.process_method(windowed_value.value))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py", line 405, in process
    self._flush_batch()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py", line 422, in _flush_batch
    throttle_delay=util.WRITE_BATCH_TARGET_LATENCY_MS // 1000)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py", line 385, in write_mutations
    self._batch.commit()
  File "/usr/local/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 274, in commit
    self._commit()
  File "/usr/local/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 250, in _commit
    self.project, mode, self._mutations, transaction=self._id
  File "/usr/local/lib/python2.7/site-packages/google/cloud/datastore_v1/gapic/datastore_client.py", line 501, in commit
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/usr/local/lib/python2.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python2.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/usr/local/lib/python2.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "/usr/local/lib/python2.7/site-packages/six.py", line 738, in raise_from
    raise value
NotFound: 404 The project apache-beam-testing does not exist or it does not contain an active Cloud Datastore or Cloud Firestore database. Please visit http://console.cloud.google.com to create a project or https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or Cloud Firestore always have an associated App Engine app and this app must not be disabled. [while running 'Write to Datastore/Write Batch to Datastore']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:42.835Z: JOB_MESSAGE_BASIC: Finished operation Input/Read+To String+To Entity+Write to Datastore/Write Batch to Datastore
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:42.899Z: JOB_MESSAGE_DEBUG: Executing failure step failure2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:42.934Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S01:Input/Read+To String+To Entity+Write to Datastore/Write Batch to Datastore failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  beamapp-jenkins-080606144-08052314-3pxp-harness-trdj
      Root cause: Work item failed.,
  beamapp-jenkins-080606144-08052314-3pxp-harness-trdj
      Root cause: Work item failed.,
  beamapp-jenkins-080606144-08052314-3pxp-harness-trdj
      Root cause: Work item failed.,
  beamapp-jenkins-080606144-08052314-3pxp-harness-trdj
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:43.052Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:43.247Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:20:43.290Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:21:37.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:21:37.433Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-06T06:21:37.478Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-08-05_23_14_54-6732317515998461442 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
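
The NotFound above can also be checked outside of Beam with a minimal, hypothetical probe against the same project. This is only a sketch: it assumes the google-cloud-datastore client is installed and application-default credentials are configured, and the kind name used is made up for illustration.

    from google.api_core import exceptions
    from google.cloud import datastore

    # Probe the apache-beam-testing project directly: a plain lookup should
    # surface the same NotFound error if the project has no active Cloud
    # Datastore / Cloud Firestore database.
    client = datastore.Client(project='apache-beam-testing')
    try:
        # 'ProbeKind' is an arbitrary, hypothetical kind; any lookup works here.
        client.get(client.key('ProbeKind', 1))
        print('Datastore database is reachable for this project')
    except exceptions.NotFound as exc:
        print('No active Datastore/Firestore database: %s' % exc)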

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py27.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 3802.730s

FAILED (SKIP=7, errors=2)
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_15-1503429156688924805?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_11_19-5732233468106247144?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_19_59-15399763657275938909?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_28_40-5986723967245155065?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_36_51-14371450565025657116?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_44_18-3096528052676968853?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_51_48-17372876044899289025?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_59_31-9264234924196476177?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_20-17420857648687390918?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_16_46-9653308500170665300?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_24_12-2678328687649640658?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_31_02-17994064781825237728?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_38_24-11941746273616797678?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_46_40-11480128632904408335?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_16-11571966234503686662?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_21_45-1814833010033587466?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_29_06-8031570706833618661?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_36_08-4325069603800705875?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_52_42-15936571892795695972?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_19-9596252839210167561?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_14_54-6732317515998461442?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_22_07-10639802186936416750?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_29_24-6504481002364500176?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_38_07-9852735461727706018?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_45_59-4466224972236273756?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_17-13869986573825225234?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_10_37-16017876865856961965?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_18_41-4291982989843073513?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_25_49-17464504412642916248?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_32_50-9173324944332315549?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_40_34-14251769123406130685?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_48_03-13526997915126170110?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_55_50-12626056399820747435?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_14-16971644558498166776?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_21_08-13586684944087847714?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_28_46-1186837174010495960?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_36_20-9696335487918833613?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_43_56-7527659081515620040?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_52_07-1966787301488760383?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_19-6907179463414935930?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_11_23-9728647317730982863?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_20_11-10073753574204121927?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_28_58-13662365397305495698?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_36_47-4559805325917879012?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_43_10-7502528409180316513?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_49_53-13013997743965348688?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_03_15-9939422865580082695?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_12_38-10101653482478938143?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_23_52-1280429740442326356?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_31_19-11905573911120737445?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_39_28-1382427860137224946?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_47_00-541602196926217251?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 50

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 5m 42s
159 actionable tasks: 123 executed, 34 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/aifbnqv74g434

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
