See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/2872/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-9898] Add stub with imports to apache_beam.io.snowflake

[noreply] [BEAM-10916] Remove experimental annotations for BQ storage API source


------------------------------------------
[...truncated 28.01 MB...]
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 356, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 218, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 703, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 704, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1279, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 569, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1374, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 218, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 703, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 704, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1279, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 569, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1374, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 218, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 703, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 704, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 570, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 405, in process
    self._flush_batch()
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 422, in _flush_batch
    throttle_delay=util.WRITE_BATCH_TARGET_LATENCY_MS // 1000)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", 
line 236, in wrapper
    return fun(*args, **kwargs)
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 385, in write_mutations
    self._batch.commit()
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore/batch.py", line 
274, in commit
    self._commit()
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore/batch.py", line 
250, in _commit
    self.project, mode, self._mutations, transaction=self._id
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore_v1/gapic/datastore_client.py",
 line 501, in commit
    request, retry=retry, timeout=timeout, metadata=metadata
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", 
line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 
286, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 
184, in retry_target
    return target()
  File "/usr/local/lib/python3.7/site-packages/google/api_core/timeout.py", 
line 214, in func_with_timeout
    return func(*args, **kwargs)
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 
59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.NotFound: 404 The project apache-beam-testing does 
not exist or it does not contain an active Cloud Datastore or Cloud Firestore 
database. Please visit http://console.cloud.google.com to create a project or 
https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to 
add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or 
Cloud Firestore always have an associated App Engine app and this app must not 
be disabled. [while running 'Write to Datastore/Write Batch to Datastore']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.661Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 
57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 826, in 
__call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 729, in 
_end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.NOT_FOUND
        details = "The project apache-beam-testing does not exist or it does 
not contain an active Cloud Datastore or Cloud Firestore database. Please visit 
http://console.cloud.google.com to create a project or 
https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to 
add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or 
Cloud Firestore always have an associated App Engine app and this app must not 
be disabled."
        debug_error_string = 
"{"created":"@1600777718.569033726","description":"Error received from peer 
ipv4:172.217.214.95:443","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"The
 project apache-beam-testing does not exist or it does not contain an active 
Cloud Datastore or Cloud Firestore database. Please visit 
http://console.cloud.google.com to create a project or 
https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to 
add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or 
Cloud Firestore always have an associated App Engine app and this app must not 
be disabled.","grpc_status":5}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 570, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 405, in process
    self._flush_batch()
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 422, in _flush_batch
    throttle_delay=util.WRITE_BATCH_TARGET_LATENCY_MS // 1000)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", 
line 236, in wrapper
    return fun(*args, **kwargs)
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 385, in write_mutations
    self._batch.commit()
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore/batch.py", line 
274, in commit
    self._commit()
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore/batch.py", line 
250, in _commit
    self.project, mode, self._mutations, transaction=self._id
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore_v1/gapic/datastore_client.py",
 line 501, in commit
    request, retry=retry, timeout=timeout, metadata=metadata
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", 
line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 
286, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 
184, in retry_target
    return target()
  File "/usr/local/lib/python3.7/site-packages/google/api_core/timeout.py", 
line 214, in func_with_timeout
    return func(*args, **kwargs)
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 
59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.NotFound: 404 The project apache-beam-testing does 
not exist or it does not contain an active Cloud Datastore or Cloud Firestore 
database. Please visit http://console.cloud.google.com to create a project or 
https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to 
add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or 
Cloud Firestore always have an associated App Engine app and this app must not 
be disabled.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 356, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 218, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 703, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 704, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1279, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 569, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1374, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 218, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 703, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 704, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1279, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 569, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1374, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 218, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 703, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 704, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 570, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 405, in process
    self._flush_batch()
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 422, in _flush_batch
    throttle_delay=util.WRITE_BATCH_TARGET_LATENCY_MS // 1000)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", 
line 236, in wrapper
    return fun(*args, **kwargs)
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/datastore/v1new/datastoreio.py",
 line 385, in write_mutations
    self._batch.commit()
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore/batch.py", line 
274, in commit
    self._commit()
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore/batch.py", line 
250, in _commit
    self.project, mode, self._mutations, transaction=self._id
  File 
"/usr/local/lib/python3.7/site-packages/google/cloud/datastore_v1/gapic/datastore_client.py",
 line 501, in commit
    request, retry=retry, timeout=timeout, metadata=metadata
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", 
line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 
286, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 
184, in retry_target
    return target()
  File "/usr/local/lib/python3.7/site-packages/google/api_core/timeout.py", 
line 214, in func_with_timeout
    return func(*args, **kwargs)
  File 
"/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 
59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.NotFound: 404 The project apache-beam-testing does 
not exist or it does not contain an active Cloud Datastore or Cloud Firestore 
database. Please visit http://console.cloud.google.com to create a project or 
https://console.cloud.google.com/datastore/setup?project=apache-beam-testing to 
add a Cloud Datastore or Cloud Firestore database. Note that Cloud Datastore or 
Cloud Firestore always have an associated App Engine app and this app must not 
be disabled. [while running 'Write to Datastore/Write Batch to Datastore']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.694Z: 
JOB_MESSAGE_BASIC: Finished operation Input/Read+To String+To Entity+Write to 
Datastore/Write Batch to Datastore
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.761Z: 
JOB_MESSAGE_DEBUG: Executing failure step failure2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.788Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: S01:Input/Read+To String+To 
Entity+Write to Datastore/Write Batch to Datastore failed., The job failed 
because a work item has failed 4 times. Look in previous log entries for the 
cause of each one of the 4 failures. For more information, see 
https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was 
attempted on these workers: 
  beamapp-jenkins-092212223-09220522-6043-harness-f35r
      Root cause: Work item failed.,
  beamapp-jenkins-092212223-09220522-6043-harness-f35r
      Root cause: Work item failed.,
  beamapp-jenkins-092212223-09220522-6043-harness-f35r
      Root cause: Work item failed.,
  beamapp-jenkins-092212223-09220522-6043-harness-f35r
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.865Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.917Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:28:40.958Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:29:30.902Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:29:30.955Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-22T12:29:31.040Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-09-22_05_22_40-4900753317371518910 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3925.415s

FAILED (SKIP=7, errors=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 54s
174 actionable tasks: 129 executed, 41 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/xsia35ozypgko

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
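
The root cause captured in the logging above is the 404 NotFound raised from the Datastore Commit RPC while running 'Write to Datastore/Write Batch to Datastore': the error reports that apache-beam-testing has no active Cloud Datastore or Cloud Firestore database (or that its associated App Engine app is disabled), which at least accounts for the Dataflow postCommitIT errors above. A quick way to confirm this is an environment problem rather than an SDK regression is a small standalone write against the same project. A minimal sketch, assuming google-cloud-datastore and credentials with access to apache-beam-testing are available locally (the "datastore_probe" kind is only an example, not something the tests use):

# Standalone Datastore connectivity probe; not part of the build.
from google.api_core import exceptions
from google.cloud import datastore

PROJECT = "apache-beam-testing"

client = datastore.Client(project=PROJECT)
entity = datastore.Entity(key=client.key("datastore_probe"))  # partial key
entity["marker"] = "connectivity check"

try:
    # put() issues the same Commit RPC that datastoreio's Batch.commit() uses.
    client.put(entity)
    client.delete(entity.key)  # clean up the probe entity
    print("Datastore write/delete succeeded for", PROJECT)
except exceptions.NotFound as exc:
    # Matches the 404 in the job logs: no active Datastore/Firestore database
    # for the project, or its App Engine app is disabled.
    print("Datastore database missing or disabled:", exc)

If the probe reproduces the same NotFound, the fix is on the project side (restore or re-enable the Datastore/Firestore database via the setup link in the error message) rather than in the Beam SDK or the test code.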
