See
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1229/display/redirect?page=changes>
Changes:
[github] [BEAM-7013] Add HLL doc link to Beam website
------------------------------------------
[...truncated 405.44 KB...]
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "m_out.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s11"
},
"serialized_fn": "ref_AppliedPTransform_m_out_17",
"user_name": "m_out"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: '2019-06-26T00:10:50.941596Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2019-06-25_17_10_49-1295815700651140808'
location: 'us-central1'
name: 'beamapp-jenkins-0626001041-369143'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2019-06-26T00:10:50.941596Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-25_17_10_49-1295815700651140808]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_10_49-1295815700651140808?project=apache-beam-testing
root: INFO: Job 2019-06-25_17_10_49-1295815700651140808 is in state
JOB_STATE_RUNNING
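For reference, a minimal sketch (not taken from this build) of how a Beam Python
pipeline is typically submitted to Dataflow and polled to a terminal state; the
transforms below are placeholders and the bucket path is illustrative:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',       # project id as seen in the log
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder bucket
    ])
    p = beam.Pipeline(options=options)
    _ = p | 'Create' >> beam.Create([1, 2, 3]) | 'Print' >> beam.Map(print)
    result = p.run()                 # job enters JOB_STATE_RUNNING on the service
    result.wait_until_finish()       # blocks until a terminal state such as DONE
    print('final state:', result.state)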
root: INFO: 2019-06-26T00:10:50.006Z: JOB_MESSAGE_DETAILED: Autoscaling is
enabled for job 2019-06-25_17_10_49-1295815700651140808. The number of workers
will be between 1 and 1000.
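The 1-to-1000 range reported above reflects the job's autoscaling bounds; purely
as an illustration, those bounds can be narrowed through the standard
WorkerOptions settings (the values below are arbitrary):

    from apache_beam.options.pipeline_options import PipelineOptions, WorkerOptions

    options = PipelineOptions()
    workers = options.view_as(WorkerOptions)
    workers.num_workers = 1        # workers to start with
    workers.max_num_workers = 10   # cap autoscaling well below 1000
    workers.autoscaling_algorithm = 'THROUGHPUT_BASED'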
root: INFO: 2019-06-26T00:10:50.055Z: JOB_MESSAGE_DETAILED: Autoscaling was
automatically enabled for job 2019-06-25_17_10_49-1295815700651140808.
root: INFO: 2019-06-26T00:10:52.930Z: JOB_MESSAGE_DETAILED: Checking
permissions granted to controller Service Account.
root: INFO: 2019-06-26T00:10:53.597Z: JOB_MESSAGE_BASIC: Worker configuration:
n1-standard-1 in us-central1-f.
root: INFO: 2019-06-26T00:10:54.119Z: JOB_MESSAGE_DETAILED: Expanding
SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-26T00:10:54.180Z: JOB_MESSAGE_DETAILED: Expanding
CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-26T00:10:54.289Z: JOB_MESSAGE_DETAILED: Expanding
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-26T00:10:54.333Z: JOB_MESSAGE_DEBUG: Combiner lifting
skipped for step GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-26T00:10:54.369Z: JOB_MESSAGE_DEBUG: Combiner lifting
skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:
GroupByKey not followed by a combiner.
root: INFO: 2019-06-26T00:10:54.424Z: JOB_MESSAGE_DETAILED: Expanding
GroupByKey operations into optimizable parts.
root: INFO: 2019-06-26T00:10:54.472Z: JOB_MESSAGE_DEBUG: Annotating graph with
Autotuner information.
root: INFO: 2019-06-26T00:10:54.571Z: JOB_MESSAGE_DETAILED: Fusing adjacent
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-26T00:10:54.628Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/FlatMap(<lambda at core.py:2257>) into Create/Impulse
root: INFO: 2019-06-26T00:10:54.673Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
root: INFO: 2019-06-26T00:10:54.730Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
root: INFO: 2019-06-26T00:10:54.783Z: JOB_MESSAGE_DETAILED: Fusing consumer
m_out into GroupByKey/GroupByWindow
root: INFO: 2019-06-26T00:10:54.831Z: JOB_MESSAGE_DETAILED: Fusing consumer
GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-06-26T00:10:54.882Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
root: INFO: 2019-06-26T00:10:54.941Z: JOB_MESSAGE_DETAILED: Fusing consumer
GroupByKey/Reify into map_to_common_key
root: INFO: 2019-06-26T00:10:55.003Z: JOB_MESSAGE_DETAILED: Fusing consumer
GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-06-26T00:10:55.084Z: JOB_MESSAGE_DETAILED: Fusing consumer
map_to_common_key into metrics
root: INFO: 2019-06-26T00:10:55.141Z: JOB_MESSAGE_DETAILED: Fusing consumer
metrics into Create/Map(decode)
root: INFO: 2019-06-26T00:10:55.212Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
root: INFO: 2019-06-26T00:10:55.265Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
root: INFO: 2019-06-26T00:10:55.324Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at
core.py:2257>)
root: INFO: 2019-06-26T00:10:55.371Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
root: INFO: 2019-06-26T00:10:55.417Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
root: INFO: 2019-06-26T00:10:55.474Z: JOB_MESSAGE_DEBUG: Workflow config is
missing a default resource spec.
root: INFO: 2019-06-26T00:10:55.527Z: JOB_MESSAGE_DEBUG: Adding StepResource
setup and teardown to workflow graph.
root: INFO: 2019-06-26T00:10:55.584Z: JOB_MESSAGE_DEBUG: Adding workflow start
and stop steps.
root: INFO: 2019-06-26T00:10:55.638Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-26T00:10:55.854Z: JOB_MESSAGE_DEBUG: Executing wait step
start23
root: INFO: 2019-06-26T00:10:55.968Z: JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Create
root: INFO: 2019-06-26T00:10:56.009Z: JOB_MESSAGE_BASIC: Executing operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-06-26T00:10:56.038Z: JOB_MESSAGE_DEBUG: Starting worker pool
setup.
root: INFO: 2019-06-26T00:10:56.078Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-f...
root: INFO: 2019-06-26T00:10:56.205Z: JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Create
root: INFO: 2019-06-26T00:10:56.217Z: JOB_MESSAGE_BASIC: Finished operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-06-26T00:10:56.322Z: JOB_MESSAGE_DEBUG: Value
"GroupByKey/Session" materialized.
root: INFO: 2019-06-26T00:10:56.367Z: JOB_MESSAGE_DEBUG: Value
"Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session"
materialized.
root: INFO: 2019-06-26T00:10:56.472Z: JOB_MESSAGE_BASIC: Executing operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:2257>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-06-26T00:11:38.287Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: INFO: 2019-06-26T00:12:08.863Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised
the number of workers to 1 based on the rate of progress in the currently
running step(s).
root: INFO: 2019-06-26T00:12:09.352Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: INFO: 2019-06-26T00:14:50.562Z: JOB_MESSAGE_BASIC: Finished operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:2257>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-06-26T00:14:50.692Z: JOB_MESSAGE_BASIC: Executing operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-06-26T00:14:50.785Z: JOB_MESSAGE_BASIC: Finished operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-06-26T00:14:50.888Z: JOB_MESSAGE_BASIC: Executing operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-06-26T00:15:08.588Z: JOB_MESSAGE_BASIC: Finished operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-06-26T00:15:08.700Z: JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Close
root: INFO: 2019-06-26T00:15:08.746Z: JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Close
root: INFO: 2019-06-26T00:15:09.855Z: JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
root: INFO: 2019-06-26T00:15:14.262Z: JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
root: INFO: 2019-06-26T00:15:14.352Z: JOB_MESSAGE_DEBUG: Executing success step
success21
root: INFO: 2019-06-26T00:15:14.522Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-26T00:15:14.634Z: JOB_MESSAGE_DEBUG: Starting worker pool
teardown.
root: INFO: 2019-06-26T00:15:14.695Z: JOB_MESSAGE_BASIC: Stopping worker pool...
--------------------- >> end captured logging << ---------------------
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_28-2429220544732775883?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_55_18-15065624749443055838?project=apache-beam-testing.
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_04_23-254788934447847595?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_12_51-383142861044461482?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_24-17555999464770048196?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_59_54-10595021209035686707?project=apache-beam-testing.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_08_41-10309172892790528850?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_27-6387726340308976965?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
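The BeamDeprecationWarning above concerns reading options back off the pipeline
object. A minimal sketch of the alternative it points toward, namely keeping a
handle on the PipelineOptions you construct and consulting that object directly
(the pipeline contents are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, GoogleCloudOptions

    options = PipelineOptions()
    gcp = options.view_as(GoogleCloudOptions)   # read or set GCP settings here
    with beam.Pipeline(options=options) as p:   # hand the same object to the pipeline
        _ = p | beam.Create(['a', 'b']) | beam.Map(print)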
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
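The FutureWarnings above refer to the experimental fileio transforms exercised by
fileio_test.py. A self-contained sketch of that API (MatchFiles is the
single-pattern wrapper around MatchAll; the glob below is a placeholder):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | 'Match' >> fileio.MatchFiles('/tmp/data/*.txt')  # placeholder glob
             | 'Read' >> fileio.ReadMatches()
             | 'PathAndText' >> beam.Map(
                 lambda f: (f.metadata.path, f.read_utf8())))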
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_52_33-526400606797334437?project=apache-beam-testing.
Exception in thread Thread-10:
Traceback (most recent call last):
File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/usr/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",>
line 157, in poll_for_job_completion
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_00_24-3104214009154086724?project=apache-beam-testing.
response = runner.dataflow_client.get_job(job_id)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_10_49-1295815700651140808?project=apache-beam-testing.
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 197, in wrapper
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_18_01-18389773924131078308?project=apache-beam-testing.
return fun(*args, **kwargs)
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 663, in get_job
response = self._client.projects_locations_jobs.Get(request)
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 689, in Get
config, request, global_params=global_params)
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",>
line 731, in _RunMethod
return self.ProcessHttpResponse(method_config, http_response, request)
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",>
line 737, in ProcessHttpResponse
self.__ProcessHttpResponse(method_config, http_response, request))
File
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",>
line 604, in __ProcessHttpResponse
http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-25_17_10_49-1295815700651140808?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type':
'application/json; charset=UTF-8', 'date': 'Wed, 26 Jun 2019 00:16:01 GMT',
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked', 'status': '404', 'content-length': '278',
'-content-encoding': 'gzip'}>, content <{
"error": {
"code": 404,
"message": "(d81788690d52ef5): Information about job
2019-06-25_17_10_49-1295815700651140808 could not be found in our system.
Please double check the id is correct. If it is please contact customer
support.",
"status": "NOT_FOUND"
}
}
>
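The 404 above comes from the status-polling thread asking the Dataflow API about
a job id the service says it cannot find. Purely as an illustration (this is not
the runner's own retry code), a generic backoff loop around such a lookup might
look like:

    import time

    def poll_with_backoff(fetch, attempts=5, base_delay=2.0):
        """Call fetch() until it succeeds or the attempts are exhausted."""
        for attempt in range(attempts):
            try:
                return fetch()
            except Exception:  # in practice, catch the specific HTTP error class
                if attempt == attempts - 1:
                    raise
                time.sleep(base_delay * (2 ** attempt))  # exponential backoff

    # Usage sketch; get_job_status is a hypothetical callable standing in for the
    # real client call shown in the traceback above.
    # status = poll_with_backoff(
    #     lambda: get_job_status('2019-06-25_17_10_49-1295815700651140808'))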
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_24-1549568649588061408?project=apache-beam-testing.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_58_40-10240185915421260586?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_08_43-7226795227366998046?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_17_32-11611202134209340196?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_24-14533450433311133035?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_48_29-8731007241354039956?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_55_31-13396344845674337497?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_03_22-6752813564288340831?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_12_47-6309240168363426177?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_19_57-5503282362257845898?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_23-20837875628525850?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_47_29-14627932058389029157?project=apache-beam-testing.
kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_55_31-15505553003410686981?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_04_51-2716515877543015195?project=apache-beam-testing.
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_29-8796464800206142799?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_50_33-13150094152295368107?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_59_40-5615867313224472927?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_10_03-6294404398551565506?project=apache-beam-testing.
method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
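The temp_location warning is the same <pipeline>.options deprecation as above; a
short sketch of setting it on the options object directly (the bucket path is a
placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions, GoogleCloudOptions

    options = PipelineOptions()
    options.view_as(GoogleCloudOptions).temp_location = 'gs://my-bucket/tmp'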
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_40_25-11208256660590333832?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_48_42-3989548899405159035?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_16_55_54-15882453727387104075?project=apache-beam-testing.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_06_20-4592010706817489104?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-25_17_14_32-18391555626286803862?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
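The recurring BigQuerySink deprecation points at WriteToBigQuery as the
replacement. A self-contained sketch of that transform, with placeholder table,
dataset, and schema values:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'beam', 'count': 1}])
        _ = rows | 'WriteOutput' >> beam.io.WriteToBigQuery(
            table='my_table',                    # placeholder table
            dataset='my_dataset',                # placeholder dataset
            project='apache-beam-testing',
            schema='word:STRING,count:INTEGER',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)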
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 2970.791s
FAILED (SKIP=5, failures=1)
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 78
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 50m 30s
77 actionable tasks: 60 executed, 17 from cache
Publishing build scan...
https://gradle.com/s/3zn6vtudo5kbo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]