See
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/40/display/redirect?page=changes>
Changes:
[ttanay100] unskip ReifyTest.test_window
[ttanay100] [BEAM-7437] Add streaming flag to BQ streaming inserts IT test
[ttanay100] Change default timeout to 5 mins
[valentyn] Use Beam's abstraction of pickler instead of dill in coder tests.
[kamil.wasilewski] [BEAM-7535] Created Jenkins job for BQ performance tests
[kamil.wasilewski] [BEAM-7535] Delete existing data if the table already exists
[iemejia] [BEAM-6740] Add PTransformTranslator for Combine.Globally
[iemejia] [BEAM-6740] Add extractAcummulatorCoder for Combine.Globally and fix
[iemejia] [BEAM-7640] Change tests to use PayloadTranslator instead of unused
[iemejia] [BEAM-6740] Refactor to remove duplicated code in CombineTranslation
[hannahjiang] BEAM-3645 add thread lock
[cademarkegard] [BEAM-7690] Port WordCountTest off DoFnTester
[kcweaver] [BEAM-7708] don't expect SQL shell bundled dependencies to be shadowed
[github] [BEAM-7709] Re-use node for explicit flattens
[boyuanz] Reformat CamelCase function naming style to underscore style for
[boyuanz] fix lint
[33895511+aromanenko-dev] [BEAM-6480] Adds AvroIO sink for generic records. (#9005)
[github] [SQL][Doc] fix broken gradle command.
[lcwik] Added new example on how to create a custom unbounded streaming source
[chambers] Update Python Dataflow runner to patch side input coders on the unified
[iemejia] [BEAM-7653] Add PTransformTranslator for Combine.GroupedValues
[pachristopher] Update pyarrow version requirement in setup.py
[github] Update code comments to improve readability in docs (#9024)
------------------------------------------
[...truncated 94.13 KB...]
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
}
],
"is_stream_like": true
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "GroupByKey 0.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s2"
},
"serialized_fn":
"%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
"user_name": "GroupByKey 0"
}
},
{
"kind": "ParallelDo",
"name": "s4",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "_UngroupAndReiterate",
"type": "STRING",
"value":
"apache_beam.testing.load_tests.group_by_key_test._UngroupAndReiterate"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "Ungroup 0.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s3"
},
"serialized_fn": "<string of 456 bytes>",
"user_name": "Ungroup 0"
}
},
{
"kind": "ParallelDo",
"name": "s5",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "MeasureTime",
"type": "STRING",
"value":
"apache_beam.testing.load_tests.load_test_metrics_utils.MeasureTime"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "Measure time: End 0.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s4"
},
"serialized_fn": "<string of 504 bytes>",
"user_name": "Measure time: End 0"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: u'2019-07-10T13:07:37.198997Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2019-07-10_06_07_36-15570994227384705247'
location: u'us-central1'
name: u'load-tests-python-dataflow-batch-gbk-2-0710100250'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2019-07-10T13:07:37.198997Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-07-10_06_07_36-15570994227384705247]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-10_06_07_36-15570994227384705247?project=apache-beam-testing
root: INFO: Job 2019-07-10_06_07_36-15570994227384705247 is in state JOB_STATE_PENDING
root: INFO: 2019-07-10T13:07:36.209Z: JOB_MESSAGE_WARNING: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
root: INFO: 2019-07-10T13:07:39.300Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-07-10T13:07:39.755Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-07-10T13:07:40.385Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-07-10T13:07:40.493Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey 0: GroupByKey not followed by a combiner.
root: INFO: 2019-07-10T13:07:40.552Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-07-10T13:07:40.591Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-07-10T13:07:40.681Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-07-10T13:07:40.774Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-07-10T13:07:40.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into Read
root: INFO: 2019-07-10T13:07:40.883Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/Write into GroupByKey 0/Reify
root: INFO: 2019-07-10T13:07:40.932Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Ungroup 0
root: INFO: 2019-07-10T13:07:40.979Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/Reify into Measure time: Start
root: INFO: 2019-07-10T13:07:41.020Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/GroupByWindow into GroupByKey 0/Read
root: INFO: 2019-07-10T13:07:41.065Z: JOB_MESSAGE_DETAILED: Fusing consumer Ungroup 0 into GroupByKey 0/GroupByWindow
root: INFO: 2019-07-10T13:07:41.110Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-07-10T13:07:41.160Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-07-10T13:07:41.209Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-07-10T13:07:41.260Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-07-10T13:07:41.540Z: JOB_MESSAGE_DEBUG: Executing wait step start13
root: INFO: 2019-07-10T13:07:41.689Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey 0/Create
root: INFO: 2019-07-10T13:07:41.749Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-07-10T13:07:41.798Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
root: INFO: 2019-07-10T13:07:41.865Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-07-10T13:07:41.876Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey 0/Create
root: INFO: 2019-07-10T13:07:42Z: JOB_MESSAGE_DEBUG: Value "GroupByKey 0/Session" materialized.
root: INFO: 2019-07-10T13:07:42.117Z: JOB_MESSAGE_BASIC: Executing operation Read+Measure time: Start+GroupByKey 0/Reify+GroupByKey 0/Write
root: INFO: Job 2019-07-10_06_07_36-15570994227384705247 is in state JOB_STATE_RUNNING
root: INFO: 2019-07-10T13:08:37.280Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running step(s).
root: INFO: 2019-07-10T13:08:37.321Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5. This could be a quota issue.
root: INFO: 2019-07-10T13:08:42.760Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
root: INFO: 2019-07-10T13:09:22.200Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-10T13:09:22.236Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-10T13:10:41.864Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-07-10T13:13:41.493Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-07-10T13:13:41.948Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-07-10T13:14:00.868Z: JOB_MESSAGE_BASIC: Finished operation Read+Measure time: Start+GroupByKey 0/Reify+GroupByKey 0/Write
root: INFO: 2019-07-10T13:14:00.947Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey 0/Close
root: INFO: 2019-07-10T13:14:00.996Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey 0/Close
root: INFO: 2019-07-10T13:14:01.114Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey 0/Read+GroupByKey 0/GroupByWindow+Ungroup 0+Measure time: End 0
root: INFO: 2019-07-10T13:16:34.396Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey 0/Read+GroupByKey 0/GroupByWindow+Ungroup 0+Measure time: End 0
root: INFO: 2019-07-10T13:16:34.495Z: JOB_MESSAGE_DEBUG: Executing success step success11
root: INFO: 2019-07-10T13:16:34.662Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-07-10T13:16:34.757Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-07-10T13:16:34.806Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-10_06_07_36-15570994227384705247/messages?alt=json&startTime=2019-07-10T13%3A16%3A34.806Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-10_06_07_36-15570994227384705247/messages?alt=json&startTime=2019-07-10T13%3A16%3A34.806Z>:
response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '0',
'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary':
'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip',
'cache-control': 'private', 'date': 'Wed, 10 Jul 2019 13:18:28 GMT',
'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json;
charset=UTF-8'}>, content <{
"error": {
"code": 503,
"message": "Deadline exceeded",
"status": "UNAVAILABLE"
}
}
>
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 712.328s
FAILED (errors=1)
> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49
* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 12m 6s
4 actionable tasks: 3 executed, 1 up-to-date
Publishing build scan...
https://gradle.com/s/ck5rf5yylopsi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]