See 
<https://builds.apache.org/job/beam_PostCommit_Python37/61/display/redirect?page=changes>

Changes:

[kenn] [BEAM-7755] adding support for nested rows and arrays in BigQuery to

------------------------------------------
[...truncated 425.46 KB...]
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s18"
        },
        "serialized_fn": 
"eNq1VP1z1EQYTnJXKKEUaBWK+HGg6FXhoqCCCAi9tkDPHjVUblFr2CR7l7T5ejcb2s7czaBMOv0b/MHRGf9Q3927UivCb85OPt6vZ3ef99l9Vql7NKNewByX0bghOE3ybsrjvOGlnJlNGkXUjViH0yxjfD5dTEzQZn8BfQBGnVQ0TXO6CVQ8P4yihiPfpuNxRgVzukXiiTDFgmr9QDxKqe+I7YyZMEbGEaKZ+mwVbThUwmEbxustvaXhY7Smm+au1tee6z39oQZH2iWYs0THki04WsIEifHXCtKYWess2QiTfO97KY/oU2Ztpnwjx/0xS27PWUlz0UzjOBTOyrYI0uTKVSvnnpX7G7mVKY/1DzqsfTosSUcj24ZjasU3Ihq7Pr0Fk8snxpoaHCcGepGJEyWcnBUwZcP0gT33mHCoENyENxSAW4SRwHXCm+QwmhiWUTi1A6dtmDlQGsZZyoUTp34RIWVnyFkseE3T4K0SztrwtprHQRBPOA68swPv2vAeOSSdDAoaQa39X23zGBpwLqjWg1EjxloT2Ii/hLaLjRjo2xeFrv4MocvWDCp9o1/ZqPDLwvB1+d81TqH/V72jJetVTVQ2TP67rsmxdbuvzWtr1wbV7cm+/ke1X/1T17WOBkuYN4Z5z4Z5Ekk2fg/pMWYQfDpaX+e/vRxNDKL5GkrkfJucxh0u0jBifo3mOePieu0Cr928iW94fwc+qJMqZkRhLuCCoiNHepkPH5JpNOaQ0TuqbGHLY5kUMHxEjmBEKnSB85RDXZVxFqdPGcwSE41HNCpG0Y8FfDLMoJ6QPF8kx9BgWxnzcB5HzXyJHH8xs7MXgobKHHlH1ZYSCItYzBIBnwr4jKz/z6pnOUqzZxUijKTkLwe11mbzvKaPjxn6uBqGPmVM6iZ+J3EY+oxexTdcUZp7sZ3PS/gCD8OXNlwNZoIzZObfwh1O1JATwbUSvrLheoBC/dqGG0GtHZxbg5vLT2ZpCbds+KaE2wO4Q45KAcvbwwnCROQwd/ACw4DyN3yGh4GKlOfm/Qeyd/ek24Qm3l7z7QEs1BVUmGSFUHg5LLbJBLrSQuz77raLHbjnYsvu27BUQsuGb0tYHkC7HswFEuwBgq3Ug8V2oHK/c4dLpLyXIw/yerSDpULAQxtWVS8znnosz+H7YPWl3TxSkB2EJPuQj93CXYMfBvDjGvz02vu6EyZ+uomcmrCGOD8PwKmrnmyqAK7lyavqhxnm3Sh1aTTEQbYoorhK/oKHvR7jCOG9CmKUYs6zLi0isToywUcQRk6qE+EVcRFReazkjcag29LJlIQPY1QDjTPHS2M3TBiHHoYUP2Hu+ENICHYLV0DY+Bvmih/1",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-07-26T20:38:50.666134Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-07-26_13_38_49-5067447276669107585'
 location: 'us-central1'
 name: 'beamapp-jenkins-0726203840-893745'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-07-26T20:38:50.666134Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-07-26_13_38_49-5067447276669107585]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_38_49-5067447276669107585?project=apache-beam-testing
root: INFO: Job 2019-07-26_13_38_49-5067447276669107585 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-07-26T20:38:49.384Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2019-07-26_13_38_49-5067447276669107585. The number of workers 
will be between 1 and 1000.
root: INFO: 2019-07-26T20:38:49.425Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2019-07-26_13_38_49-5067447276669107585.
root: INFO: 2019-07-26T20:38:52.699Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-07-26T20:38:53.467Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-07-26T20:38:55.079Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-07-26T20:38:55.127Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a 
combiner.
root: INFO: 2019-07-26T20:38:55.160Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-07-26T20:38:55.206Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-07-26T20:38:55.348Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-07-26T20:38:55.468Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-07-26T20:38:55.519Z: JOB_MESSAGE_DETAILED: Fusing consumer row 
to string into read
root: INFO: 2019-07-26T20:38:55.566Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
 into count/CombineGlobally(CountCombineFn)/KeyWithVoid
root: INFO: 2019-07-26T20:38:55.614Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
root: INFO: 2019-07-26T20:38:55.663Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
root: INFO: 2019-07-26T20:38:55.713Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/KeyWithVoid into row to string
root: INFO: 2019-07-26T20:38:55.762Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
root: INFO: 2019-07-26T20:38:55.817Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
root: INFO: 2019-07-26T20:38:55.851Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/UnKey into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
root: INFO: 2019-07-26T20:38:55.891Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-07-26T20:38:55.940Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
root: INFO: 2019-07-26T20:38:55.979Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-07-26T20:38:56.014Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-07-26T20:38:56.062Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/GroupByWindow into 
assert_that/Group/GroupByKey/Read
root: INFO: 2019-07-26T20:38:56.103Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s15 for input s13.out
root: INFO: 2019-07-26T20:38:56.142Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of assert_that/Group/GroupByKey/Reify, through flatten 
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-07-26T20:38:56.189Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-07-26T20:38:56.233Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-07-26T20:38:56.275Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-07-26T20:38:56.318Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-07-26T20:38:56.361Z: JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into 
count/CombineGlobally(CountCombineFn)/DoOnce/Read
root: INFO: 2019-07-26T20:38:56.405Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into 
count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
root: INFO: 2019-07-26T20:38:56.447Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-07-26T20:38:56.482Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-07-26T20:38:56.530Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-07-26T20:38:56.578Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-07-26T20:38:56.748Z: JOB_MESSAGE_DEBUG: Executing wait step 
start38
root: INFO: 2019-07-26T20:38:56.838Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Create
root: INFO: 2019-07-26T20:38:56.878Z: JOB_MESSAGE_BASIC: Executing operation 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
root: INFO: 2019-07-26T20:38:56.904Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-07-26T20:38:56.944Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
root: INFO: 2019-07-26T20:38:56.993Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Create
root: INFO: 2019-07-26T20:38:57.005Z: JOB_MESSAGE_BASIC: Finished operation 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
root: INFO: 2019-07-26T20:38:57.082Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-07-26T20:38:57.127Z: JOB_MESSAGE_DEBUG: Value 
"count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" 
materialized.
root: INFO: 2019-07-26T20:38:57.165Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-07-26T20:38:57.209Z: JOB_MESSAGE_BASIC: Executing operation 
read+row to 
string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
root: INFO: 2019-07-26T20:38:57.275Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-07-26T20:38:57.702Z: JOB_MESSAGE_BASIC: BigQuery export job 
"dataflow_job_15507615192659607188" started. You can check its status with the 
bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_15507615192659607188".
root: INFO: 2019-07-26T20:39:28.096Z: JOB_MESSAGE_DETAILED: BigQuery export job 
progress: "dataflow_job_15507615192659607188" observed total of 1 exported 
files thus far.
root: INFO: 2019-07-26T20:39:28.124Z: JOB_MESSAGE_BASIC: BigQuery export job 
finished: "dataflow_job_15507615192659607188"
--------------------- >> end captured logging << ---------------------
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_25-17722113206542149765?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_40_33-9069526600948934947?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_49_04-16600699666259674122?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_58_06-8363846330893142438?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_14_06_56-222217634683249870?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_22-14514318667971977556?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_51_40-11519821405477280587?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_14_00_51-9232573160719572084?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_26-8417340155672994975?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Exception in thread Thread-7:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_38_49-5067447276669107585?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_44_01-6767404844052065318?project=apache-beam-testing.
  File "/usr/lib/python3.7/threading.py", line 865, in run
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_53_18-6430029469431595115?project=apache-beam-testing.
    self._target(*self._args, **self._kwargs)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_14_02_21-8830238735928693740?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 670, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-26_13_38_49-5067447276669107585?alt=json>:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Fri, 26 Jul 2019 20:40:26 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '404', 'content-length': '279', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(687dcbfbb85804b7): Information about job 
2019-07-26_13_38_49-5067447276669107585 could not be found in our system. 
Please double check the id is correct. If it is please contact customer 
support.",
    "status": "NOT_FOUND"
  }
}
>

<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_21-3706674836787840249?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_50_02-7989104036306550287?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_58_00-4407603101841831116?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_22-9238334030523803668?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_35_15-4868205814488542101?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_43_53-3584540171146075594?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_52_56-13958975310770779041?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_14_05_49-8112949248420680717?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_20-1770764265049143679?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_33_53-8581855280738181295?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_43_57-4458161703035446415?project=apache-beam-testing.
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_54_03-8795858236289569284?project=apache-beam-testing.
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 670, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-26_13_33_53-8581855280738181295?alt=json>:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Fri, 26 Jul 2019 20:40:36 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '404', 'content-length': '279', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(55658fbd30609ba7): Information about job 
2019-07-26_13_33_53-8581855280738181295 could not be found in our system. 
Please double check the id is correct. If it is please contact customer 
support.",
    "status": "NOT_FOUND"
  }
}
>

<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:565:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_26-9689853474071563872?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_35_23-1104313340577050245?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_46_51-14012955419212732176?project=apache-beam-testing.
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_56_59-12520241367282287768?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_24_22-9492592111855405951?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_34_19-5839292707225717112?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_42_15-4185680915022932869?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_13_51_32-10284747539675501768?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_14_02_49-8378994142621799930?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-26_14_12_22-3427641669460017667?project=apache-beam-testing.

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3450.429s

FAILED (SKIP=4, failures=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 51s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/txhbchwwtsllu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
