See 
<https://builds.apache.org/job/beam_PostCommit_Python2/590/display/redirect?page=changes>

Changes:

[kirillkozlov] [BEAM-8275] Beam SQL should support BigQuery in DIRECT_READ mode

[github] Addressed review comments

[github] Added a test for BigQuery SQL read in EXPORT mode


------------------------------------------
[...truncated 1.50 MB...]
      "major": "7"
    }, 
    "workerPools": [
      {
        "autoscalingSettings": {}, 
        "kind": "harness", 
        "numWorkers": 1, 
        "packages": [
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/requirements.txt",
            "name": "requirements.txt"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.1.0.zip",
            "name": "setuptools-41.1.0.zip"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/PyHamcrest-1.9.0.tar.gz",
            "name": "PyHamcrest-1.9.0.tar.gz"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/mock-3.0.5.tar.gz",
            "name": "mock-3.0.5.tar.gz"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.1.0.post1.tar.gz",
            "name": "setuptools-41.1.0.post1.tar.gz"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.0.1.zip",
            "name": "setuptools-41.0.1.zip"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/six-1.12.0.tar.gz",
            "name": "six-1.12.0.tar.gz"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/funcsigs-1.0.2.tar.gz",
            "name": "funcsigs-1.0.2.tar.gz"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/setuptools-41.2.0.zip",
            "name": "setuptools-41.2.0.zip"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/dataflow_python_sdk.tar",
            "name": "dataflow_python_sdk.tar"
          },
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930165314-347896.1569862394.348033/dataflow-worker.jar",
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com";, 
            "servicePath": "https://dataflow.googleapis.com";
          }
        }, 
        "workerHarnessContainerImage": 
"gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190802"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0930165314-347896", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT 
\"orange\" as fruit),", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT 
\"orange\" as fruit),"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                  "component_encodings": [
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }, 
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15698623933884", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": 
"RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo",
 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", 
\"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-09-30T16:53:24.826061Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-09-30_09_53_23-11518054275188141166'
 location: u'us-central1'
 name: u'beamapp-jenkins-0930165314-347896'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-09-30T16:53:24.826061Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-30_09_53_23-11518054275188141166]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_09_53_23-11518054275188141166?project=apache-beam-testing
root: INFO: Job 2019-09-30_09_53_23-11518054275188141166 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-09-30T16:53:23.661Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2019-09-30_09_53_23-11518054275188141166.
root: INFO: 2019-09-30T16:53:23.661Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2019-09-30_09_53_23-11518054275188141166. The number of workers 
will be between 1 and 1000.
root: INFO: 2019-09-30T16:53:26.790Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-09-30T16:53:27.540Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-09-30T16:53:28.237Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T16:53:28.292Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-09-30T16:53:28.322Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-30T16:53:28.358Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-09-30T16:53:28.507Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-30T16:53:28.539Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-09-30T16:53:28.566Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-09-30T16:53:28.601Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-09-30T16:53:28.637Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-09-30T16:53:28.671Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-30T16:53:28.794Z: JOB_MESSAGE_DEBUG: Executing wait step 
start3
root: INFO: 2019-09-30T16:53:28.868Z: JOB_MESSAGE_BASIC: Executing operation 
read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-09-30T16:53:28.907Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-09-30T16:53:28.945Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
root: INFO: 2019-09-30T16:53:31.042Z: JOB_MESSAGE_BASIC: BigQuery query issued 
as job: "dataflow_job_18190909499168834354". You can check its status with the 
bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_18190909499168834354".
root: INFO: 2019-09-30T16:53:55.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 1 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-09-30T16:54:32.963Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-09-30T16:54:32.994Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-09-30T16:55:13.747Z: JOB_MESSAGE_BASIC: BigQuery query 
completed, job : "dataflow_job_18190909499168834354"
root: INFO: 2019-09-30T16:55:14.143Z: JOB_MESSAGE_BASIC: BigQuery export job 
"dataflow_job_15720953621548940030" started. You can check its status with the 
bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_15720953621548940030".
root: INFO: 2019-09-30T16:55:44.620Z: JOB_MESSAGE_DETAILED: BigQuery export job 
progress: "dataflow_job_15720953621548940030" observed total of 1 exported 
files thus far.
root: INFO: 2019-09-30T16:55:44.662Z: JOB_MESSAGE_BASIC: BigQuery export job 
finished: "dataflow_job_15720953621548940030"
root: INFO: 2019-09-30T16:58:05.913Z: JOB_MESSAGE_BASIC: Executing BigQuery 
import job "dataflow_job_18190909499168835932". You can check its status with 
the bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_18190909499168835932".
root: INFO: 2019-09-30T16:59:28.778Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-09-30T17:05:28.778Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: WARNING: Timing out on waiting for job 
2019-09-30_09_53_23-11518054275188141166 after 904 seconds
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4181.793s

FAILED (SKIP=4, errors=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 45s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/m3bqcglj4gbeg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
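
For anyone triaging: the failing case is the BigQuery query-to-table postCommit IT, which (per the job spec in the captured log) issues a legacy-SQL query read followed by a native BigQuery write, and the test then times out waiting for the Dataflow job after 904 seconds. Below is a minimal, illustrative sketch of that read-query/write-table shape in the Beam Python SDK. It is not the actual test code; the project, bucket, dataset, and table names are placeholders.

    # Minimal sketch only; assumes apache_beam is installed and a GCP project
    # is available. All names below (project, bucket, dataset) are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                 # placeholder
        temp_location='gs://my-bucket/temp',  # placeholder
        region='us-central1',
    )

    with beam.Pipeline(options=options) as p:
        (p
         # Legacy-SQL query read, matching "bigquery_use_legacy_sql": true
         # in the job spec above.
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(
             query='SELECT "apple" AS fruit',
             use_standard_sql=False))
         # Native BigQuery write with the fruit:STRING schema and the
         # CREATE_IF_NEEDED / WRITE_EMPTY dispositions from the job spec.
         | 'write' >> beam.io.WriteToBigQuery(
             table='output_table',
             dataset='my_dataset',            # placeholder
             project='my-project',            # placeholder
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))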
