See 
<https://builds.apache.org/job/beam_BiqQueryIO_Write_Performance_Test_Python_Batch/2/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for Map transform

[aryan.naraghi] Implement splitAtFraction for the BigQuery Storage API

[github] Drop experimental from python streaming in the doc

[juta.staes] [BEAM-7326] add documentation bigquery data types

[lcwik] [BEAM-4948, BEAM-6267, BEAM-5559, BEAM-7289] Update the version of guava

[aaltay] [BEAM-7475] update wordcount example (#8803)

[github] [BEAM-7013] Add HLL doc link to Beam website

[heejong] [BEAM-7424] Retry HTTP 429 errors from GCS

[markliu] [BEAM-4046, BEAM-7527] Fix benchmark with correct Gradle project

[zyichi] [BEAM-7586] Add Integration test for python mongodb io

[valentyn] Match Python 3 warning message in __init__.py with the one in setup.py.

[kedin] Spotless config update to include java files only under src directory

[chamikara] [BEAM-7548] fix flaky tests for ApproximateUnique (#8948)

[samuelw] [BEAM-7547] Avoid WindmillStateCache cache hits for stale work.

[iemejia] [BEAM-7640] Create amazon-web-services2 module and AwsOptions

[alireza4263] [BEAM-7545] Adding RowCount to TextTable.

[kcweaver] [BEAM-6692] portable Spark: reshuffle translation

[chamikara] [BEAM-7548] Fix flaky tests for ApproximateUnique (#8960)

[iemejia] [BEAM-7589] Use only one KinesisProducer instance per JVM

[iemejia] [BEAM-7589] Make KinesisIOIT compatible with all other ITs

[daniel.o.programmer] Update python containers to beam-master-20190605

[hannahjiang] BEAM-3645 add ParallelBundleProcessor

[hannahjiang] BEAM-3645 reflect comments

[hannahjiang] BEAM-3645 add changes from review comments

[hannahjiang] BEAM-3645 add thread lock when generating process_bundle_id

[github] Tiny typo fix

[kamil.wasilewski] [BEAM-7536] Fixed BQ dataset name in collecting Load Tests metrics

------------------------------------------
[...truncated 51.14 KB...]
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190628"
      }
    ]
  }, 
  "name": "performance-tests-bqio-write-python-batch-10gb0702141153", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "custom_source_step_input": {
          "metadata": {
            "estimated_size_bytes": {
              "@type": "http://schema.org/Integer", 
              "value": 10747904000
            }
          }, 
          "spec": {
            "@type": "CustomSourcesType", 
            "serialized_source": "<string of 356 bytes>"
          }
        }, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "SyntheticSource", 
            "type": "STRING", 
            "value": "apache_beam.testing.synthetic_pipeline.SyntheticSource"
          }
        ], 
        "format": "custom_source", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ProduceRows.out"
          }
        ], 
        "user_name": "ProduceRows"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "format_record"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Format.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "<string of 1068 bytes>", 
        "user_name": "Format"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "beam_performance", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "schema": "{\"fields\": [{\"type\": \"BYTES\", \"name\": \"data\", \"mode\": \"NULLABLE\"}]}", 
        "table": "bqio_write_10GB", 
        "user_name": "WriteToBigQuery/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-07-02T15:53:00.658851Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-07-02_08_52_59-13093630053970301780'
 location: u'us-central1'
 name: u'performance-tests-bqio-write-python-batch-10gb0702141153'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-07-02T15:53:00.658851Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-07-02_08_52_59-13093630053970301780]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-02_08_52_59-13093630053970301780?project=apache-beam-testing
root: INFO: Job 2019-07-02_08_52_59-13093630053970301780 is in state 
JOB_STATE_PENDING
root: INFO: 2019-07-02T15:53:02.589Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-07-02T15:53:02.993Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-07-02T15:53:03.592Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-07-02T15:53:03.643Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-07-02T15:53:03.693Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-07-02T15:53:03.736Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-07-02T15:53:03.795Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-07-02T15:53:03.829Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Format into ProduceRows
root: INFO: 2019-07-02T15:53:03.872Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/WriteToBigQuery/NativeWrite into Format
root: INFO: 2019-07-02T15:53:03.913Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-07-02T15:53:03.961Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-07-02T15:53:04.008Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-07-02T15:53:04.058Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-07-02T15:53:04.242Z: JOB_MESSAGE_DEBUG: Executing wait step 
start3
root: INFO: 2019-07-02T15:53:04.328Z: JOB_MESSAGE_BASIC: Executing operation 
ProduceRows+Format+WriteToBigQuery/WriteToBigQuery/NativeWrite
root: INFO: 2019-07-02T15:53:04.371Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-07-02T15:53:04.418Z: JOB_MESSAGE_BASIC: Starting 3 workers in 
us-central1-a...
root: INFO: Job 2019-07-02_08_52_59-13093630053970301780 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-07-02T15:54:09.639Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 3 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-07-02T15:54:40.314Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-07-02T15:54:40.355Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-07-02T16:01:17.899Z: JOB_MESSAGE_BASIC: Executing BigQuery 
import job "dataflow_job_14080375331294610306". You can check its status with 
the bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_14080375331294610306".
root: INFO: 2019-07-02T16:01:18.376Z: JOB_MESSAGE_BASIC: BigQuery import job 
"dataflow_job_14080375331294610306" failed. We will try again with a new import 
job "dataflow_job_14080375331294610306-A".
root: INFO: 2019-07-02T16:01:18.603Z: JOB_MESSAGE_BASIC: BigQuery import job 
"dataflow_job_14080375331294610306-A" failed. We will try again with a new 
import job "dataflow_job_14080375331294610306-B".
root: INFO: 2019-07-02T16:01:19.819Z: JOB_MESSAGE_BASIC: Finished operation 
ProduceRows+Format+WriteToBigQuery/WriteToBigQuery/NativeWrite
root: INFO: 2019-07-02T16:01:19.916Z: JOB_MESSAGE_DEBUG: Executing failure step 
failure2
root: INFO: 2019-07-02T16:01:19.955Z: JOB_MESSAGE_ERROR: Workflow failed. 
Causes: S01:ProduceRows+Format+WriteToBigQuery/WriteToBigQuery/NativeWrite 
failed., BigQuery import job "dataflow_job_14080375331294610306-B" failed., 
BigQuery job "dataflow_job_14080375331294610306-B" in project 
"apache-beam-testing" finished with error(s): errorResult: Already Exists: 
Table apache-beam-testing:beam_performance.bqio_write_10GB, error: Already 
Exists: Table apache-beam-testing:beam_performance.bqio_write_10GB
root: INFO: 2019-07-02T16:01:20.105Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-07-02T16:01:20.173Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2019-07-02T16:01:20.215Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-07-02T16:04:09.650Z: JOB_MESSAGE_DETAILED: Autoscaling: 
Resized worker pool from 3 to 0.
root: INFO: 2019-07-02T16:04:09.692Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-07-02T16:04:09.740Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2019-07-02_08_52_59-13093630053970301780 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
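The root cause is visible in step s3 of the job spec above: `"write_disposition": "WRITE_EMPTY"` combined with a `bqio_write_10GB` table left over from an earlier run. A minimal sketch of the disposition semantics involved, assuming the behavior documented for BigQuery load jobs; the function is illustrative, not Beam or BigQuery code, and it simplifies away the detail that `WRITE_EMPTY` only fails when the existing table is non-empty:

```python
def bigquery_load_outcome(table_exists, create_disposition, write_disposition):
    """Simplified model of BigQuery load-job disposition semantics.

    With WRITE_EMPTY, a load into an existing table fails with an
    "Already Exists" error, which is exactly the errorResult in the
    log above.
    """
    if not table_exists:
        if create_disposition == "CREATE_NEVER":
            raise RuntimeError("Not found: Table")
        return "table created, rows written"
    if write_disposition == "WRITE_EMPTY":
        raise RuntimeError("Already Exists: Table")
    if write_disposition == "WRITE_TRUNCATE":
        return "table contents replaced"
    return "rows appended"  # WRITE_APPEND
```

A likely fix, assuming the performance test is free to overwrite its target, would be to pass a truncating write disposition to the `WriteToBigQuery` transform, or to delete `apache-beam-testing:beam_performance.bqio_write_10GB` between runs.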

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_BiqQueryIO_Write_Performance_Test_Python_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 677.515s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_BiqQueryIO_Write_Performance_Test_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'>
 line: 49

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 3s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/x3n24vukwde5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
