See <https://builds.apache.org/job/beam_PostCommit_Python35/487/display/redirect?page=changes>
Changes: [kamil.wasilewski] [BEAM-8178] Switch to a new gradle task that builds python docker image
------------------------------------------
[...truncated 369.37 KB...]
"properties": { "custom_source_step_input": { "metadata": { "estimated_size_bytes": { "@type": "http://schema.org/Integer", "value": 100 } }, "spec": { "@type": "CustomSourcesType", "serialized_source": "eNp9kMFKw0AQhpMmpppWqx4ET17rJS8R8OKl4CUXWWY3E1y6aZzdTQ9CoPokXnwBQfBSfIX0HfQ9TGoa9OJpmPmGj/ln5Qm4B3GHjCPkkdWwMFmhcxMJjWCRmaLUAkMWb9ubn46cy0dyKxpMk1PHcZhBLUHJB0zZElSJhrxb8qfx2XDge8Zq3wMu/GBR5hy169LeL5Ji1pMBBfH5jgSf6+evl3UPPRrGFzs4rj82q/q9fq3fNk/9ik/7mATtRaJIUdPBn3DbmelKeAXGzrTMpZVLNHE7DClsco0qGieTVpKBUhzEvLMd/mObSTFX2FmOGsukomNueDJqRbawoJhpHkQn12nJo28YqXwE" } }, "display_data": [ { "key": "source", "label": "Read Source", "namespace": "apache_beam.io.iobase.Read", "shortValue": "_CreateSource", "type": "STRING", "value": "apache_beam.transforms.create_source._CreateSource" } ], "format": "custom_source", "output_info": [ { "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=", "component_encodings": [] }, { "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=", "component_encodings": [] } ], "is_pair_like": true }, { "@type": "kind:global_window" } ], "is_wrapper": true }, "output_name": "out", "user_name": "create/Read.out" } ], "user_name": "create/Read" } }, { "kind": "ParallelWrite", "name": "s2", "properties": { "create_disposition": "CREATE_IF_NEEDED", "dataset": "python_write_to_table_15686440961574", "display_data":
[], "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp", "component_encodings": [] }, { "@type": "kind:global_window" } ], "is_wrapper": true }, "format": "bigquery", "parallel_input": { "@type": "OutputReference", "output_name": "out", "step_name": "s1" }, "schema": "{\"fields\": [{\"name\": \"number\", \"mode\": \"NULLABLE\", \"type\": \"INTEGER\"}, {\"name\": \"str\", \"mode\": \"NULLABLE\", \"type\": \"STRING\"}]}", "table": "python_write_table", "user_name": "write/WriteToBigQuery/NativeWrite", "write_disposition": "WRITE_EMPTY" } } ], "type": "JOB_TYPE_BATCH" } root: INFO: Create job: <Job createTime: '2019-09-16T14:28:44.534856Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2019-09-16_07_28_43-9584088841369332293' location: 'us-central1' name: 'beamapp-jenkins-0916142817-091360' projectId: 'apache-beam-testing' stageStates: [] startTime: '2019-09-16T14:28:44.534856Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> root: INFO: Created job with id: [2019-09-16_07_28_43-9584088841369332293] root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_28_43-9584088841369332293?project=apache-beam-testing root: INFO: Job 2019-09-16_07_28_43-9584088841369332293 is in state JOB_STATE_RUNNING root: INFO: 2019-09-16T14:28:43.268Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-09-16_07_28_43-9584088841369332293. The number of workers will be between 1 and 1000. root: INFO: 2019-09-16T14:28:43.268Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-09-16_07_28_43-9584088841369332293. root: INFO: 2019-09-16T14:28:46.344Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account. 
root: INFO: 2019-09-16T14:28:47.100Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-09-16T14:28:48.006Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-09-16T14:28:48.043Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-09-16T14:28:48.076Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-09-16T14:28:48.111Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-09-16T14:28:48.160Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-09-16T14:28:48.192Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into create/Read
root: INFO: 2019-09-16T14:28:48.220Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-16T14:28:48.257Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-16T14:28:48.289Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-16T14:28:48.317Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-16T14:28:48.440Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-09-16T14:28:48.516Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-09-16T14:28:48.555Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-16T14:28:48.589Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-09-16T14:29:20.140Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-16T14:29:53.008Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-16T14:29:53.028Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
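A side note on the opaque `serialized_source` and coder payloads in the job graph above (the strings beginning `eNp...`): that prefix suggests base64-encoded zlib data, and the Python SDK's pickler does, to my understanding, produce roughly base64(zlib(pickle)) payloads, using dill rather than the stdlib pickle. A minimal sketch of that encoding under those assumptions, with stdlib `pickle` standing in for dill (`dumps`/`loads` here are illustrative helpers, not Beam's API):

```python
import base64
import pickle
import zlib

def dumps(obj):
    # Sketch (assumption): base64 over zlib over a pickle of the object,
    # as the SDK's internal pickler appears to do. Level 9 gives the
    # 0x78 0xda zlib header, which base64-encodes to the 'eN...' prefix
    # visible in the job graph's blobs.
    return base64.b64encode(zlib.compress(pickle.dumps(obj), 9))

def loads(payload):
    return pickle.loads(zlib.decompress(base64.b64decode(payload)))

blob = dumps({'estimated_size_bytes': 100})
assert blob[:2] == b'eN'
assert loads(blob) == {'estimated_size_bytes': 100}
```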
root: INFO: 2019-09-16T14:31:59.689Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-09-16T14:31:59.747Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_4806243847665945542". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_4806243847665945542".
root: INFO: 2019-09-16T14:32:00.344Z: JOB_MESSAGE_WARNING: S01:create/Read+write/WriteToBigQuery/NativeWrite failed.
root: INFO: 2019-09-16T14:32:00.431Z: JOB_MESSAGE_BASIC: Finished operation create/Read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-09-16T14:32:00.542Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-16T14:32:00.598Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-16T14:32:00.632Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-16T14:35:45.260Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
root: INFO: 2019-09-16T14:35:45.306Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-16T14:35:45.343Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-16_07_28_43-9584088841369332293 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_write_to_table_15686440961574 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_04_02-1221469477659938451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_20_35-15954842341401332260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_30_35-372802914622079505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_41_24-15890810096746621636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_51_08-1834603801540445360?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_03_58-11671365154048896394?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
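Each failing test above prints a "Worker logs" console URL. When triaging a run like this one, it can help to pull the location and job id back out of those URLs, e.g. to feed them to `gcloud` or the Dataflow API. A small sketch with a hypothetical helper (`parse_worker_logs_url` is not part of Beam or the console; the URL layout is taken from the links above):

```python
import re

# Hypothetical triage helper: extract region, job id, and project from the
# "Worker logs" console URLs printed in this log.
URL_RE = re.compile(
    r'jobsDetail/locations/(?P<location>[^/]+)/jobs/(?P<job_id>[^?]+)'
    r'\?project=(?P<project>.+)$')

def parse_worker_logs_url(url):
    m = URL_RE.search(url)
    return m.groupdict() if m else None

info = parse_worker_logs_url(
    'https://console.cloud.google.com/dataflow/jobsDetail/locations/'
    'us-central1/jobs/2019-09-16_07_28_43-9584088841369332293'
    '?project=apache-beam-testing')
assert info == {'location': 'us-central1',
                'job_id': '2019-09-16_07_28_43-9584088841369332293',
                'project': 'apache-beam-testing'}
```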
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_26_53-3355102677857287149?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_46_28-15943954786550019732?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_04_02-17486123305763773208?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_19_18-5709868248034428116?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_29_56-16654135669813247225?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_38_26-4869869991567138773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_03_57-3447886450872772463?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_24_13-13147149381676089964?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_28_43-9584088841369332293?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_36_23-6290770757767922941?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_45_42-5351113806088168765?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_03_57-6013583439855751453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_16_06-2256187872524816660?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_28_10-3870196212297825320?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_39_37-10252128859307455037?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_03_56-17886873744811574094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_12_53-657536677855800724?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_23_44-10745705514830542911?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_34_52-11505852715800741451?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_45_43-11252122260177931069?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_04_03-16900731905644631147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_14_41-1994340018237339097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_24_35-8335287080160931632?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_35_59-5878807046960838319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_46_06-16760130892559386105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_56_35-2222132295195165739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_03_59-12823144994214992482?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_15_25-17367065665891201897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_26_05-7730667919761957000?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_36_21-3960166801038114527?project=apache-beam-testing
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_45_47-6144841978697921771?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3826.404s

FAILED (SKIP=6, errors=3)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 40s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/thx52hlnbka52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org