See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/3754/display/redirect>

Changes:


------------------------------------------
[...truncated 22.56 MB...]
  seconds: 1619849717
  nanos: 279535293
}
message: "Python sdk harness starting."
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:164"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 281322956
}
message: "Status HTTP server running at localhost:44985"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:81"
thread: "status-server-demon"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 285999774
}
message: "Creating insecure state channel for localhost:32845."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:875"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 286254167
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:882"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 287892818
}
message: "Creating client data channel for localhost:39921"
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:689"
thread: "Thread-14"

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 493162393
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), 
batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 600134134
}
message: "Renamed 1 shards in 0.11 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 610587596
}
message: "No more requests from control plane"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 610781908
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:270"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 610859870
}
message: "Closing all cached grpc data channels."
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:721"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 610934019
}
message: "Closing all cached gRPC state handlers."
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:894"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 611627817
}
message: "Done consuming work."
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:282"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1619849717
  nanos: 611784219
}
message: "Python sdk harness exiting."
log_location: 
"/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:167"
thread: "MainThread"

ae521edf44de74fc587be09af822527f2aaedd7c48415edaca351e4874050147
INFO:apache_beam.runners.portability.local_job_service:Successfully completed 
job in 6.48507285118103 seconds.
INFO:root:Successfully completed job in 6.48507285118103 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:direct:py37:postCommitIT
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_read_via_sql 
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
 ... ok
test_read_via_table 
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
 ... ok
test_spanner_error 
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
 ... ok
test_spanner_update 
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
 ... ok
test_write_batches 
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
 ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... 
ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) 
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) 
... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Test that schema update options are respected when appending to an existing ... 
ok
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types_avro 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: This test doesn't work on DirectRunner.

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 141.137s

OK (SKIP=1)

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at 
localhost:40499
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.31.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7fec89b79158> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function sort_stages at 0x7fec89b79840> ====================
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 94, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 89, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 582, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 561, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 43, in run_pipeline
    return super(SparkRunner, self).run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 437, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 317, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 81, in start
    self._endpoint = self._job_server.start()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 106, in start
    cmd, endpoint = self.subprocess_cmd_and_endpoint()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 150, in subprocess_cmd_and_endpoint
    jar_path = self.local_jar(self.path_to_jar())
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 79, in path_to_jar
    self._jar)
ValueError: Unable to parse jar URL "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar>". If using a full URL, make sure the scheme is specified. If using a local file path, make sure the file exists; you may have to first build the job server using `./gradlew runners:spark:2:job-server:shadowJar`.
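
The hint in the error message is the relevant part: the Spark job-server jar was never built in this workspace (the compileJava failure above is the likely cause), so the runner fell back to treating the workspace path as a URL. As a minimal local-repro sketch, not taken from this build, one could build the jar with the `./gradlew runners:spark:2:job-server:shadowJar` command mentioned in the error and then point the portable SparkRunner at it explicitly; the jar path and input text below are hypothetical.

    # Hypothetical sketch: tiny word count on the portable SparkRunner with an
    # explicitly supplied job-server jar (assumed to exist after running
    #   ./gradlew runners:spark:2:job-server:shadowJar
    # in the Beam source tree).
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=SparkRunner',
        # A plain filesystem path avoids the URL-parsing branch that raised
        # the ValueError above, provided the jar actually exists there.
        '--spark_job_server_jar=runners/spark/2/job-server/build/libs/'
        'beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar',
    ])

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))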

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:java-fn-execution:compileJava'.
> Failed to load cache entry for task ':runners:java-fn-execution:compileJava'

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 56s
173 actionable tasks: 136 executed, 33 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.
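
This is only a warning about Gradle's file-system watching, but it recurs on agents left at the default inotify limit. A small sketch, assuming a Linux build host (the 524288 target below is purely illustrative, not a value required by Beam or Gradle), for checking the current limit and printing the sysctl an administrator could use to raise it:

    # Check the inotify watch limit that Gradle's file-system watching hits.
    # Assumes Linux; 524288 is only an example target value.
    from pathlib import Path

    current = int(Path('/proc/sys/fs/inotify/max_user_watches').read_text())
    print(f'fs.inotify.max_user_watches = {current}')
    if current < 524288:
        print('Consider raising it, for example:')
        print('  sudo sysctl fs.inotify.max_user_watches=524288')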

Publishing build scan...
https://gradle.com/s/jls6wmbcfogiw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
