See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/4517/display/redirect>

Changes:


------------------------------------------
[...truncated 37.50 MB...]

INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 735338687
}
message: "Control channel established."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:172"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 736046791
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:215"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 739040851
}
message: "Python sdk harness starting."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:152"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 743798971
}
message: "Creating insecure state channel for localhost:33205."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:807"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 744133710
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:814"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 747519493
}
message: "Creating client data channel for localhost:36583"
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:750"
thread: "Thread-14"

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
(((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-3222-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:root:Running 
(((((ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Impulse_15)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-3222-_16))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-DoOnce-Map-decode-_18))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x 
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x 
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running 
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running 
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running 
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1635078481
  nanos: 931499004
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), 
batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 262308120
}
message: "Renamed 1 shards in 0.33 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 304753303
}
message: "No more requests from control plane"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 304969549
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 305051565
}
message: "Closing all cached grpc data channels."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 305122137
}
message: "Closing all cached gRPC state handlers."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 305638551
}
message: "Done consuming work."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1635078482
  nanos: 305758953
}
message: "Python sdk harness exiting."
log_location: 
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed 
job in 11.939534187316895 seconds.
INFO:root:Successfully completed job in 11.939534187316895 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
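(For context: the fused "WriteToText/Write/WriteImpl/..." stages in the log above
are produced by the standard wordcount example writing its output with
WriteToText. A minimal sketch of such a pipeline follows; the paths and options
are placeholders, not the ones used by this job.)

# Minimal wordcount sketch (illustrative only; paths/options are placeholders).
import re

import apache_beam as beam
from apache_beam.io import ReadFromText, WriteToText
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # e.g. pass --runner=SparkRunner to target the portable Spark runner
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        counts = (
            p
            | 'Read' >> ReadFromText('/tmp/input.txt')
            | 'Split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
            | 'PairWithOne' >> beam.Map(lambda word: (word, 1))
            | 'GroupAndSum' >> beam.CombinePerKey(sum)
            | 'Format' >> beam.Map(lambda kv: '%s: %d' % kv)
        )
        # WriteToText expands into the WriteImpl stages seen in the log
        # (DoOnce, InitializeWrite, WriteBundles, Pair, GroupByKey, Extract,
        # PreFinalize, FinalizeWrite).
        counts | 'Write' >> WriteToText('/tmp/output')


if __name__ == '__main__':
    run()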

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at 
localhost:39169
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.6 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.6_sdk:2.35.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function pack_combiners at 0x7f365cf46c80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f365cf46d08> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function sort_stages at 0x7f365cf47488> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.35.0-SNAPSHOT.jar>'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-temp3luylypu/artifactsyw38vrob' '--job-port' '46567' 
'--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:46567.
WARNING:root:Waiting for grpc channel to be ready at localhost:46567.
INFO:apache_beam.utils.subprocess_server:b'21/10/24 12:28:16 INFO 
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService 
started on localhost:43833'
INFO:apache_beam.utils.subprocess_server:b'21/10/24 12:28:16 INFO 
org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService 
started on localhost:46567'
INFO:apache_beam.utils.subprocess_server:b'21/10/24 12:28:16 WARN 
org.apache.beam.runners.jobsubmission.JobServerDriver: Exception during job 
server creation'
INFO:apache_beam.utils.subprocess_server:b'java.io.IOException: Failed to bind 
to address 0.0.0.0/0.0.0.0:46567'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.netty.NettyServer.start(NettyServer.java:328)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerImpl.start(ServerImpl.java:179)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerImpl.start(ServerImpl.java:90)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.sdk.fn.server.ServerFactory$InetSocketAddressServerFactory.createServer(ServerFactory.java:162)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.sdk.fn.server.ServerFactory$InetSocketAddressServerFactory.create(ServerFactory.java:145)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.sdk.fn.server.GrpcFnServer.create(GrpcFnServer.java:110)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.runners.jobsubmission.JobServerDriver.createJobServer(JobServerDriver.java:235)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.runners.jobsubmission.JobServerDriver.run(JobServerDriver.java:173)'
INFO:apache_beam.utils.subprocess_server:b'\tat 
org.apache.beam.runners.spark.SparkJobServerDriver.main(SparkJobServerDriver.java:55)'
INFO:apache_beam.utils.subprocess_server:b'Caused by: 
org.apache.beam.vendor.grpc.v1p36p0.io.netty.channel.unix.Errors$NativeIoException:
 bind(..) failed: Address already in use'
INFO:apache_beam.utils.subprocess_server:b'21/10/24 12:28:16 INFO 
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingServer 
stopped on localhost:43833'
INFO:apache_beam.utils.subprocess_server:b'21/10/24 12:28:16 INFO 
org.apache.beam.runners.jobsubmission.JobServerDriver: Expansion stopped on 
localhost:46567'
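(The bind failure above is a port collision: the job server was started with a
fixed '--job-port' of 46567, but that port was already taken on the CI agent;
the log shows another listener, reported as the Java ExpansionService, already
up on localhost:46567 when the job server tried to bind. The usual ways around
this are to probe the port first or to let the OS pick an ephemeral one, as the
'0' values for --artifact-port and --expansion-port do. A small generic sketch,
not Beam's own port-handling code:)

# Sketch: avoiding fixed-port collisions (illustrative; not how Beam's job
# server allocates its ports).
import socket


def port_is_free(port, host='localhost'):
    """Return True if nothing is currently listening on (host, port)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) != 0


def pick_free_port(host='localhost'):
    """Ask the OS for an ephemeral port, the way a port option of 0 does."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind((host, 0))
        return s.getsockname()[1]


if __name__ == '__main__':
    print(port_is_free(46567), pick_free_port())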
WARNING:root:Waiting for grpc channel to be ready at localhost:46567.
ERROR:apache_beam.utils.subprocess_server:Starting job service with ['java', 
'-jar', 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.35.0-SNAPSHOT.jar>',
 '--spark-master-url', 'local[4]', '--artifacts-dir', 
'/tmp/beam-temp3luylypu/artifactsyw38vrob', '--job-port', '46567', 
'--artifact-port', '0', '--expansion-port', '0']
ERROR:apache_beam.utils.subprocess_server:Error bringing up service
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/utils/subprocess_server.py>", line 86, in start
    'Service failed to start up with error %s' % self._process.poll())
RuntimeError: Service failed to start up with error 0
Traceback (most recent call last):
  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 94, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 89, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 596, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 47, in run_pipeline
    return super().run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 438, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 317, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 81, in start
    self._endpoint = self._job_server.start()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 110, in start
    return self._server.start()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/utils/subprocess_server.py>", line 86, in start
    'Service failed to start up with error %s' % self._process.poll())
RuntimeError: Service failed to start up with error 0
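(The RuntimeError above is the generic "service did not come up" check: the
job-server subprocess exited before its gRPC endpoint became reachable, and its
return code, 0 here because it shut itself down cleanly after the bind failure,
is what gets reported as "error 0". A simplified sketch of that launch-and-wait
pattern, not Beam's actual subprocess_server implementation:)

# Simplified sketch of launching a subprocess and waiting for its gRPC
# endpoint to become ready (illustrative; not Beam's subprocess_server code).
import subprocess
import time

import grpc


def start_service(cmd, endpoint, timeout=60):
    process = subprocess.Popen(cmd)
    channel = grpc.insecure_channel(endpoint)
    deadline = time.time() + timeout
    while time.time() < deadline:
        # If the child has already exited, it will never serve the endpoint;
        # report its return code (0 for a clean exit, as in the log above).
        if process.poll() is not None:
            raise RuntimeError(
                'Service failed to start up with error %s' % process.poll())
        try:
            grpc.channel_ready_future(channel).result(timeout=1)
            return channel
        except grpc.FutureTimeoutError:
            continue
    raise RuntimeError('Timed out waiting for %s' % endpoint)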

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>'
 line: 225

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 41m 28s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/5z3l7gv6mxviu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
