See
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/823/display/redirect?page=changes>
Changes:
[sonam.ramchand] Enabled strict dependency on flink runner
[sonam.ramchand] Added new line
[sonam.ramchand] removed checker-qual
[sonam.ramchand] Removed flink-clients dep
[sonam.ramchand] set enableStrictDependencies:true
[sonam.ramchand] made flink-clients runtimeOnly
[noreply] Remove stray colon.
[zyichi] [BEAM-11679] Override PubsubUnboundedSource transform for dataflow
[zyichi] Populate PubsubMessage message id
[zyichi] Enable tests
[zyichi] Exclude failing FhirIO ITs
[zyichi] Include runner v2 IT tests in java post commit
[Ismaël Mejía] [BEAM-11697] Upgrade Flink runner to Flink versions 1.12.1 and 1.11.3
[shehzaad] strict dependency checking on sdks/io/amazon-web-services*
[shehzaad] move amazon-web-services* changes to another PR
[shehzaad] undo previous erroneous commit
[shehzaad] better use of variables
[randomstep] [BEAM-8725] bump snappy-java to 1.1.8.4
[nielm] Add BigDecimal support for SpannerIO
[noreply] [BEAM-11695] Remove translations.pack_combiners from default optimizers
[Kyle Weaver] [BEAM-11689] Add public.nexus.pentaho.org to offline repositories.
[Chamikara Madhusanka Jayalath] Moving to 2.29.0-SNAPSHOT on master branch.
[noreply] [BEAM-11531] Add pd.to_datetime, handle DeferredBase args in
[Andrew Pilloud] [BEAM-11165] Async ZetaSQL Calc
[noreply] [BEAM-11695] Combiner packing in Dataflow (#13763)
------------------------------------------
[...truncated 8.96 MB...]
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:136"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 540066719
}
message: "Creating state cache with size 0"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/statecache.py:174"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 540886640
}
message: "Creating insecure control channel for localhost:45309."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:186"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 574045658
}
message: "Control channel established."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:194"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 613609552
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:237"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 680694103
}
message: "Creating insecure state channel for localhost:40649."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:872"
thread: "Thread-11"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 681174516
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:879"
thread: "Thread-11"
INFO:root:severity: INFO
timestamp {
seconds: 1611793794
nanos: 698238611
}
message: "Creating client data channel for localhost:46425"
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:689"
thread: "Thread-11"
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteToText/Write/WriteImpl/DoOnce/Impulse_15)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2957>)_16))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/DoOnce/Map(decode)_18))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:root:Running (((((ref_AppliedPTransform_WriteToText/Write/WriteImpl/DoOnce/Impulse_15)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2957>)_16))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/DoOnce/Map(decode)_18))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/InitializeWrite_19))+(ref_PCollection_PCollection_10/Write))+(ref_PCollection_PCollection_11/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)_20))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/WriteBundles_21))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)_20))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/WriteBundles_21))+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText/Write/WriteImpl/FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 312618732
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:304"
thread: "Thread-11"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 357612133
}
message: "Renamed 1 shards in 0.04 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:352"
thread: "Thread-11"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 372107744
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:266"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 372356653
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:267"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 372503280
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:721"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 372651100
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:891"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 379261255
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:279"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1611793795
nanos: 379522562
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:162"
thread: "MainThread"
d655d1a701a7a7fce9a46b6c16684c0df5f91f302313f29ba12ba7f409895cca
INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 11.467022895812988 seconds.
INFO:root:Successfully completed job in 11.467022895812988 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:java-fn-execution:compileJava FAILED
> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:44689
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.8_sdk:2.29.0.dev
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 99, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 94, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py>", line 580, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py>", line 559, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 47, in run_pipeline
    return super(SparkRunner, self).run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 418, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 313, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 83, in start
    self._endpoint = self._job_server.start()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 108, in start
    cmd, endpoint = self.subprocess_cmd_and_endpoint()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 151, in subprocess_cmd_and_endpoint
    jar_path = self.local_jar(self.path_to_jar())
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 82, in path_to_jar
    raise ValueError(
ValueError: Unable to parse jar URL "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/job-server/build/libs/beam-runners-spark-job-server-2.29.0-SNAPSHOT.jar>". If using a full URL, make sure the scheme is specified. If using a local file path, make sure the file exists; you may have to first build the job server using `./gradlew runners:spark:job-server:shadowJar`.
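The validation that raises this ValueError can be sketched in Python as follows. This is a hypothetical illustration of the scheme-or-local-file check the error message describes, not Beam's actual spark_runner.py code; `validate_jar` is an invented helper name.

```python
# Hypothetical sketch of the jar-path check behind "Unable to parse jar URL".
# Assumption: a path is accepted if it carries an explicit URL scheme or
# exists on the local filesystem; anything else is rejected.
import os
from urllib.parse import urlparse

def validate_jar(path_or_url):
    parts = urlparse(path_or_url)
    if parts.scheme in ("http", "https", "file"):
        # A full URL with an explicit scheme is accepted as-is.
        return path_or_url
    if os.path.exists(path_or_url):
        # Otherwise it must be an existing local file, e.g. the jar produced
        # by `./gradlew runners:spark:job-server:shadowJar`.
        return path_or_url
    raise ValueError(
        "Unable to parse jar URL %r. If using a full URL, make sure the "
        "scheme is specified. If using a local file path, make sure the "
        "file exists." % path_or_url)

print(validate_jar("https://example.com/job-server.jar"))
```

In this build the upstream `:runners:java-fn-execution:compileJava` failure meant the job-server jar was never built, so the local-file branch could not succeed.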
> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:java-fn-execution:compileJava'.
> Failed to store cache entry for task ':runners:java-fn-execution:compileJava'
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 36m 3s
176 actionable tasks: 157 executed, 15 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
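The inotify warning refers to the Linux kernel's per-user file-watch limit. A minimal Python sketch for inspecting it, assuming a Linux host (the 524288 value in the comment is a common choice for build machines, not a value taken from this log):

```python
# Read the current inotify watch limit on Linux. Raising it requires root,
# e.g. `sudo sysctl fs.inotify.max_user_watches=524288` (assumed remedy;
# persist it via /etc/sysctl.conf to survive reboots).
with open("/proc/sys/fs/inotify/max_user_watches") as f:
    limit = int(f.read().strip())
print("current inotify watch limit:", limit)
```

When the limit is too low, Gradle falls back to not watching the file system, which slows incremental builds but does not by itself fail them.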
Publishing build scan...
https://gradle.com/s/aj4nc2hiz2nwm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]