See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/2734/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Dataflow job


------------------------------------------
[...truncated 647.96 KB...]
INFO:root:starting state server on port 36003
INFO:root:starting logging server on port 45749
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created 
Worker handler 
<apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler
 object at 0x7f3966cb1400> for environment 
ref_Environment_default_environment_1 (beam:env:docker:v1, 
b'\n$apache/beam_python3.8_sdk:2.40.0.dev')
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Attempting 
to pull image apache/beam_python3.8_sdk:2.40.0.dev
E0607 14:24:57.164155752 1443240 fork_posix.cc:76]           Other threads are 
currently calling into gRPC, skipping fork() handlers
Error response from daemon: manifest for apache/beam_python3.8_sdk:2.40.0.dev 
not found
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Unable to 
pull image apache/beam_python3.8_sdk:2.40.0.dev, defaulting to local image if 
it exists
E0607 14:24:57.635605233 1443240 fork_posix.cc:76]           Other threads are 
currently calling into gRPC, skipping fork() handlers
E0607 14:24:57.911955128 1443240 fork_posix.cc:76]           Other threads are 
currently calling into gRPC, skipping fork() handlers
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Waiting for 
docker to start up. Current status is running
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Docker 
container is running. container_id = 
b'1debeeb6d5268f37f87ecaab3354e6e3a9a55d588d3a105bbc578f81a37e32b3', worker_id 
= worker_0
E0607 14:24:58.967800363 1443375 fork_posix.cc:76]           Other threads are 
currently calling into gRPC, skipping fork() handlers
E0607 14:25:04.023318894 1443375 fork_posix.cc:76]           Other threads are 
currently calling into gRPC, skipping fork() handlers
INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 856449842
}
message: "Logging handler created."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:84"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 859190464
}
message: "semi_persistent_directory: /tmp"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:107"
thread: "MainThread"

WARNING:root:severity: WARN
timestamp {
  seconds: 1654611905
  nanos: 863778114
}
message: "Discarding unparseable args: [\'--direct_runner_use_stacked_bundle\', 
\'--pipeline_type_check\']"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/options/pipeline_options.py:335"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 865828275
}
message: "Pipeline_options: {\'experiments\': [\'beam_fn_api\'], 
\'requirements_file\': \'/tmp/tmpa292aro7/requirements.txt\', 
\'save_main_session\': True, \'sdk_location\': \'container\', \'job_endpoint\': 
\'embed\', \'environment_type\': \'DOCKER\', \'sdk_worker_parallelism\': \'1\', 
\'environment_cache_millis\': \'0\'}"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:123"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 869563817
}
message: "Creating state cache with size 0"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/statecache.py:172"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 869934797
}
message: "Creating insecure control channel for localhost:36929."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:181"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 874671697
}
message: "Control channel established."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:189"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 875267028
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:232"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 877648115
}
message: "Python sdk harness starting."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:179"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 882445812
}
message: "Creating insecure state channel for localhost:36003."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:858"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 882743597
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:865"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 884356021
}
message: "Creating client data channel for localhost:43815"
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:772"
thread: "Thread-14"

E0607 14:25:05.947795819 1443240 fork_posix.cc:76]           Other threads are 
currently calling into gRPC, skipping fork() handlers
INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 947427511
}
message: "No more requests from control plane"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 947590589
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 947672605
}
message: "Closing all cached grpc data channels."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:805"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 947740316
}
message: "Closing all cached gRPC state handlers."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:877"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 947961091
}
message: "Done consuming work."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1654611905
  nanos: 948067903
}
message: "Python sdk harness exiting."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:181"
thread: "MainThread"

1debeeb6d5268f37f87ecaab3354e6e3a9a55d588d3a105bbc578f81a37e32b3
INFO:apache_beam.runners.portability.local_job_service:Completed job in 
9.69567084312439 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py38:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py38:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle>'
 line: 182

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:py38:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 9m 45s
217 actionable tasks: 147 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/ytpsezqmw7qgm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

