See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/2463/display/redirect?page=changes>
Changes:

[noreply] [CdapIO] Add readme for CdapIO. Update readme for SparkReceiverIO.

[Moritz Mack] [Spark Dataset runner] Add @Experimental and reduce visibility where

------------------------------------------
[...truncated 2.20 MB...]
root_transform_ids: "s1"
root_transform_ids: "s3"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/05 15:19:58 Cross-compiling https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/sdks/go/test/integration/io/xlang/kafka/kafka_test.go as /tmp/worker-2-1670253598274607890
2022/12/05 15:19:59 Prepared job with id: go-testkafkaio_basicreadwrite-128_adde67d6-e90c-4e97-a73a-b49112475c25 and staging token: go-testkafkaio_basicreadwrite-128_adde67d6-e90c-4e97-a73a-b49112475c25
2022/12/05 15:19:59 Staged binary artifact with token: 
2022/12/05 15:19:59 Submitted job: go0testkafkaio0basicreadwrite0128-jenkins-1205151959-5ce0f506_42d1bb40-b77c-43e4-98d6-503e71ebfede
2022/12/05 15:19:59 Job state: STOPPED
2022/12/05 15:19:59 Job state: STARTING
2022/12/05 15:19:59 Job state: RUNNING
2022/12/05 15:20:21 Job state: DONE
2022/12/05 15:20:21 Warning: 6 errors during metrics processing: [
failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1" type:"beam:metrics:sum_int64:v1" payload:"\x01" labels:{key:"PCOLLECTION" value:"adVfSAAvuEExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1" type:"beam:metrics:distribution_int64:v1" payload:"\x01\xfd\"\xfd\"\xfd\"" labels:{key:"PCOLLECTION" value:"adVfSAAvuEExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1" type:"beam:metrics:sum_int64:v1" payload:"\x01" labels:{key:"PCOLLECTION" value:"adVfSAAvuEExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
failed to deduce Step from MonitoringInfo: urn:"beam:metric:element_count:v1" type:"beam:metrics:sum_int64:v1" payload:"\x01" labels:{key:"PCOLLECTION" value:"adVfSAAvuEExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/PairWithRestriction0"}
failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1" type:"beam:metrics:distribution_int64:v1" payload:"\x01\xf8\"\xf8\"\xf8\"" labels:{key:"PCOLLECTION" value:"adVfSAAvuEExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/SplitAndSize0"}
failed to deduce Step from MonitoringInfo: urn:"beam:metric:sampled_byte_size:v1" type:"beam:metrics:distribution_int64:v1" payload:"\x01\xf0\"\xf0\"\xf0\"" labels:{key:"PCOLLECTION" value:"adVfSAAvuEExternal/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output/PairWithRestriction0"}
]
--- PASS: TestKafkaIO_BasicReadWrite (37.16s)
PASS
ok      github.com/apache/beam/sdks/v2/go/test/integration/io/xlang/kafka       43.191s
$ cd ../..
$ exit
$ exit

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerJavaUsingPython

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest$CombineGloballyTest > test FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.DataframeTransformTest > testDataframeSum FAILED
    java.lang.RuntimeException at JobServicePipelineResult.java:176

org.apache.beam.sdk.extensions.python.transforms.PythonMapTest > testPythonMap FAILED
    java.lang.RuntimeException at PythonMapTest.java:41
        Caused by: java.lang.RuntimeException at PythonMapTest.java:41
            Caused by: java.io.IOException at PythonMapTest.java:41
    java.lang.NullPointerException at Preconditions.java:980

org.apache.beam.sdk.extensions.python.transforms.PythonMapTest > testPythonFlatMap FAILED
    java.lang.RuntimeException at PythonMapTest.java:54
        Caused by: java.lang.RuntimeException at PythonMapTest.java:54
    java.lang.NullPointerException at Preconditions.java:980

org.apache.beam.sdk.extensions.python.transforms.RunInferenceTransformTest > testRunInferenceWithKVs FAILED
    java.lang.RuntimeException at RunInferenceTransformTest.java:108
        Caused by: java.lang.RuntimeException at RunInferenceTransformTest.java:108
    java.lang.NullPointerException at Preconditions.java:980

org.apache.beam.sdk.extensions.python.transforms.RunInferenceTransformTest > testRunInference FAILED
    java.lang.RuntimeException at RunInferenceTransformTest.java:60
        Caused by: java.lang.RuntimeException at RunInferenceTransformTest.java:60
    java.lang.NullPointerException at Preconditions.java:980

16 tests completed, 6 failed, 2 skipped

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerJavaUsingPython FAILED

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingJava
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner
>>> --job_endpoint=localhost:35381 --environment_cache_millis=10000
>>> --experiments=beam_fn_api
>>> pytest options: 
>>> collect markers: -m=uses_java_expansion_service
============================= test session starts ==============================
platform linux -- Python 3.10.2, pytest-7.2.0, pluggy-1.0.0
rootdir: https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/sdks/python, configfile: pytest.ini
plugins: xdist-2.5.0, hypothesis-6.60.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.10.0
timeout: 600.0s
timeout method: signal
timeout func_only: False
----------------------------- live log collection ------------------------------
WARNING  apache_beam.runners.interactive.interactive_environment:interactive_environment.py:190 Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING  apache_beam.runners.interactive.interactive_environment:interactive_environment.py:199 You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
WARNING  root:avroio_test.py:54 python-snappy is not installed; some tests will be skipped.
WARNING  root:tfrecordio_test.py:55 Tensorflow is not installed, so skipping some tests.
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.10_sdk:2.45.0.dev
collected 6728 items / 6 errors / 6718 deselected / 5 skipped / 10 selected
!!!!!!!!!!!!!!!!!!! Interrupted: 6 errors during collection !!!!!!!!!!!!!!!!!!!!
OSError: [Errno 28] No space left on device

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/bin/pytest", line 8, in <module>
    sys.exit(console_main())
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 190, in console_main
    code = main()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 167, in main
    ret: Union[ExitCode, int] = config.hook.pytest_cmdline_main(
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_hooks.py", line 265, in __call__
    return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_manager.py", line 80, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 60, in _multicall
    return outcome.get_result()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_result.py", line 60, in get_result
    raise ex[1].with_traceback(ex[2])
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 39, in _multicall
    res = hook_impl.function(*args)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/main.py", line 317, in pytest_cmdline_main
    return wrap_session(config, _main)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/main.py", line 305, in wrap_session
    config.hook.pytest_sessionfinish(
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_hooks.py", line 265, in __call__
    return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_manager.py", line 80, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 55, in _multicall
    gen.send(outcome)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/terminal.py", line 808, in pytest_sessionfinish
    outcome.get_result()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_result.py", line 60, in get_result
    raise ex[1].with_traceback(ex[2])
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 39, in _multicall
    res = hook_impl.function(*args)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/junitxml.py", line 651, in pytest_sessionfinish
    with open(self.logfile, "w", encoding="utf-8") as logfile:
OSError: [Errno 28] No space left on device

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED
>>> RUNNING integration tests with pipeline options: --runner=PortableRunner
>>> --job_endpoint=localhost:35381 --environment_cache_millis=10000
>>> --experiments=beam_fn_api
>>> pytest options: 
>>> collect markers: -m=xlang_sql_expansion_service
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/bin/pytest", line 8, in <module>
    sys.exit(console_main())
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 190, in console_main
    code = main()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 148, in main
    config = _prepareconfig(args, plugins)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 329, in _prepareconfig
    config = pluginmanager.hook.pytest_cmdline_parse(
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_hooks.py", line 265, in __call__
    return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_manager.py", line 80, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 55, in _multicall
    gen.send(outcome)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/helpconfig.py", line 103, in pytest_cmdline_parse
    config: Config = outcome.get_result()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_result.py", line 60, in get_result
    raise ex[1].with_traceback(ex[2])
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 39, in _multicall
    res = hook_impl.function(*args)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 1058, in pytest_cmdline_parse
    self.parse(args)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 1346, in parse
    self._preparse(args, addopts=addopts)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/config/__init__.py", line 1248, in _preparse
    self.hook.pytest_load_initial_conftests(
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_hooks.py", line 265, in __call__
    return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_manager.py", line 80, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 60, in _multicall
    return outcome.get_result()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_result.py", line 60, in get_result
    raise ex[1].with_traceback(ex[2])
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/pluggy/_callers.py", line 34, in _multicall
    next(gen)  # first yield
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/capture.py", line 141, in pytest_load_initial_conftests
    capman.start_global_capturing()
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/capture.py", line 688, in start_global_capturing
    self._global_capturing = _get_multicapture(self._method)
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/capture.py", line 630, in _get_multicapture
    return MultiCapture(in_=FDCapture(0), out=FDCapture(1), err=FDCapture(2))
  File "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/_pytest/capture.py", line 388, in __init__
    TemporaryFile(buffering=0),
  File "/usr/lib/python3.10/tempfile.py", line 735, in TemporaryFile
    prefix, suffix, dir, output_type = _sanitize_params(prefix, suffix, dir)
  File "/usr/lib/python3.10/tempfile.py", line 265, in _sanitize_params
    dir = gettempdir()
  File "/usr/lib/python3.10/tempfile.py", line 438, in gettempdir
    return _os.fsdecode(_gettempdir())
  File "/usr/lib/python3.10/tempfile.py", line 431, in _gettempdir
    tempdir = _get_default_tempdir()
  File "/usr/lib/python3.10/tempfile.py", line 363, in _get_default_tempdir
    raise FileNotFoundError(_errno.ENOENT,
FileNotFoundError: [Errno 2] No usable temporary directory found in ['/tmp', '/var/tmp', '/usr/tmp', 'https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/sdks/python']

> Task :runners:spark:3:job-server:sparkJobServerCleanup
https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/sdks/python/scripts/run_job_server.sh: line 19: cannot create temp file for here-document: No space left on device
Stopping job server pid: 2135080.

> Task :runners:spark:3:job-server:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 2146436.
Stopping expansion service pid: 2146437.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: file://https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark3/ws/src/runners/spark/3/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:3:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 32m 17s
248 actionable tasks: 32 executed, 216 up-to-date

A build scan cannot be produced as an error occurred spooling the build data.
Please report this problem via https://gradle.com/help/plugin and include the following via copy/paste:

----------
Gradle version: 7.5.1
Plugin version: 3.4.1

java.lang.IllegalStateException: Could not close the event spooler due to previous errors.
    at com.gradle.scan.plugin.internal.f.c.c.a(SourceFile:128)
    at com.gradle.scan.plugin.internal.q.a$a.a(SourceFile:31)
    at com.gradle.scan.plugin.internal.q.a$a.a(SourceFile:20)
    at com.gradle.scan.plugin.internal.q.a.c(SourceFile:67)
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
