Build failed in Jenkins: beam_PostCommit_Py_ValCont #5286

2020-01-15 Thread Apache Jenkins Server
s_pipeline_test.ExerciseMetricsPipelineTest)
--
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 812, in run
    test(orig)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 133, in run
    self.runTest(result)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py", line 58, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py", line 39, in run_pipeline
    test_pipeline = TestPipeline(is_integration_test=True)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 108, in __init__
    super(TestPipeline, self).__init__(runner, options)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py", line 184, in __init__
    errors = PipelineOptionsValidator(self._options, runner).validate()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py", line 113, in validate
    errors.extend(self.options.view_as(cls).validate(self))
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py", line 591, in validate
    self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py", line 559, in _get_default_gcp_region
    raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py", line 85, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 568, in check_output
    output, unused_err = process.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 792, in communicate
    stdout = _eintr_retry_call(self.stdout.read)
  File "/usr/lib/python2.7/subprocess.py", line 476, in _eintr_retry_call
    return func(*args)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 904.303s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-081713
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:e8b44ef2cb5909c9fc244e617b8660a31289fc35a504b08f2213295d0018f868
Deleted: sha256:95225e18f6c9d21dba187608b301ee03fcb2efdeaa45e90007754a0be51d9008
Deleted: sha256:0c64cdc9864c1362ba8aed378f55941de5ba2318b4e4985d042ca5bed2efcec5
Deleted: sha256:a4f2ce140151770272af6c82d77af1c60a5e94644fe403f812d0103bada35122
Deleted: sha256:7f38fe446dea2f4bcb38fd6ccf25c14c1650ee5ce8b896562ff494592eeaf266
Deleted: sha256:2dfdd8de04b4234fafa29f32263e1e49b09cb3cc813193df7c01c14bae0b211c
Deleted: sha256:bf7d04dea7666e55baa9dd18912b386fbae15bd63f1c1d4dafe1b95115b152a4
Deleted: sha256:93a7382afa84
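
The timeout reported above fires while pipeline-option validation is still probing for a default GCP region: the frame at pipeline_options.py line 559 shells out from _get_default_gcp_region(), and nose's multiprocess plugin kills the test before the subprocess returns. The sketch below approximates that pattern and shows how supplying --region up front avoids the lookup entirely; the exact gcloud command is an assumption based on the frame names, not something this log confirms.

# Sketch only (assumed command, not confirmed by the log): a region probe of
# this shape can hang when gcloud is unconfigured inside the test container,
# which is what the nose multiprocess timeout then surfaces.
import subprocess

def guess_default_gcp_region():
    try:
        with open('/dev/null', 'wb') as devnull:
            raw = subprocess.check_output(
                ['gcloud', 'config', 'get-value', 'compute/region'],
                stderr=devnull)
        return raw.strip() or None
    except Exception:
        return None

# Passing a region explicitly in the integration-test options means the
# validate() frame above never needs to probe gcloud, e.g.:
#   --test-pipeline-options="--runner=TestDataflowRunner
#       --project=apache-beam-testing --region=us-central1 ..."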

beam_PostCommit_PortableJar_Flink - Build # 1113 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink 
(build #1113)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1113/ to view 
the results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #2995

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.
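
That change swaps nose's skip exception for the standard-library one, so skips keep working when tests run under plain unittest or pytest. A minimal sketch of the resulting pattern (the guarded module is illustrative, not taken from the commit):

# Minimal sketch of skipping with unittest.SkipTest instead of
# nose.plugins.skip.SkipTest. The optional dependency shown is illustrative.
import unittest

try:
    import hamcrest  # noqa: F401
except ImportError:
    hamcrest = None

class ExampleTest(unittest.TestCase):
    def test_needs_optional_dep(self):
        if hamcrest is None:
            # Recognized by unittest, pytest and nose alike.
            raise unittest.SkipTest('hamcrest is not installed')
        self.assertTrue(True)

if __name__ == '__main__':
    unittest.main()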


--
[...truncated 102 B...]
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e
Commit message: "Merge pull request #10571 [BEAM-3713] Use unittest.SkipTest 
instead of nose.plugins.skip.SkipTest."
 > git rev-list --no-walk 148cc71b1c3c6d54835c3b72d7842052fcf6c340 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status 
for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> 

beam_PostCommit_Py_VR_Dataflow - Build # 5514 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_Py_VR_Dataflow (build 
#5514)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5514/ to view the 
results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3814

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.


--
Started by GitHub push by tvalentyn
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e
Commit message: "Merge pull request #10571 [BEAM-3713] Use unittest.SkipTest 
instead of nose.plugins.skip.SkipTest."
 > git rev-list --no-walk 148cc71b1c3c6d54835c3b72d7842052fcf6c340 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> 

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1456

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.


--
[...truncated 5.14 MB...]
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 130: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in <lambda>
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 812, in process_bundle
data.transform_id].process_encoded(data.data)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 205, in process_encoded
self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 302, in 
apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
  File "apache_beam/runners/worker/operations.py", line 304, in 
apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", 

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #2996

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
[...truncated 792 B...]
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd
Commit message: "Merge pull request #10583: [BEAM-6008] Make sure to end stream 
only after sending all messages and state updates"
 > git rev-list --no-walk 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task 

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #285

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.


--
[...truncated 28.61 KB...]
Resolving github.com/hashicorp/hcl: 
commit='23c074d0eceb2b8a5bfdbb271ab780cde70f05a8', 
urls=[https://github.com/hashicorp/hcl.git, g...@github.com:hashicorp/hcl.git]
Resolving github.com/ianlancetaylor/demangle: 
commit='4883227f66371e02c4948937d3e2be1664d9be38', 
urls=[https://github.com/ianlancetaylor/demangle.git, 
g...@github.com:ianlancetaylor/demangle.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 

beam_PostCommit_PortableJar_Flink - Build # 1114 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink 
(build #1114)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1114/ to view 
the results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_PostCommit_Python2 #1452

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.


--
[...truncated 6.14 MB...]
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in 
loads
return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in 
_import_module
return getattr(__import__(module, None, None, [obj]), obj)
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",
 line 26, in <module>
from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, 
in <module>
from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", 
 line 7, in <module>
from hamcrest.library.object import *
  File 
"/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", 
 line 4, in <module>
from .hasproperty import has_properties, has_property
  File 
"/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py",
 line 174
),
^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-15T09:01:48.924Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", 
line 176, in execute
op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in 
apache_beam.runners.worker.operations.DoOperation.start
def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in 
apache_beam.runners.worker.operations.DoOperation.start
with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in 
apache_beam.runners.worker.operations.DoOperation.start
super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in 
apache_beam.runners.worker.operations.Operation.start
def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in 
apache_beam.runners.worker.operations.Operation.start
self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in 
apache_beam.runners.worker.operations.DoOperation.setup
with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in 
apache_beam.runners.worker.operations.DoOperation.setup
pickler.loads(self.spec.serialized_fn))
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 
290, in loads
return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in 
loads
return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in 
_import_module
return getattr(__import__(module, None, None, [obj]), obj)
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py",
 line 26, in <module>
from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, 
in <module>
from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", 
 line 7, in <module>
from hamcrest.library.object import *
  File 
"/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", 
 line 4, in <module>
from .hasproperty import has_properties, has_property
  File 
"/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py",
 line 174
),
^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-15T09:01:52.067Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 647, in do_work
work_executor.execute()
  File 
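
Both tracebacks above end in the same place: importing hamcrest's hasproperty module raises a SyntaxError under the Python 2.7 workers, which suggests the containers picked up a PyHamcrest release whose source is Python-3-only. One plausible mitigation, assuming the regression comes from PyHamcrest 1.10 dropping Python 2 support (an assumption, not stated in this log), is to constrain the dependency for Python 2 environments, e.g. in a setup.py sketch:

# Hypothetical setup.py fragment: keep Python 2 workers on a PyHamcrest
# release that still parses under 2.7, leave Python 3 unconstrained.
# The version bounds are an assumption, not taken from the Beam repository.
from setuptools import setup

setup(
    name='example-pipeline-deps',
    version='0.1',
    install_requires=[
        "pyhamcrest>=1.9,<1.10; python_version < '3'",
        "pyhamcrest>=1.9; python_version >= '3'",
    ],
)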

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink #6366

2020-01-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Py_ValCont #5287

2020-01-15 Thread Apache Jenkins Server
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 812, in run
    test(orig)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 133, in run
    self.runTest(result)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py", line 58, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py", line 39, in run_pipeline
    test_pipeline = TestPipeline(is_integration_test=True)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 108, in __init__
    super(TestPipeline, self).__init__(runner, options)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py", line 184, in __init__
    errors = PipelineOptionsValidator(self._options, runner).validate()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py", line 113, in validate
    errors.extend(self.options.view_as(cls).validate(self))
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py", line 591, in validate
    self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py", line 559, in _get_default_gcp_region
    raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py", line 85, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 567, in check_output
    process = Popen(stdout=PIPE, *popenargs, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 711, in __init__
    errread, errwrite)
  File "/usr/lib/python2.7/subprocess.py", line 1235, in _execute_child
    self.pid = os.fork()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 914.066s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-105830
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:a95b17e2d1374581553e7db255e4fbf89472a7410daedfd7c7ccdf5c1b7fca9f
Deleted: sha256:a7e2d78171b8b0c7ee356c818b6f3259d00686eb600ce64789adc2c92c996d36
Deleted: sha256:9f42e34ad5aeef4126230ed64d0b1fad6c4d29f5745cee4f91a3083ab7520cd2
Deleted: sha256:6e2cc5d4fc70d7f343c5756ecedd5760c78c9644b3d807320d1fc149570fc3fb
Deleted: sha256:ce67f7749464e45b175fc5fa102a56a6ae26b1f543792e281a7e2e65b5208b3e
Deleted: sha256:35f9795eb2f4b26d5e944ae508587c7b60382bc765ce54a8a74dcc39e638da22
Deleted: sha256:3b708cb8fdea7ad778676f3f49a88006a08d19ca522b2057961a4838de279cd3
Deleted: sha256:780123758ca56e6d49f22d28a79268bcd4764fd5b984d0396b747de70e31f3c1
Deleted: sha256:5b90338900163fef3410ad8e5a81ada157b840979fda505986b0739874a209b2
Deleted: sha256:fa9d88d9051c7c2c32e2f1bf7a703338f84486502a9a17c79d452ff61a66634b
Deleted: sha256:3d5172a0c0e29

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5288

2020-01-15 Thread Apache Jenkins Server
---
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 812, in run
    test(orig)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 133, in run
    self.runTest(result)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py", line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py", line 58, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py", line 39, in run_pipeline
    test_pipeline = TestPipeline(is_integration_test=True)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 108, in __init__
    super(TestPipeline, self).__init__(runner, options)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py", line 184, in __init__
    errors = PipelineOptionsValidator(self._options, runner).validate()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py", line 113, in validate
    errors.extend(self.options.view_as(cls).validate(self))
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py", line 591, in validate
    self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py", line 559, in _get_default_gcp_region
    raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py", line 85, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 568, in check_output
    output, unused_err = process.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 792, in communicate
    stdout = _eintr_retry_call(self.stdout.read)
  File "/usr/lib/python2.7/subprocess.py", line 476, in _eintr_retry_call
    return func(*args)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 904.178s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-120306
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:df28e12f52d43a5e13bddb2e8f952e9cc899f8e376db46e4c6c191aba67a21fb
Deleted: sha256:d2f092daee5d25936e0138bd640c158f18f7363cd70ce24511a93d4746d60e67
Deleted: sha256:68bfe170d05f2a7ef9b4936e52389774bfd602cd130520c55b495f19dfa2816c
Deleted: sha256:0b4689093d1437b2101a8aaaffb2f2fc8d13932d990643d56e21aecd1ebbbdc4
Deleted: sha256:bcb20b4a3b7154186275add3f58b7466bc6070ed537ac4927c934754bd4ba8b6
Deleted: sha256:d752aeed5f479c204da3c2feb7cd1b321f832a11aef99ea534c433fe8f4a
Deleted: sha256:6819584953bfc27cb8da1c803cec194d9f2a73be50271f5939fc2fd330a08c0e
Deleted: sha256:8e6fbdbcf9ef0213c1e4c269fd4505d59610c7c11962a8e9c7b05ec8347e4820
Deleted: sha256:c47583ee5a46

Build failed in Jenkins: beam_PreCommit_Java_Cron #2278

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
[...truncated 484.76 KB...]

> Task :runners:extensions-java:metrics:javadoc
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :runners:extensions-java:metrics:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :runners:local-java:test
> Task :runners:local-java:check
> Task :runners:local-java:build
> Task :sdks:java:core:packageTests
> Task :sdks:java:core:assemble
> Task :sdks:java:core:analyzeClassesDependencies SKIPPED
> Task :sdks:java:core:analyzeTestClassesDependencies SKIPPED
> Task :sdks:java:core:analyzeDependencies SKIPPED

> Task :runners:samza:test

38 tests completed, 2 failed

> Task :runners:samza:test FAILED
> Task :sdks:java:fn-execution:assemble
> Task :sdks:java:fn-execution:analyzeClassesDependencies SKIPPED
> Task :sdks:java:fn-execution:compileTestJava FROM-CACHE
> Task :sdks:java:fn-execution:testClasses UP-TO-DATE
> Task :sdks:java:fn-execution:analyzeTestClassesDependencies SKIPPED
> Task :sdks:java:fn-execution:analyzeDependencies SKIPPED
> Task :sdks:java:fn-execution:checkstyleMain
> Task :sdks:java:fn-execution:checkstyleTest
> Task :vendor:sdks-java-extensions-protobuf:test NO-SOURCE
> Task 
> :vendor:sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonProjectClasses
> Task :vendor:sdks-java-extensions-protobuf:check
> Task :vendor:sdks-java-extensions-protobuf:build
> Task :sdks:java:extensions:google-cloud-platform-core:assemble
> Task 
> :sdks:java:extensions:google-cloud-platform-core:analyzeClassesDependencies 
> SKIPPED
> Task 
> :sdks:java:extensions:google-cloud-platform-core:analyzeTestClassesDependencies
>  SKIPPED
> Task :sdks:java:extensions:google-cloud-platform-core:analyzeDependencies 
> SKIPPED
> Task :sdks:java:extensions:google-cloud-platform-core:checkstyleMain
> Task :sdks:java:extensions:google-cloud-platform-core:checkstyleTest
> Task :sdks:java:fn-execution:javadoc
> Task :runners:samza:job-server:startShadowScripts
> Task :sdks:java:extensions:google-cloud-platform-core:javadoc
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native
> Task :runners:samza:job-server:shadowDistTar
> Task :sdks:java:core:checkstyleMain

> Task :sdks:java:fn-execution:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :sdks:java:extensions:google-cloud-platform-core:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :runners:extensions-java:metrics:test
> Task :runners:extensions-java:metrics:check
> Task :runners:extensions-java:metrics:build
> Task :runners:extensions-java:metrics:buildDependents
> Task :sdks:java:extensions:protobuf:assemble
> Task :sdks:java:extensions:protobuf:analyzeClassesDependencies SKIPPED
> Task :sdks:java:extensions:protobuf:extractIncludeTestProto
> Task :sdks:java:extensions:protobuf:generateTestProto
> Task :sdks:java:extensions:protobuf:compileTestJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:testClasses
> Task :sdks:java:extensions:protobuf:analyzeTestClassesDependencies SKIPPED
> Task :sdks:java:extensions:protobuf:analyzeDependencies SKIPPED
> Task :sdks:java:extensions:protobuf:checkstyleMain
> Task :sdks:java:extensions:protobuf:checkstyleTest
> Task :sdks:java:io:elasticsearch-tests:elasticsearch-tests-5:check
> Task :sdks:java:io:elasticsearch-tests:elasticsearch-tests-5:build
> Task :sdks:java:io:elasticsearch-tests:elasticsearch-tests-5:buildDependents
> Task :sdks:java:io:common:analyzeTestClassesDependencies SKIPPED
> Task :sdks:java:io:common:analyzeDependencies SKIPPED
> Task :sdks:java:io:common:checkstyleTest
> Task :sdks:java:extensions:protobuf:javadoc
> Task :sdks:java:core:checkstyleTest
> Task :runners:samza:job-server:shadowDistZip
> Task :runners:samza:job-server:assemble
> Task :runners:samza:job-server:analyzeClassesDependencies SKIPPED
> Task :runners:samza:job-server:analyzeTestClassesDependencies SKIPPED
> Task :runners:samza:job-server:analyzeDependencies SKIPPED

Build failed in Jenkins: beam_PostCommit_Python2 #1453

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
[...truncated 3.62 MB...]
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished 
getting job metrics for 
BeamApp-jenkins-0115111602-d247d8fa_33b09041-c476-4c8c-8eba-b37c45723115
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:apache_beam.runners.portability.fn_api_runner_transforms:
  
WARNING:apache_beam.utils.subprocess_server:Starting service with ['java' 
'-jar' 
'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-templOnevZ/artifactsUExFbF' '--job-port' '56577' '--artifact-port' 
'0' '--expansion-port' '0']
20/01/15 11:16:22 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: 
ArtifactStagingService started on localhost:37265
20/01/15 11:16:22 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java 
ExpansionService started on localhost:35649
20/01/15 11:16:22 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService 
started on localhost:56577
WARNING:root:Waiting for grpc channel to be ready at localhost:56577.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2', '--shutdown_sources_on_final_watermark']
20/01/15 11:16:23 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job BeamApp-jenkins-0115111623-6ce5efad_adfaa441-bcd7-4672-a705-579461577ad1
20/01/15 11:16:24 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
BeamApp-jenkins-0115111623-6ce5efad_adfaa441-bcd7-4672-a705-579461577ad1
INFO:root:Waiting until the pipeline has finished because the environment 
"LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/01/15 11:16:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/01/15 11:16:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 1 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/01/15 11:16:32 INFO 
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand 
new Spark Context.
20/01/15 11:16:32 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
20/01/15 11:16:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
BeamApp-jenkins-0115111623-6ce5efad_adfaa441-bcd7-4672-a705-579461577ad1 on 
Spark master local[4]
20/01/15 11:16:33 INFO 
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated 
aggregators accumulator: 
20/01/15 11:16:33 INFO 
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics 
accumulator: MetricQueryResults()

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:37257.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/01/15 11:16:36 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:33107.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:40713
20/01/15 11:16:36 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/01/15 11:16:37 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either 

beam_PostCommit_PortableJar_Flink - Build # 1115 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink 
(build #1115)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1115/ to view 
the results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #215

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-6587] Remove hacks due to missing common string coder.

[kirillkozlov] Update data source for SQL performance tests

[chadrik] [BEAM-7746] Address changes in code since annotations were introduced

[chadrik] [BEAM-7746]  Typing fixes that require runtime code changes

[chadrik] [BEAM-7746] Avoid creating attributes dynamically, so that they can be

[chadrik] [BEAM-7746] Bugfix: coder id is expected to be str in python3

[chadrik] [BEAM-7746] Explicitly unpack tuple to avoid inferring unbounded tuple

[chadrik] [BEAM-7746] Generate files with protobuf urns as part of gen_protos

[chadrik] [BEAM-7746] Move name and coder to base StateSpec class

[chadrik] [BEAM-7746] Remove reference to missing attribute in

[chadrik] [BEAM-7746] Non-Optional arguments cannot default to None

[chadrik] [BEAM-7746] Avoid reusing variables with different data types

[chadrik] [BEAM-7746] Add StateHandler abstract base class

[chadrik] [BEAM-7746] Add TODO about fixing assignment to

[chadrik] [BEAM-7746] Fix functions that were defined twice

[chadrik] [BEAM-7746] Fix tests that have the same name

[iemejia] [BEAM-9040] Add skipQueries option to skip queries in a Nexmark suite

[iemejia] [BEAM-9040] Add Spark Structured Streaming Runner to Nexmark 
PostCommit

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and

[chamikara] Sets the correct coder when clustering is enabled for the

[robertwb] Always initalize output processor on construction.

[github] [Go SDK Doc] Update Dead Container Link (#10585)

[github] Merge pull request #10582 for [INFRA-19670] Add .asf.yaml for Github


--
[...truncated 274.62 KB...]
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:job_endpoint:v1"
value {
  string_value: "localhost:8099"
}
  }
  fields {
key: "beam:option:job_name:v1"
value {
  string_value: "load_tests_Python_Flink_Batch_GBK_3_0115100243"
}
  }
  fields {
key: "beam:option:job_port:v1"
value {
  string_value: "0"
}
  }
  fields {
key: "beam:option:job_server_timeout:v1"
value {
  string_value: "60"
}
  }
  fields {
key: "beam:option:load_balance_bundles:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:no_auth:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:object_reuse:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:parallelism:v1"
value {
  string_value: "5"
}
  }
  fields {
key: "beam:option:pipeline_type_check:v1"
value {
  bool_value: true
}
  }
  fields {
key: "beam:option:profile_cpu:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:profile_memory:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:profile_sample_rate:v1"
value {
  number_value: 1.0
}
  }
  fields {
key: "beam:option:project:v1"
value {
  string_value: "apache-beam-testing"
}
  }
  fields {
key: "beam:option:retain_docker_containers:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:runtime_type_check:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:save_main_session:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:sdk_location:v1"
value {
  string_value: "container"
}
  }
  fields {
key: "beam:option:sdk_worker_parallelism:v1"
value {
  string_value: "1"
}
  }
  fields {
key: "beam:option:shutdown_sources_on_final_watermark:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:spark_master_url:v1"
value {
  string_value: "local[4]"
}
  }
  fields {
key: "beam:option:spark_submit_uber_jar:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:streaming:v1"
value {
  bool_value: false
}
  }
  fields {
key: "beam:option:type_check_strictness:v1"
value {
  string_value: "DEFAULT_TO_ANY"
}
  }
  fields {
key: "beam:option:update:v1"
value {
  bool_value: false
}
  }
}
job_name: "job"

apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to 
RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job 
failed. (JobID: 5499c081e0d62b74686ae1fc0fabba12)
at 

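The beam:option:*:v1 fields dumped above are the serialized form of ordinary
Python PipelineOptions flags. A minimal sketch of that mapping, using a few
illustrative values rather than the exact load-test configuration:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative flags mirroring some of the beam:option:*:v1 fields above;
    # these are example values, not the CI configuration.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',  # beam:option:job_endpoint:v1
        '--parallelism=5',                # beam:option:parallelism:v1
        '--job_name=job',                 # beam:option:job_name:v1
    ])

    # get_all_options() yields the flat dict of options that the portable
    # runner serializes into the beam:option:*:v1 struct seen in the log.
    print(options.get_all_options())
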
Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #286

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
[...truncated 29.94 KB...]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: 
commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', 
urls=[https://github.com/spf13/afero.git, g...@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: 
commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', 
urls=[https://github.com/spf13/cast.git, g...@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: 
commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', 
urls=[https://github.com/spf13/cobra.git, g...@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: 
commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', 
urls=[https://github.com/spf13/jwalterweatherman.git, 
g...@github.com:spf13/jwalterweatherman.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/viper: 
commit='aafc9e6bc7b7bb53ddaa75a5ef49a17d6e654be5', 
urls=[https://github.com/spf13/viper.git, g...@github.com:spf13/viper.git]
Resolving github.com/stathat/go: 
commit='74669b9f388d9d788c97399a0824adbfee78400e', 
urls=[https://github.com/stathat/go.git, g...@github.com:stathat/go.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached 

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1457

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
[...truncated 5.11 MB...]
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 130: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 812, in process_bundle
data.transform_id].process_encoded(data.data)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 205, 

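The BeamAssertException above is raised by Beam's Python testing matchers:
assert_that/equal_to fail the pipeline when the produced PCollection does not
match the expected list. A self-contained sketch of that pattern (illustrative
only; the actual failing cross-language test is not shown in the log):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        formatted = (
            p
            | beam.Create([('a', 3), ('b', 1), ('c', 2)])
            | beam.MapTuple(lambda k, v: '%s: %d' % (k, v)))
        # assert_that adds the 'assert_that/Match' step seen above; a mismatch
        # surfaces as BeamAssertException listing the missing elements.
        assert_that(formatted, equal_to(['a: 3', 'b: 1', 'c: 2']))
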
Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Batch #2

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes 
from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes 
from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes 
from

[mzobii.baig] Beam-2535 : Replace timeStamp with outputTimeStamp

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] Beam-2535 : Pass outputTimestamp param in onTimer method

[mzobii.baig] Beam-2535 : Minor changed

[rehman.muradali] [BEAM-2535] : Add Commit State in ParDoEvaluator

[rehman.muradali] [BEAM-2535] : Add outputTimestamp in compare method, Revert

[mzobii.baig] Beam-2535 : Modifying default minimum target and GC time

[rehman.muradali] BEAM-2535 : Removal of extra lines

[mzobii.baig] Beam-2535 : Proposed changes

[mzobii.baig] Beam-2535 : Added original PR watermark hold functionality.

[rehman.muradali] [BEAM-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Variable renaming and added output timestamp in

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] [Beam-2535] Modify test case

[mzobii.baig] [Beam-2535] Added comments

[mzobii.baig] [Beam-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Set Processing Time with outputTimestamp

[mzobii.baig] [Beam-2535] Minor renaming

[rehman.muradali] [BEAM-2535] Revert Processing Time, Addition of 
OutputTimestamp

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Modify AggregateProjectMergeRule to have a condition

[ehudm] [BEAM-8269] Convert from_callable type hints to Beam types

[kirillkozlov] SpotlesApply

[kirillkozlov] Test for a query with a predicate

[kirillkozlov] A list of visited nodes should be unique per onMatch invocation

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Make sure all nodes are explored

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Max

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[kirillkozlov] Add a new Jenkins job for SQL perf tests

[kirillkozlov] Test boilerplate

[rehman.muradali] Adding OutputTimestamp in Timer Object

[rehman.muradali] Apply Spotless and checkstyle

[kirillkozlov] Table proxy to add TimeMonitor after the IO

[kirillkozlov] Tests for direct_read w/o push-down and default methods

[mzobii.baig] [Beam-2535] Added watermark functionality for the dataflow runner

[kenn] Use more informative assertions in some py tests

[mzobii.baig] [Beam-2535] Used boolean instead boxed type

[kirillkozlov] Cleanup

[dcavazos] [BEAM-7390] Add code snippet for Sum

[mzobii.baig] [Beam-2535] Modify required watermark hold functionality

[kirillkozlov] Monitor total number of fields read from an IO

[ehudm] Fix _get_args for typing.Tuple in Py3.5.2

[kcweaver] Add FlinkMiniClusterEntryPoint for testing the uber jar submission

[kcweaver] [BEAM-8512] Add integration tests for flink_runner.py.

[kcweaver] Build mini cluster jar excluding unwanted classes.

[kcweaver] Rename to testFlinkUberJarPyX.Y

[kcweaver] Increase timeout on beam_PostCommit_PortableJar_Flink.

[kamil.wasilewski] [BEAM-1440] Provide functions for waiting for BQ job and 
exporting

[kamil.wasilewski] [BEAM-1440] Create _BigQuerySource that implements 
iobase.BoundedSource

[kamil.wasilewski] [BEAM-1440] Reorganised BigQuery read IT tests

[kamil.wasilewski] [BEAM-1440] Create postCommitIT jobs running on Flink Runner

[kamil.wasilewski] [BEAM-1440] Convert strings to bytes on Python 3 if field 
type is BYTES

[kamil.wasilewski] [BEAM-1440]: Support RECORD fields in coder

[kamil.wasilewski] [BEAM-1440] Remove json files after reading

[kamil.wasilewski] [BEAM-1440] Marked classes as private

[kamil.wasilewski] [BEAM-1440] Do not force to create temp dataset when using 
dry run

[echauchot] [BEAM-5192] Migrate ElasticsearchIO to v7

[echauchot] [BEAM-5192] Minor change of ESIO public configuration API:

[robinyqiu] BeamZetaSqlCalcRel prototype

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[kamil.wasilewski] [BEAM-8671] Add Python 3.7 support for LoadTestBuilder

[kamil.wasilewski] [BEAM-8671] Add ParDo test running on Python 3.7

[ehudm] Fix cleanPython race with :clean

[robinyqiu] Fix bug in SingleRowScanConverter

[robinyqiu] Use BeamBigQuerySqlDialect

[boyuanz] [BEAM-8536] Migrate using requested_execution_time to

[pabloem] Initialize logging configuration in Pipeline object

[daniel.o.programmer] [BEAM-7970] Touch-up on Go protobuf generation 
instructions.

[kamil.wasilewski] [BEAM-8979] Remove mypy-protobuf dependency

[echauchot] [BEAM-5192] Fix missing ifs for ES7 specificities.

[echauchot] [BEAM-5192] Remove unneeded transitive dependencies, upgrade ES and

[echauchot] [BEAM-5192] Disable MockHttpTransport plugin to enabe http dialog 

Build failed in Jenkins: beam_PreCommit_Python2_PVR_Flink_Cron #417

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
[...truncated 32.83 MB...]
Make sure to call shutdown()/shutdownNow() and wait until 
awaitTermination() returns true.
java.lang.RuntimeException: ManagedChannel allocation site
at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.(ManagedChannelOrphanWrapper.java:94)
at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper.(ManagedChannelOrphanWrapper.java:52)
at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ManagedChannelOrphanWrapper.(ManagedChannelOrphanWrapper.java:43)
at 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:514)
at 
org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
at 
org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory.createEnvironment(ExternalEnvironmentFactory.java:112)
at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:200)
at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:184)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.(DefaultJobBundleFactory.java:331)
at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.(DefaultJobBundleFactory.java:320)
at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:250)
at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:38)
at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:198)
at 
org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.open(ExecutableStageDoFnOperator.java:196)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:532)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:396)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)

[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] 
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing 
heap keyed state backend with stream factory.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] 
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing 
heap keyed state backend with stream factory.
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:35359.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - 
Beam Fn Control client connected with id 31-1
[[1]Create/FlatMap() (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap() (2/2) (558fd2e5ee34795da9d729292beeb0a5) switched from RUNNING 
to FINISHED.
[[1]Create/FlatMap() (2/2)] INFO 

Build failed in Jenkins: beam_PostCommit_Python2 #1454

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 3.63 MB...]
> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:apache_beam.runners.portability.fn_api_runner_transforms:
  
WARNING:apache_beam.utils.subprocess_server:Starting service with ['java' 
'-jar' 
'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-temp8Du_43/artifactsmonF2i' '--job-port' '55307' '--artifact-port' 
'0' '--expansion-port' '0']
20/01/15 12:25:59 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: 
ArtifactStagingService started on localhost:40799
20/01/15 12:25:59 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java 
ExpansionService started on localhost:41195
20/01/15 12:25:59 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService 
started on localhost:55307
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2', '--shutdown_sources_on_final_watermark']
20/01/15 12:26:01 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job BeamApp-jenkins-0115122601-ed610cf8_5bdff474-f00a-4ab1-a31e-b6c759bcbf2f
20/01/15 12:26:01 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
BeamApp-jenkins-0115122601-ed610cf8_5bdff474-f00a-4ab1-a31e-b6c759bcbf2f
INFO:root:Waiting until the pipeline has finished because the environment 
"LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/01/15 12:26:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/01/15 12:26:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 1 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/01/15 12:26:02 INFO 
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand 
new Spark Context.
20/01/15 12:26:03 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
20/01/15 12:26:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
BeamApp-jenkins-0115122601-ed610cf8_5bdff474-f00a-4ab1-a31e-b6c759bcbf2f on 
Spark master local[4]
20/01/15 12:26:03 INFO 
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated 
aggregators accumulator: 
20/01/15 12:26:03 INFO 
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics 
accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:42521.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/01/15 12:26:06 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:38813.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:40361
20/01/15 12:26:06 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/01/15 12:26:07 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path 
matching: -*-of-%(num_shards)05d
20/01/15 12:26:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-0115122601-ed610cf8_5bdff474-f00a-4ab1-a31e-b6c759bcbf2f: 
Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with 
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
20/01/15 

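The portableWordCountSparkRunnerBatch run above submits a Python pipeline to
the local JobService and, because the environment is LOOPBACK, executes the
SDK harness inside the submitting process. A hedged sketch of such a
submission (the endpoint and output path are placeholders taken loosely from
the log, not the exact CI invocation):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:55307',  # JobService port reported in the log
        '--environment_type=LOOPBACK',     # run the SDK harness in-process
    ])
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.MapTuple(lambda word, n: '%s: %d' % (word, n))
         | beam.io.WriteToText('/tmp/counts'))  # placeholder output path
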
beam_PostCommit_PortableJar_Flink - Build # 1116 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink 
(build #1116)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1116/ to view 
the results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #2997

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 579 B...]
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd
Commit message: "Merge pull request #10583: [BEAM-6008] Make sure to end stream 
only after sending all messages and state updates"
 > git rev-list --no-walk 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #5515

2020-01-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3815

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
Started by GitHub push by mxm
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd
Commit message: "Merge pull request #10583: [BEAM-6008] Make sure to end stream 
only after sending all messages and state updates"
 > git rev-list --no-walk 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :runners:flink:1.9:copySourceOverrides
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:fn-execution:processResources
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :sdks:java:build-tools:processResources
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:processResources
> Task 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #860

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Sum

[ehudm] Light cleanup of opcodes.py

[apilloud] [BEAM-8630] Use column numbers for BeamZetaSqlCalRel

[pawel.pasterz] [BEAM-7115] Fix metrics being incorrectly gathered

[mxm] Remove incorrectly tagged test annotation from test case

[mxm] [BEAM-6008] Propagate errors during pipeline execution in Java's

[github] Tighten language and remove distracting link

[pabloem] [BEAM-7390] Add code snippet for Top (#10179)

[bhulette] [BEAM-8993] [SQL] MongoDB predicate push down. (#10417)

[lukecwik] [BEAM-8740] Remove unused dependency from Spark runner (#10564)

[robertwb] [BEAM-6587] Remove hacks due to missing common string coder.

[kirillkozlov] Update data source for SQL performance tests

[github] [BEAM-5605] Add support for channel splitting to the gRPC read "source"

[github] [BEAM-5605] Add support for additional parameters to SplittableDofn

[chadrik] [BEAM-7746] Address changes in code since annotations were introduced

[chadrik] [BEAM-7746]  Typing fixes that require runtime code changes

[chadrik] [BEAM-7746] Avoid creating attributes dynamically, so that they can be

[chadrik] [BEAM-7746] Bugfix: coder id is expected to be str in python3

[chadrik] [BEAM-7746] Explicitly unpack tuple to avoid inferring unbounded tuple

[chadrik] [BEAM-7746] Generate files with protobuf urns as part of gen_protos

[chadrik] [BEAM-7746] Move name and coder to base StateSpec class

[chadrik] [BEAM-7746] Remove reference to missing attribute in

[chadrik] [BEAM-7746] Non-Optional arguments cannot default to None

[chadrik] [BEAM-7746] Avoid reusing variables with different data types

[chadrik] [BEAM-7746] Add StateHandler abstract base class

[chadrik] [BEAM-7746] Add TODO about fixing assignment to

[chadrik] [BEAM-7746] Fix functions that were defined twice

[chadrik] [BEAM-7746] Fix tests that have the same name

[iemejia] [BEAM-9040] Add skipQueries option to skip queries in a Nexmark suite

[iemejia] [BEAM-9040] Add Spark Structured Streaming Runner to Nexmark 
PostCommit

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and

[chamikara] Sets the correct coder when clustering is enabled for the

[robertwb] Always initalize output processor on construction.

[github] [Go SDK Doc] Update Dead Container Link (#10585)

[github] Merge pull request #10582 for [INFRA-19670] Add .asf.yaml for Github


--
[...truncated 3.52 MB...]
Jan 15, 2020 11:32:18 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 192 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Jan 15, 2020 11:32:19 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/tmp/groovy-generated-8380199067339830793-tmpdir/word-count-beam/target/classes 
to 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/tmp/staging/classes-cSjqU4nhVsBgMeLK9UZgfg.jar
Jan 15, 2020 11:32:20 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 191 files cached, 1 files newly uploaded in 1 
seconds
Jan 15, 2020 11:32:20 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ReadLines/Read as step s1
Jan 15, 2020 11:32:21 AM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern gs://apache-beam-samples/shakespeare/* matched 44 files with 
total size 5443510
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WordCount.CountWords/ParDo(ExtractWords) as step s2
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WordCount.CountWords/Count.PerElement/Init/Map as step s3
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey as step 
s4
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
 as step s5
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding MapElements/Map as step s6
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign as step s7
Jan 15, 2020 11:32:21 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3816

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd
Commit message: "Merge pull request #10583: [BEAM-6008] Make sure to end stream 
only after sending all messages and state updates"
 > git rev-list --no-walk 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :sdks:java:build-tools:processResources
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task 

Build failed in Jenkins: beam_sonarqube_report #1275

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Switch to unittest.SkipTest instead of using nose.

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # 
 > timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd
Commit message: "Merge pull request #10583: [BEAM-6008] Make sure to end stream 
only after sending all messages and state updates"
 > git rev-list --no-walk 148cc71b1c3c6d54835c3b72d7842052fcf6c340 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
ERROR: SonarQube installation defined in this job (ASF Sonar Analysis) does not 
match any configured installation. Number of installations that can be 
configured: 0.
If you want to reassign jobs to a different SonarQube installation, check the 
documentation under https://redirect.sonarsource.com/plugins/jenkins.html

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_LoadTests_Java_Combine_Portable_Flink_Streaming #2

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes 
from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes 
from

[mzobii.baig] Beam-2535 : Pursue pull request 4700 with manual apply changes 
from

[mzobii.baig] Beam-2535 : Replace timeStamp with outputTimeStamp

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] Beam-2535 : Pass outputTimestamp param in onTimer method

[mzobii.baig] Beam-2535 : Minor changed

[rehman.muradali] [BEAM-2535] : Add Commit State in ParDoEvaluator

[rehman.muradali] [BEAM-2535] : Add outputTimestamp in compare method, Revert

[mzobii.baig] Beam-2535 : Modifying default minimum target and GC time

[rehman.muradali] BEAM-2535 : Removal of extra lines

[mzobii.baig] Beam-2535 : Proposed changes

[mzobii.baig] Beam-2535 : Added original PR watermark hold functionality.

[rehman.muradali] [BEAM-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Variable renaming and added output timestamp in

[mzobii.baig] Beam-2535 : Apply Spotless

[mzobii.baig] [Beam-2535] Modify test case

[mzobii.baig] [Beam-2535] Added comments

[mzobii.baig] [Beam-2535] Apply Spotless

[mzobii.baig] [Beam-2535] Set Processing Time with outputTimestamp

[mzobii.baig] [Beam-2535] Minor renaming

[rehman.muradali] [BEAM-2535] Revert Processing Time, Addition of 
OutputTimestamp

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Modify AggregateProjectMergeRule to have a condition

[ehudm] [BEAM-8269] Convert from_callable type hints to Beam types

[kirillkozlov] SpotlesApply

[kirillkozlov] Test for a query with a predicate

[kirillkozlov] A list of visited nodes should be unique per onMatch invocation

[rehman.muradali] [BEAM-2535] Revert TimerReceiver outputTimestamp

[kirillkozlov] Make sure all nodes are explored

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Max

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[rehman.muradali] [BEAM-2535] Making OnTimer compatible

[kirillkozlov] Add a new Jenkins job for SQL perf tests

[kirillkozlov] Test boilerplate

[rehman.muradali] Adding OutputTimestamp in Timer Object

[rehman.muradali] Apply Spotless and checkstyle

[kirillkozlov] Table proxy to add TimeMonitor after the IO

[kirillkozlov] Tests for direct_read w/o push-down and default methods

[mzobii.baig] [Beam-2535] Added watermark functionality for the dataflow runner

[kenn] Use more informative assertions in some py tests

[mzobii.baig] [Beam-2535] Used boolean instead boxed type

[kirillkozlov] Cleanup

[dcavazos] [BEAM-7390] Add code snippet for Sum

[mzobii.baig] [Beam-2535] Modify required watermark hold functionality

[kirillkozlov] Monitor total number of fields read from an IO

[ehudm] Fix _get_args for typing.Tuple in Py3.5.2

[kcweaver] Add FlinkMiniClusterEntryPoint for testing the uber jar submission

[kcweaver] [BEAM-8512] Add integration tests for flink_runner.py.

[kcweaver] Build mini cluster jar excluding unwanted classes.

[kcweaver] Rename to testFlinkUberJarPyX.Y

[kcweaver] Increase timeout on beam_PostCommit_PortableJar_Flink.

[kamil.wasilewski] [BEAM-1440] Provide functions for waiting for BQ job and 
exporting

[kamil.wasilewski] [BEAM-1440] Create _BigQuerySource that implements 
iobase.BoundedSource

[kamil.wasilewski] [BEAM-1440] Reorganised BigQuery read IT tests

[kamil.wasilewski] [BEAM-1440] Create postCommitIT jobs running on Flink Runner

[kamil.wasilewski] [BEAM-1440] Convert strings to bytes on Python 3 if field 
type is BYTES

[kamil.wasilewski] [BEAM-1440]: Support RECORD fields in coder

[kamil.wasilewski] [BEAM-1440] Remove json files after reading

[kamil.wasilewski] [BEAM-1440] Marked classes as private

[kamil.wasilewski] [BEAM-1440] Do not force to create temp dataset when using 
dry run

[echauchot] [BEAM-5192] Migrate ElasticsearchIO to v7

[echauchot] [BEAM-5192] Minor change of ESIO public configuration API:

[robinyqiu] BeamZetaSqlCalcRel prototype

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[kamil.wasilewski] [BEAM-8671] Add Python 3.7 support for LoadTestBuilder

[kamil.wasilewski] [BEAM-8671] Add ParDo test running on Python 3.7

[ehudm] Fix cleanPython race with :clean

[robinyqiu] Fix bug in SingleRowScanConverter

[robinyqiu] Use BeamBigQuerySqlDialect

[boyuanz] [BEAM-8536] Migrate using requested_execution_time to

[pabloem] Initialize logging configuration in Pipeline object

[daniel.o.programmer] [BEAM-7970] Touch-up on Go protobuf generation 
instructions.

[kamil.wasilewski] [BEAM-8979] Remove mypy-protobuf dependency

[echauchot] [BEAM-5192] Fix missing ifs for ES7 specificities.

[echauchot] [BEAM-5192] Remove unneeded transitive dependencies, upgrade ES and

[echauchot] [BEAM-5192] Disable MockHttpTransport plugin to enabe http 

Jenkins build is back to normal : beam_PostCommit_Python35 #1464

2020-01-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python_MongoDBIO_IT #1811

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 24.30 KB...]
Requirement already satisfied: py>=1.5.0 in 

 (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.1)
Collecting wcwidth
  Using cached 
https://files.pythonhosted.org/packages/58/b4/4850a0ccc6f567cc0ebe7060d20ffd4258b8210efadc259da62dc6ed9c65/wcwidth-0.1.8-py2.py3-none-any.whl
Collecting pathlib2>=2.2.0; python_version < "3.6"
  Using cached 
https://files.pythonhosted.org/packages/e9/45/9c82d3666af4ef9f221cbb954e1d77ddbb513faf552aea6df5f37f1a4859/pathlib2-2.3.5-py2.py3-none-any.whl
Requirement already satisfied: pluggy<1.0,>=0.12 in 

 (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" 
in 

 (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.4.0)
Collecting atomicwrites>=1.0
  Using cached 
https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Collecting packaging
  Using cached 
https://files.pythonhosted.org/packages/d8/5b/3098db49a61ccc8583ffead6aedc226f08ff56dc03106b6ec54451e27a30/packaging-20.0-py2.py3-none-any.whl
Collecting pytest-forked
  Using cached 
https://files.pythonhosted.org/packages/03/1e/81235e1fcfed57a4e679d34794d60c01a1e9a29ef5b9844d797716111d80/pytest_forked-1.1.3-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached 
https://files.pythonhosted.org/packages/d3/2e/c63af07fa471e0a02d05793c7a56a9f7d274a8489442a5dc4fb3b2b3c705/execnet-1.7.1-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5
  Using cached 
https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2
  Using cached 
https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached 
https://files.pythonhosted.org/packages/b4/40/a9837291310ee1ccc242ceb6ebfd9eb21539649f193a7c8c86ba15b98539/urllib3-1.25.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached 
https://files.pythonhosted.org/packages/b9/63/df50cac98ea0d5b006c55a399c3bf1db9da7b5a24de7890bc9cfd5dd9e99/certifi-2019.11.28-py2.py3-none-any.whl
Requirement already satisfied: zipp>=0.5 in 

 (from importlib-metadata>=0.12; python_version < 
"3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.0.0)
Collecting apipkg>=1.4
  Using cached 
https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, docopt, idna, chardet, 
urllib3, certifi, requests, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, 
pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, 
typing, typing-extensions, avro-python3, pyarrow, freezegun, nose, 
nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, 
tenacity, attrs, wcwidth, pathlib2, atomicwrites, packaging, pytest, 
pytest-forked, apipkg, execnet, pytest-xdist, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 
avro-python3-1.9.1 certifi-2019.11.28 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 
docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 freezegun-0.3.13 hdfs-2.5.8 
httplib2-0.12.0 idna-2.8 mock-2.0.0 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.18.1 
oauth2client-3.0.0 packaging-20.0 pandas-0.24.2 parameterized-0.7.1 
pathlib2-2.3.5 pbr-5.4.4 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 
pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.10.1 pyparsing-2.4.6 pytest-4.6.9 
pytest-forked-1.1.3 pytest-xdist-1.31.0 python-dateutil-2.8.1 pytz-2019.3 
pyyaml-5.3 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 
typing-3.7.4.1 typing-extensions-3.7.4.1 urllib3-1.25.7 wcwidth-0.1.8
Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 174, in _run_module_as_main
mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/usr/lib/python3.5/runpy.py", line 109, in _get_module_details

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5289

2020-01-15 Thread Apache Jenkins Server
mit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 812, in run
test(orig)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 133, in run
self.runTest(result)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 58, in test_metrics_fnapi_it
result = self.run_pipeline(experiment='beam_fn_api')
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 39, in run_pipeline
test_pipeline = TestPipeline(is_integration_test=True)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py;,>
 line 108, in __init__
super(TestPipeline, self).__init__(runner, options)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py;,>
 line 184, in __init__
errors = PipelineOptionsValidator(self._options, runner).validate()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py;,>
 line 113, in validate
errors.extend(self.options.view_as(cls).validate(self))
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 591, in validate
self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 559, in _get_default_gcp_region
raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py;,>
 line 85, in check_output
out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 568, in check_output
output, unused_err = process.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 792, in communicate
stdout = _eintr_retry_call(self.stdout.read)
  File "/usr/lib/python2.7/subprocess.py", line 476, in _eintr_retry_call
return func(*args)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'
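
The timeout above fires while pipeline-option validation is blocked inside the subprocess launched by _get_default_gcp_region(), which Beam calls to fill in a default --region when the test supplies none (the exact command it shells out to is not shown in this log). Supplying --region explicitly avoids that code path entirely; a minimal sketch with placeholder project and region values, not the Jenkins job's actual configuration:

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Placeholder values; with an explicit --region, GoogleCloudOptions.validate()
    # never has to shell out to discover a default region.
    options = PipelineOptions(['--project=my-project', '--region=us-central1'])
    assert options.view_as(GoogleCloudOptions).region == 'us-central1'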

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 904.030s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-161727
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:30a26a1daecdafd96c099a326afbac7bb301c3e20c9e0697f375903e326161f6
Deleted: sha256:f6c2ef0a3e8c15ecc40de66e8d70fb4ad9ef2f58d6076a695b762d63fe672b8a
Deleted: sha256:c16aa017700dde422f6530a3787d316230175aed01acd86ec43651f41a3dcc2f
Deleted: sha256:d7b070e3c754ba4e1b9441677cedc938c1265b2aa626218c70f3175b0f5d1ed4
Deleted: sha256:5345d27255c231e6e644ef5cb8ad8bc328084f6548410bf2cceabfe941176743
Deleted: sha256:923e7bfb2d00c6069d460e5809e70eb9c28f923a295686f5ceb15f7eeaba9eac
Deleted: sha256:11350394606648dd4b421a917198465b8e2a24d5cff7bfbd6db02a746373a7b8
Deleted: sha256:25d0c81b8887d602f9ae96d33beeb72f52cc3c9697af7ad4594a87174c50bca2
Deleted: sha256:3940256f2ee61cbc47b266bb86fccfef54b0f7d1fe78986969813426b76e6587
Deleted: sha256:9ed59e954b0a7ae377fdf17ea934b3906da2dd255eae57845a872f42cd420acd
Deleted: sha2

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3818

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2b07e0efb5db918d462873dcdd0055285fe7bf7a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2b07e0efb5db918d462873dcdd0055285fe7bf7a
Commit message: "Merge pull request #10581: [website] Add security page and 
update 2.17.0 release blog post to reference it"
 > git rev-list --no-walk 2b07e0efb5db918d462873dcdd0055285fe7bf7a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :model:fn-execution:processResources
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:build-tools:jar
> Task 

Jenkins build is back to normal : beam_PostCommit_Python_MongoDBIO_IT #1812

2020-01-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming #474

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_XVR_Flink #1459

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 4.62 MB...]
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 130: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 812, in process_bundle
data.transform_id].process_encoded(data.data)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 205, in process_encoded
self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 302, in 
apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
  File "apache_beam/runners/worker/operations.py", line 304, in 
apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 

Jenkins build is back to normal : beam_PostCommit_Python37 #1367

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #288

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 28.65 KB...]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5290

2020-01-15 Thread Apache Jenkins Server
 "/usr/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 58, in test_metrics_fnapi_it
result = self.run_pipeline(experiment='beam_fn_api')
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 39, in run_pipeline
test_pipeline = TestPipeline(is_integration_test=True)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py;,>
 line 108, in __init__
super(TestPipeline, self).__init__(runner, options)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py;,>
 line 184, in __init__
errors = PipelineOptionsValidator(self._options, runner).validate()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py;,>
 line 113, in validate
errors.extend(self.options.view_as(cls).validate(self))
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 590, in validate
if self.view_as(GoogleCloudOptions).region is None:
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 328, in view_as
view = cls(self._flags)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 205, in __init__
cls._add_argparse_args(parser)  # type: ignore
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 465, in _add_argparse_args
('The URL for the Dataflow API. If not set, the default public URL '
  File "/usr/lib/python2.7/argparse.py", line 1291, in add_argument
action_class = self._pop_action_class(kwargs)
  File "/usr/lib/python2.7/argparse.py", line 1437, in _pop_action_class
return self._registry_get('action', action, action)
  File "/usr/lib/python2.7/argparse.py", line 1239, in _registry_get
return self._registries[registry_name].get(value, default)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'
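
The last frames are nose's multiprocess plugin timing the test out: a SIGALRM handler raises TimedOutException in whatever frame happens to be executing, which is why it surfaces here from inside argparse and, in #5289 above, from a subprocess read. A minimal sketch of that mechanism (the shape only, not nose's plugin code; the 5-second timeout and the sleep are placeholders):

    import signal
    import time

    class TimedOutException(Exception):
        pass

    def signalhandler(signum, frame):
        # Raised in whichever frame is active when the alarm fires.
        raise TimedOutException()

    signal.signal(signal.SIGALRM, signalhandler)
    signal.alarm(5)            # placeholder per-test timeout, in seconds
    try:
        time.sleep(10)         # stands in for the hung call under test
    except TimedOutException:
        print('test timed out')
    finally:
        signal.alarm(0)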

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 904.450s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-180347
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:83d2f5f6c0ef9fd1bf847e2de994c12b74cd6d69f4cce07bd3f9911527b2b93d
Deleted: sha256:1314bfc7f43362a1b395e74773f4b5e1d316fe03e922484dcf9735d16c2c48f5
Deleted: sha256:d548b7fbca4840a5664d6fa593d2e4a4067211dc23a49a779aac08add3c05003
Deleted: sha256:19b7150caea429f59e210d8630343704340a8c1ebdea9c2e4b0518494f298150
Deleted: sha256:973853390042b2953c413e4e74d16e7b8547973c67ac92869be4711313fe4e34
Deleted: sha256:bbe847c003f3307bb4d19fe6fff23c4b319b0597f51dac3a25a7316354a3e44b
Deleted: sha256:4f86db08c16692610dd8b4e2ad706dc68c2a1afa2db66f6e4ca3c8c46d3d7fcf
Deleted: sha256:2818ecad3761c4dcedc18cf3cd1890ba0ee87abe15af85e5330eec80c3899c92
Deleted: sha256:5b95cf2bdb5f1d26b3d94361462124c03456f12811dcd9e6e386c75264e0a62f
Deleted: sha256:1327169bccebc7d91d1a0d2f2dc33cb902b0abf371df85b377bd2e15027aed70
Deleted: sha256:8ec1e4c4b78760937ea93eb30c345b1f4ed09cc19cfb61802180b5eb9cd9b103
Deleted: sha256:1b5479f0b3c9c7f14f5ef794cf5c29aaf467770e05b798b22c9669b1d5180d47
Deleted: sha256:a8539235ce2286b3277f39d454fef1af4f6816f854fec28d067dd15a609a8485
Deleted: sha256:26d5a93a2230760cf15a150534db8b6a7d386e68eec681028caf8d02356a3e31
Deleted: sha256:dfa148de83a0b4b8ff3243ef7862fb9ed8668e94dcf8fa1227d2abea13e969d7
Deleted: sha256:57a70736197f2eb2e9c51b7d1b1f608b0aaf21d8f8ca3254d84fe2401666a8f4
Deleted: sha256:054b88eb8dfaff6736cc92dc434607e8fdc7753a0deb3dea7c2bcdfd1e893e65
Deleted: sha256:735e4aa5c07ab349b98476429f619f5fa2ee3655bb992498a113b082c85f3671
Deleted: sha256:0062225e8141f6401a50

Jenkins build is back to normal : beam_PreCommit_Java_Cron #2279

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python2 #1455

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 3.62 MB...]
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
20/01/15 16:29:40 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting 
job metrics for 
BeamApp-jenkins-0115162919-2068293a_6e228906-4c38-4c4d-abe0-de31e8178d11
20/01/15 16:29:40 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished 
getting job metrics for 
BeamApp-jenkins-0115162919-2068293a_6e228906-4c38-4c4d-abe0-de31e8178d11
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
  File 
"
 line 649, in pull_responses
for response in responses:
  File 
"
 line 413, in next
return self._next()
  File 
"
 line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579105780.687302786","description":"Error received from peer 
ipv4:127.0.0.1:40189","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
  File 
"
 line 137, in run
for work_request in control_stub.Control(get_responses()):
  File 
"
 line 413, in next
return self._next()
  File 
"
 line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579105780.687435730","description":"Error received from peer 
ipv4:127.0.0.1:42209","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data 
plane.
Traceback (most recent call last):
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 413, in next
return self._next()
  File 
"
 line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579105780.687284694","description":"Error received from peer 
ipv4:127.0.0.1:44017","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
  File 
"
 line 438, in 
target=lambda: self._read_inputs(elements_iterator),
  File 
"
 line 423, in _read_inputs
for elements in 

Build failed in Jenkins: beam_PostCommit_Python37 #1366

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 2.63 MB...]
self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
  File 
"
 line 649, in pull_responses
for response in responses:
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579106083.321397521","description":"Error received from peer 
ipv4:127.0.0.1:43053","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
  File 
"
 line 137, in run
for work_request in control_stub.Control(get_responses()):
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579106083.321427427","description":"Error received from peer 
ipv4:127.0.0.1:46577","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data 
plane.
Traceback (most recent call last):
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579106083.321391152","description":"Error received from peer 
ipv4:127.0.0.1:41789","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
  File 
"
 line 438, in 
target=lambda: self._read_inputs(elements_iterator),
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next

Build failed in Jenkins: beam_PostCommit_Python35 #1463

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 2.64 MB...]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader 
service.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting 
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager 
removed spill file directory /tmp/flink-io-608faeab-fb9f-4b90-a6e6-20c1b8f8c6e9
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the 
network environment and its components.
[ForkJoinPool.commonPool-worker-9] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache 
directory /tmp/flink-web-ui
[flink-runner-job-invoker] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down 
cluster because application is in CANCELED, diagnostics 
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher 
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all 
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing 
the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - 
Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager 
removed spill file directory 
/tmp/flink-netty-shuffle-d12fd0d9-f810-4e5d-a397-6222bedd89fe
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the 
kvState service and its components.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader 
service.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.filecache.FileCache - removed file cache directory 
/tmp/flink-dist-cache-77293989-eb3b-4635-89b9-6be7e90abb38
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor 
akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
 - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher 
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - 
Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - 
Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-7] INFO 
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-7] INFO 
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-7] INFO 
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37531
[flink-akka.actor.default-dispatcher-7] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15657 
msecs
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : 
MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #2999

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 83 B...]
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2b07e0efb5db918d462873dcdd0055285fe7bf7a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2b07e0efb5db918d462873dcdd0055285fe7bf7a
Commit message: "Merge pull request #10581: [website] Add security page and 
update 2.17.0 release blog post to reference it"
 > git rev-list --no-walk 2b07e0efb5db918d462873dcdd0055285fe7bf7a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava 

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #289

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 28.79 KB...]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: 
commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', 
urls=[https://github.com/spf13/afero.git, g...@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: 
commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', 

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1460

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 5.12 MB...]
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 130: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 

Build failed in Jenkins: beam_sonarqube_report #1276

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # 
 > timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2b07e0efb5db918d462873dcdd0055285fe7bf7a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2b07e0efb5db918d462873dcdd0055285fe7bf7a
Commit message: "Merge pull request #10581: [website] Add security page and 
update 2.17.0 release blog post to reference it"
 > git rev-list --no-walk 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
ERROR: SonarQube installation defined in this job (ASF Sonar Analysis) does not 
match any configured installation. Number of installations that can be 
configured: 0.
If you want to reassign jobs to a different SonarQube installation, check the 
documentation under https://redirect.sonarsource.com/plugins/jenkins.html




Jenkins build is back to normal : beam_PreCommit_Python2_PVR_Flink_Cron #418

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #2998

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 100 B...]
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2b07e0efb5db918d462873dcdd0055285fe7bf7a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2b07e0efb5db918d462873dcdd0055285fe7bf7a
Commit message: "Merge pull request #10581: [website] Add security page and 
update 2.17.0 release blog post to reference it"
 > git rev-list --no-walk 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming #473

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
[...truncated 23.77 KB...]
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:io:jdbc:compileJava FROM-CACHE
> Task :sdks:java:io:jdbc:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:common:testJar
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:jdbc:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :runners:core-construction-java:jar
> Task :sdks:java:testing:test-utils:compileTestJava FROM-CACHE
> Task :sdks:java:testing:test-utils:testClasses UP-TO-DATE
> Task :sdks:java:testing:test-utils:testJar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:io:hadoop-common:compileJava FROM-CACHE
> Task :sdks:java:io:hadoop-common:classes UP-TO-DATE
> Task :sdks:java:io:hadoop-common:jar
> Task :sdks:java:harness:jar
> Task :sdks:java:io:hadoop-format:compileJava FROM-CACHE
> Task :sdks:java:io:hadoop-format:classes UP-TO-DATE
> Task :sdks:java:io:hadoop-format:jar
> Task :sdks:java:core:compileTestJava FROM-CACHE
> Task :sdks:java:core:testClasses
> Task :sdks:java:core:shadowTestJar
> Task :runners:core-java:compileTestJava FROM-CACHE
> Task :runners:core-java:testClasses UP-TO-DATE
> Task :runners:core-java:testJar

> Task :examples:java:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :examples:java:classes
> Task :examples:java:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:spark:compileJava FROM-CACHE
> Task :runners:spark:classes
> Task :runners:spark:jar
> Task :runners:spark:compileTestJava FROM-CACHE
> Task :runners:spark:testClasses
> Task :runners:spark:testJar
> Task :runners:direct-java:shadowJar

> Task :examples:java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :examples:java:testClasses
> Task :examples:java:testJar

> Task :sdks:java:io:hadoop-format:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:hadoop-format:testClasses
> Task :runners:spark:validatesStructuredStreamingRunnerBatch

org.apache.beam.sdk.transforms.ViewTest > testEmptyMultimapSideInput FAILED
org.apache.beam.sdk.Pipeline$PipelineExecutionException at ViewTest.java:830
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ViewTest > testCombinedMapSideInput FAILED
java.lang.IllegalStateException at ViewTest.java:1334
Caused by: io.netty.channel.ChannelException at ViewTest.java:1334
Caused by: java.io.IOException at ViewTest.java:1334

org.apache.beam.sdk.transforms.ViewTest > testMultimapSideInputIsImmutable 
FAILED
org.apache.spark.SparkException at ViewTest.java:913
Caused by: java.io.IOException
Caused by: io.netty.channel.ChannelException
Caused by: 
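
The three ViewTest failures above are ValidatesRunner cases that exercise side-input views (multimap, combined map, immutability) on the Spark structured-streaming runner; the reported root causes are I/O and Netty channel errors rather than assertion mismatches. For orientation only, a minimal Python-SDK sketch of the side-input pattern these cases cover (an illustrative analogue, not the failing Java test itself; the names and data are made up):

    # Illustrative analogue of a map-style side input plus assertion,
    # similar in shape to what the ViewTest ValidatesRunner cases check.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        side = p | 'side' >> beam.Create([('a', 1), ('b', 2)])
        main = p | 'main' >> beam.Create(['a', 'b'])
        looked_up = main | beam.Map(
            lambda k, view: (k, view[k]), view=beam.pvalue.AsDict(side))
        assert_that(looked_up, equal_to([('a', 1), ('b', 2)]))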

Build failed in Jenkins: beam_PreCommit_CommunityMetrics_Cron #1756

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [website] Added security page

[iemejia] [website] Update the 2.17.0 release blog post to include security 
issues


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2b07e0efb5db918d462873dcdd0055285fe7bf7a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2b07e0efb5db918d462873dcdd0055285fe7bf7a
Commit message: "Merge pull request #10581: [website] Add security page and 
update 2.17.0 release blog post to reference it"
 > git rev-list --no-walk 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $  --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :communityMetricsPreCommit
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :beam-test-infra-metrics:composeUp FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-test-infra-metrics:composeUp'.
> Exit-code 255 when calling docker-compose, stdout: postgresql uses an image, 
> skipping
  prometheus uses an image, skipping
  pushgateway uses an image, skipping
  alertmanager uses an image, skipping
  Building grafana
  [12572] Failed to execute script docker-compose
  Traceback (most recent call last):
File "bin/docker-compose", line 6, in 
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 287, in build
File "compose/project.py", line 386, in build
File "compose/project.py", line 368, in build_service
File "compose/service.py", line 1084, in build
File "site-packages/docker/api/build.py", line 260, in build
File "site-packages/docker/api/build.py", line 307, in _set_auth_headers
File "site-packages/docker/auth.py", line 310, in get_all_credentials
File "site-packages/docker/auth.py", line 262, in 
_resolve_authconfig_credstore
File "site-packages/docker/auth.py", line 287, in _get_store_instance
File "site-packages/dockerpycreds/store.py", line 25, in __init__
  dockerpycreds.errors.InitializationError: docker-credential-gcloud not 
installed or not available in PATH

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
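
The root cause reported above is docker-py failing to resolve a credential helper: the Docker client config on the Jenkins worker names docker-credential-gcloud, but that binary is not on the build's PATH. A small diagnostic sketch (Python 3; the config path and helper names are whatever the local ~/.docker/config.json declares):

    # Diagnostic sketch: list the credential helpers referenced by the Docker
    # client config and check whether each docker-credential-<name> binary is
    # resolvable on PATH (the composeUp failure above means it was not).
    import json
    import os
    import shutil

    cfg_path = os.path.expanduser('~/.docker/config.json')
    with open(cfg_path) as f:
        cfg = json.load(f)

    helpers = set(cfg.get('credHelpers', {}).values())
    if cfg.get('credsStore'):
        helpers.add(cfg['credsStore'])

    for name in sorted(helpers):
        binary = 'docker-credential-' + name
        print(binary, '->', shutil.which(binary) or 'NOT FOUND on PATH')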

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use 

Build failed in Jenkins: beam_PostCommit_Python2 #1456

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 3.62 MB...]
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:apache_beam.runners.portability.fn_api_runner_transforms:
  
WARNING:apache_beam.utils.subprocess_server:Starting service with ['java' 
'-jar' 
'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-tempKWVREm/artifactsEcmhKe' '--job-port' '58111' '--artifact-port' 
'0' '--expansion-port' '0']
20/01/15 18:14:59 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: 
ArtifactStagingService started on localhost:39855
20/01/15 18:14:59 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java 
ExpansionService started on localhost:33031
20/01/15 18:14:59 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService 
started on localhost:58111
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2', '--shutdown_sources_on_final_watermark']
20/01/15 18:15:00 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job BeamApp-jenkins-0115181500-19249cd1_845e73b5-7028-4da0-9a0c-b22d850ec8ae
20/01/15 18:15:00 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
BeamApp-jenkins-0115181500-19249cd1_845e73b5-7028-4da0-9a0c-b22d850ec8ae
INFO:root:Waiting until the pipeline has finished because the environment 
"LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/01/15 18:15:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/01/15 18:15:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 1 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/01/15 18:15:01 INFO 
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand 
new Spark Context.
20/01/15 18:15:02 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
20/01/15 18:15:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
BeamApp-jenkins-0115181500-19249cd1_845e73b5-7028-4da0-9a0c-b22d850ec8ae on 
Spark master local[4]
20/01/15 18:15:03 INFO 
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated 
aggregators accumulator: 
20/01/15 18:15:03 INFO 
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics 
accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:39423.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/01/15 18:15:05 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:32929.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:33691
20/01/15 18:15:05 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/01/15 18:15:06 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path 
matching: -*-of-%(num_shards)05d
20/01/15 18:15:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-0115181500-19249cd1_845e73b5-7028-4da0-9a0c-b22d850ec8ae: 
Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with 
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 
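
For reference, the portableWordCountSparkRunnerBatch step above submits a Python pipeline to the Spark job server it just launched (JobService on localhost:58111 in this run), using the LOOPBACK environment. A hand-run equivalent would look roughly like the sketch below; the job port is taken from this particular log and the word-count body is simplified:

    # Rough sketch of submitting a small pipeline to a local portable Spark
    # job server with the LOOPBACK environment, mirroring the options
    # visible in the log above (the job port is run-specific).
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:58111',
        '--environment_type=LOOPBACK',
    ])

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))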

beam_PostCommit_PortableJar_Flink - Build # 1118 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink 
(build #1118)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1118/ to view 
the results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_PostCommit_Python2 #1457

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
[...truncated 3.64 MB...]
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
20/01/15 19:27:30 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting 
job metrics for 
BeamApp-jenkins-0115192715-9c5b172e_81679d1b-5ba4-4ccf-b243-44f4d8f3418a
20/01/15 19:27:30 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished 
getting job metrics for 
BeamApp-jenkins-0115192715-9c5b172e_81679d1b-5ba4-4ccf-b243-44f4d8f3418a
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
  File 
"
 line 137, in run
for work_request in control_stub.Control(get_responses()):
  File 
"
 line 413, in next
return self._next()
  File 
"
 line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579116451.134124366","description":"Error received from peer 
ipv4:127.0.0.1:39223","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data 
plane.
Traceback (most recent call last):
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 413, in next
return self._next()
  File 
"
 line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579116451.134046701","description":"Error received from peer 
ipv4:127.0.0.1:41459","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
  File 
"
 line 438, in 
target=lambda: self._read_inputs(elements_iterator),
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 413, in next
return self._next()
  File 
"
 line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579116451.134046701","description":"Error received from peer 
ipv4:127.0.0.1:41459","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
  File 
"
 line 649, in pull_responses
for response in responses:
  File 
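
The _MultiThreadedRendezvous tracebacks above come from the SDK worker's long-lived control, data, and state gRPC streams seeing the job server close their sockets during teardown; grpcio reports that as StatusCode.UNAVAILABLE ("Socket closed"). A generic grpcio sketch of how that condition surfaces on a client, with the stream iterator left abstract:

    # Generic grpcio sketch: draining a server stream that the peer closes
    # mid-read raises an RpcError whose code is UNAVAILABLE, which is what
    # the worker threads above log while the job server shuts down.
    import grpc

    def drain(stream_iterator):
        try:
            for message in stream_iterator:
                yield message
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.UNAVAILABLE:
                # Peer went away (e.g. "Socket closed"); treat as end of
                # stream during shutdown instead of crashing the thread.
                return
            raise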

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1462

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7


--
[...truncated 5.14 MB...]
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 130: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 812, in process_bundle
data.transform_id].process_encoded(data.data)
  File 
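
The assertion failure above is Beam's assert_that reporting that the checked PCollection was empty where ['a: 3', 'b: 1', 'c: 2'] was expected. As a shape reference only (not the actual cross-language validates-runner test), the Python pattern that produces this kind of BeamAssertException message is:

    # Minimal assert_that/equal_to sketch; if the asserted PCollection were
    # empty, it would fail with the same "missing elements" message seen in
    # the log above.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        counts = (p
                  | beam.Create(['a', 'a', 'a', 'b', 'c', 'c'])
                  | beam.combiners.Count.PerElement()
                  | beam.Map(lambda kv: '%s: %d' % kv))
        assert_that(counts, equal_to(['a: 3', 'b: 1', 'c: 2']))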

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #292

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 474.00 KB...]
  Found link 
https://files.pythonhosted.org/packages/8d/c7/f05c87812fa5d9562ecbc5f4f1fc1570444f53c81c834a7f662af406e3c1/pip-0.5.tar.gz#sha256=328d8412782f22568508a0d0c78a49c9920a82e44c8dfca49954fe525c152b2a
 (from https://pypi.org/simple/pip/), version: 0.5
  Found link 
https://files.pythonhosted.org/packages/9a/aa/f536b6d14fe03343367da2ff44eee28f340ae650cd017ca088b6be13084a/pip-0.5.1.tar.gz#sha256=e27650538c41fe1007a41abd4cfd0f905b822622cbe1f8e7e09d1215af207694
 (from https://pypi.org/simple/pip/), version: 0.5.1
  Found link 
https://files.pythonhosted.org/packages/db/e6/fdf7be8a17b032c533d3f91e91e2c63dd81d3627cbe4113248a00c2d39d8/pip-0.6.tar.gz#sha256=4cf47db6815b2f435d1f44e1f35ff04823043f6161f7df9aec71a123b0c47f0d
 (from https://pypi.org/simple/pip/), version: 0.6
  Found link 
https://files.pythonhosted.org/packages/91/cd/105f4d3c75d0ae18e12623acc96f42168aaba408dd6e43c4505aa21f8e37/pip-0.6.1.tar.gz#sha256=efe47e84ffeb0ea4804f9858b8a94bebd07f5452f907ebed36d03aed06a9f9ec
 (from https://pypi.org/simple/pip/), version: 0.6.1
  Found link 
https://files.pythonhosted.org/packages/1c/c7/c0e1a9413c37828faf290f29a85a4d6034c145cc04bf1622ba8beb662ad8/pip-0.6.2.tar.gz#sha256=1c1a504d7e70d2c24246f95bd16e3d5fcec740fd144df69a407bf65a2ee67586
 (from https://pypi.org/simple/pip/), version: 0.6.2
  Found link 
https://files.pythonhosted.org/packages/3f/af/c4b9d49fb0f286996b28dbc0955c3ad359794697eb98e0e69863908070b0/pip-0.6.3.tar.gz#sha256=1a6df71eb29b98cba11bde6d6a0d8c6dd8b0518e74ceb71fb31ea4fbb42fd313
 (from https://pypi.org/simple/pip/), version: 0.6.3
  Found link 
https://files.pythonhosted.org/packages/ec/7a/6fe91ff0079ad0437830957c459d52f3923e516f5b453218f2a93d09a427/pip-0.7.tar.gz#sha256=ceaea0b9e494d893c8a191895301b79c1db33e41f14d3ad93e3d28a8b4e9bf27
 (from https://pypi.org/simple/pip/), version: 0.7
  Found link 
https://files.pythonhosted.org/packages/a5/63/11303863c2f5e9d9a15d89fcf7513a4b60987007d418862e0fb65c09fff7/pip-0.7.1.tar.gz#sha256=f54f05aa17edd0036de433c44892c8fedb1fd2871c97829838feb995818d24c3
 (from https://pypi.org/simple/pip/), version: 0.7.1
  Found link 
https://files.pythonhosted.org/packages/cd/a9/1debaa96bbc1005c1c8ad3b79fec58c198d35121546ea2e858ce0894268a/pip-0.7.2.tar.gz#sha256=98df2eb779358412bbbae75980171ae85deebc846d87e244d086520b1212da09
 (from https://pypi.org/simple/pip/), version: 0.7.2
  Found link 
https://files.pythonhosted.org/packages/74/54/f785c327fb3d163560a879b36edae5c78ee07806be282c9d4807f6be7dd1/pip-0.8.tar.gz#sha256=9017e4484a212dd4e1a43dd9f039dd7fc8338d4eea1c339d5ae1c80726de5b0f
 (from https://pypi.org/simple/pip/), version: 0.8
  Found link 
https://files.pythonhosted.org/packages/5c/79/5e8381cc3078bae92166f2ba96de8355e8c181926505ba8882f7b099a500/pip-0.8.1.tar.gz#sha256=7176a87f35675f6468341212f3b959bb51d23ea66eb1c3692bf746c45c716fa2
 (from https://pypi.org/simple/pip/), version: 0.8.1
  Found link 
https://files.pythonhosted.org/packages/17/3e/0a98ab032991518741e7e712a719633e6ae160f51b3d3e855194530fd308/pip-0.8.2.tar.gz#sha256=f80a3549c048bc3bbcb47844826e9c7c6fcd87e77b92bef0d9e66d1b397c4962
 (from https://pypi.org/simple/pip/), version: 0.8.2
  Found link 
https://files.pythonhosted.org/packages/f7/9a/943fc6d879ed7220bac2e7e53096bfe78abec88d77f2f516400e0129679e/pip-0.8.3.tar.gz#sha256=1be2e18edd38aa75b5e4ef38a99ec33ba9247177cfcb4a6d2d2b3e73430e3001
 (from https://pypi.org/simple/pip/), version: 0.8.3
  Found link 
https://files.pythonhosted.org/packages/24/33/6eb675fb6db7b71d69d6928b33dea61b8bf5cfe1e5649be70ec84ce2fc09/pip-1.0.tar.gz#sha256=34ba07e2d14ba86d5088ba896ac80bed845a9b276ab8acb279b8d99bc77fec8e
 (from https://pypi.org/simple/pip/), version: 1.0
  Found link 
https://files.pythonhosted.org/packages/10/d9/f584e6107ef98ad7e5d0f756bfee12561fa6a4712ffdb7209e0e1fd4/pip-1.0.1.tar.gz#sha256=37d2f18213d3845d2038dd3686bc71fc12bb41ad66c945a8b0dfec2879f3497b
 (from https://pypi.org/simple/pip/), version: 1.0.1
  Found link 
https://files.pythonhosted.org/packages/16/90/5e6f80364d8a656f60681dfb7330298edef292d43e1499bcb3a4c71ff0b9/pip-1.0.2.tar.gz#sha256=a6ed9b36aac2f121c01a2c9e0307a9e4d9438d100a407db701ac65479a3335d2
 (from https://pypi.org/simple/pip/), version: 1.0.2
  Found link 
https://files.pythonhosted.org/packages/25/57/0d42cf5307d79913a082c5c4397d46f3793bc35e1138a694136d6e31be99/pip-1.1.tar.gz#sha256=993804bb947d18508acee02141281c77d27677f8c14eaa64d6287a1c53ef01c8
 (from https://pypi.org/simple/pip/), version: 1.1
  Found link 
https://files.pythonhosted.org/packages/ba/c3/4e1f892f41aaa217fe0d1f827fa05928783349c69f3cc06fdd68e112678a/pip-1.2.tar.gz#sha256=2b168f1987403f1dc6996a1f22a6f6637b751b7ab6ff27e78380b8d6e70aa314
 (from https://pypi.org/simple/pip/), version: 1.2
  Found link 

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #3004

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[hannahjiang] [BEAM-9084] fix Java spotless


--
[...truncated 1.21 KB...]
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 003c89135754aac3f7e80f50523c8f7caa4ffcee (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 003c89135754aac3f7e80f50523c8f7caa4ffcee
Commit message: "Merge pull request #10557 from 
Hannah-Jiang/cleaning_up_docker_tags"
 > git rev-list --no-walk 2fd785dedb979a248e63c6385f978fd18fd2fbc4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $  -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 :runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy and 1 stopped Daemons could not be reused, use 
--status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task 

Jenkins build is back to normal : beam_PostCommit_Python_VR_Spark #2057

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python2 #1459

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 3.61 MB...]
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:apache_beam.runners.portability.fn_api_runner_transforms:
  
WARNING:apache_beam.utils.subprocess_server:Starting service with ['java' 
'-jar' 
'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-tempwpFmfM/artifacts1FPwRK' '--job-port' '54099' '--artifact-port' 
'0' '--expansion-port' '0']
20/01/15 21:30:06 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: 
ArtifactStagingService started on localhost:43679
20/01/15 21:30:06 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java 
ExpansionService started on localhost:42085
20/01/15 21:30:06 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService 
started on localhost:54099
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2', '--shutdown_sources_on_final_watermark']
20/01/15 21:30:08 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job BeamApp-jenkins-0115213008-26f81d72_d54e0267-af3e-4f18-8c5a-6c79e45eb3d2
20/01/15 21:30:08 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
BeamApp-jenkins-0115213008-26f81d72_d54e0267-af3e-4f18-8c5a-6c79e45eb3d2
INFO:root:Waiting until the pipeline has finished because the environment 
"LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/01/15 21:30:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/01/15 21:30:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 1 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/01/15 21:30:09 INFO 
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand 
new Spark Context.
20/01/15 21:30:10 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
20/01/15 21:30:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
BeamApp-jenkins-0115213008-26f81d72_d54e0267-af3e-4f18-8c5a-6c79e45eb3d2 on 
Spark master local[4]
20/01/15 21:30:11 INFO 
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated 
aggregators accumulator: 
20/01/15 21:30:11 INFO 
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics 
accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:44645.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/01/15 21:30:13 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:36181.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:44971
20/01/15 21:30:13 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/01/15 21:30:13 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path 
matching: -*-of-%(num_shards)05d
20/01/15 21:30:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-0115213008-26f81d72_d54e0267-af3e-4f18-8c5a-6c79e45eb3d2: 
Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with 
num_shards: 4 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3819

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
Started by GitHub push by asfgit
Started by GitHub push by asfgit
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-15 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2e06ca4d0a9249d1bd7f4bfe421c332648230fe3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2e06ca4d0a9249d1bd7f4bfe421c332648230fe3
Commit message: "Moving to 2.20.0-SNAPSHOT on master branch."
 > git rev-list --no-walk 2b07e0efb5db918d462873dcdd0055285fe7bf7a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $  --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:flink:1.9:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task 

Jenkins build is back to normal : beam_LoadTests_Python_coGBK_Flink_Batch #174

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python37 #1368

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
[...truncated 2.76 MB...]
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 

apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 

apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 

apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 

apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: No unique name set for 
transform fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: start 

apache_beam.runners.worker.bundle_processor: DEBUG: start 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 

apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
apache_beam.runners.worker.bundle_processor: DEBUG: finish 
root: INFO: Job status: RUNNING
root: INFO: Job status: DONE
root: INFO: Job status: RUNNING
- >> end captured logging << -

--
XML: nosetests-postCommitIT-flink-py37.xml
--
XML: 

--
Ran 4 tests in 44.006s

FAILED (SKIP=2, errors=1)

> Task :sdks:python:test-suites:portable:py37:postCommitPy37IT FAILED

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-15_11_17_30-18424516463501322867?project=apache-beam-testing
:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to .options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-15_11_31_58-4142052142960651545?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-15_11_40_26-1614199871594416436?project=apache-beam-testing
Worker logs: 
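
The BeamDeprecationWarning in this run is triggered by reading options back off the constructed pipeline (p.options). The non-deprecated form keeps a reference to the PipelineOptions object and calls view_as on it directly; a minimal sketch (the experiment flag is a placeholder):

    # Sketch of reading DebugOptions.experiments from the options object
    # itself rather than via the deprecated Pipeline.options attribute
    # flagged in the warning above.
    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    options = PipelineOptions(['--experiments=beam_fn_api'])  # placeholder flag
    experiments = options.view_as(DebugOptions).experiments or []
    print(experiments)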

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Direct #2379

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_XVR_Flink #1463

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 5.15 MB...]
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 134: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 812, in process_bundle
data.transform_id].process_encoded(data.data)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 205, in 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3821

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 3.33 KB...]
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :sdks:java:build-tools:compileJava
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:compileJava
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:pipeline:shadowJar
> Task :model:job-management:compileJava
> Task :model:job-management:classes
> Task :model:job-management:shadowJar
> Task :model:fn-execution:compileJava
> Task :model:fn-execution:classes
> Task :model:fn-execution:shadowJar

> Task :sdks:java:core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:core:jar

> Task :vendor:sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:extensions:protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:fn-execution:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task 

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #3003

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)


--
[...truncated 12.80 KB...]
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:pipeline:shadowJar
> Task :model:job-management:compileJava
> Task :model:job-management:classes
> Task :model:job-management:shadowJar
> Task :model:fn-execution:compileJava
> Task :model:fn-execution:classes
> Task :model:fn-execution:shadowJar

> Task :sdks:java:core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava
> Task :runners:local-java:classes
> Task :runners:local-java:jar

> Task :vendor:sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :vendor:sdks-java-extensions-protobuf:classes
> Task :vendor:sdks-java-extensions-protobuf:shadowJar

> Task :sdks:java:fn-execution:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:fn-execution:classes
> Task :sdks:java:fn-execution:jar

> Task :sdks:java:extensions:google-cloud-platform-core:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:extensions:google-cloud-platform-core:classes
> Task :sdks:java:extensions:google-cloud-platform-core:jar

> Task :runners:core-construction-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-construction-java:classes
> Task :runners:core-construction-java:jar

> Task :runners:core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:classes
> Task :runners:core-java:jar
> Task :sdks:java:core:compileTestJava

> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar

> Task :sdks:java:core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:testClasses
> Task :sdks:java:core:shadowTestJar
> Task :sdks:java:harness:shadowJar

> Task :runners:core-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:testClasses
> Task :runners:core-java:testJar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :runners:direct-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for 

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #294

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[hannahjiang] [BEAM-9084] fix Java spotless


--
[...truncated 30.07 KB...]
Resolving github.com/spf13/afero: 
commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', 
urls=[https://github.com/spf13/afero.git, g...@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: 
commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', 
urls=[https://github.com/spf13/cast.git, g...@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: 
commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', 
urls=[https://github.com/spf13/cobra.git, g...@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: 
commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', 
urls=[https://github.com/spf13/jwalterweatherman.git, 
g...@github.com:spf13/jwalterweatherman.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/viper: 
commit='aafc9e6bc7b7bb53ddaa75a5ef49a17d6e654be5', 
urls=[https://github.com/spf13/viper.git, g...@github.com:spf13/viper.git]
Resolving github.com/stathat/go: 
commit='74669b9f388d9d788c97399a0824adbfee78400e', 
urls=[https://github.com/stathat/go.git, g...@github.com:stathat/go.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/xordataexchange/crypt: 
commit='b2862e3d0a775f18c7cfe02273500ae307b61218', 
urls=[https://github.com/xordataexchange/crypt.git, 
g...@github.com:xordataexchange/crypt.git]
Resolving go.opencensus.io: commit='aa2b39d1618ef56ba156f27cfcdae9042f68f0bc', 
urls=[https://github.com/census-instrumentation/opencensus-go]

> Task :sdks:java:core:shadowJar

> Task :sdks:python:setupVirtualenv
Collecting virtualenv>=14.0.0
  Using cached 
https://files.pythonhosted.org/packages/05/f1/2e07e8ca50e047b9cc9ad56cf4291f4e041fa73207d000a095fe478abf84/virtualenv-16.7.9-py2.py3-none-any.whl
Collecting py<2,>=1.4.17
  Using cached 
https://files.pythonhosted.org/packages/99/8d/21e1767c009211a62a8e3067280bfce76e89c9f876180308515942304d2d/py-1.8.1-py2.py3-none-any.whl
Processing 
/home/jenkins/.cache/pip/wheels/66/13/60/ef107438d90e4aad6320e3424e50cfce5e16d1e9aad6d38294/filelock-3.0.12-cp27-none-any.whl
Collecting protobuf>=3.5.0.post1
  Using cached 
https://files.pythonhosted.org/packages/13/5c/ba4572a4d952b8db68c4534168a6d2a946b354de5e2b779efb44d4d0b72c/protobuf-3.11.2-cp27-cp27mu-manylinux1_x86_64.whl
Collecting grpcio>=1.14.2
  Using cached 
https://files.pythonhosted.org/packages/9d/19/b1f8354f5aeda4a3da1b31e32381def4af25e22df16a62f51237f50d5964/grpcio-1.26.0-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting importlib-metadata>=0.12; python_version < "3.8"
  Using cached 
https://files.pythonhosted.org/packages/d7/31/74dcb59a601b95fce3b0334e8fc9db758f78e43075f22aeb3677dfb19f4c/importlib_metadata-1.4.0-py2.py3-none-any.whl
Collecting futures>=2.2.0; python_version < "3.2"
  Using cached 
https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting enum34>=1.0.4; python_version < "3.4"
  Using cached 
https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting contextlib2; python_version < "3"
  Using cached 
https://files.pythonhosted.org/packages/85/60/370352f7ef6aa96c52fb001831622f50f923c1d575427d021b8ab3311236/contextlib2-0.6.0.post1-py2.py3-none-any.whl
Collecting zipp>=0.5
  Using cached 
https://files.pythonhosted.org/packages/f4/50/cc72c5bcd48f6e98219fc4a88a5227e9e28b81637a99c49feba1d51f4d50/zipp-1.0.0-py2.py3-none-any.whl
Collecting pathlib2; python_version < "3"
  Using cached 
https://files.pythonhosted.org/packages/e9/45/9c82d3666af4ef9f221cbb954e1d77ddbb513faf552aea6df5f37f1a4859/pathlib2-2.3.5-py2.py3-none-any.whl
Collecting configparser>=3.5; python_version < "3"
  Using cached 
https://files.pythonhosted.org/packages/7a/2a/95ed0501cf5d8709490b1d3a3f9b5cf340da6c433f896bbe9ce08dbe6785/configparser-4.0.2-py2.py3-none-any.whl
Collecting more-itertools
  Using cached 

Build failed in Jenkins: beam_PostCommit_Java_PortabilityApi #3845

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 49.52 KB...]
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar

> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes

> Task :sdks:java:core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:testClasses

> Task :examples:java:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :examples:java:classes
> Task :examples:java:jar

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: 
commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', 
urls=[https://code.googlesource.com/google-api-go-client]

> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:core:shadowTestJar

> Task :examples:java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:harness:shadowJar
> Task :sdks:java:harness:jar
> Task :sdks:java:container:copyDockerfileDependencies

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/genproto: 
commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', 
urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: 
commit='7646b5360d049a7ca31e9133315db43456f39e2e', 
urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]

> Task :sdks:java:extensions:google-cloud-platform-core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:extensions:google-cloud-platform-core:testClasses
> Task :sdks:java:extensions:google-cloud-platform-core:testJar
> Task :sdks:go:installDependencies

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:java:container:resolveBuildDependencies
Resolving 
./github.com/apache/beam/sdks/go@

> Task :sdks:java:container:installDependencies
> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:dockerPrepare

> Task :runners:direct-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:direct-java:classes
> Task :runners:direct-java:shadowJar

> Task :sdks:java:container:docker
 ---> Running in ddc923535e39
Removing intermediate container ddc923535e39
 ---> 4564eda5cc1a
Step 3/9 : ADD target/slf4j-api.jar /opt/apache/beam/jars/
 ---> 1f51ff356f92
Step 4/9 : ADD target/slf4j-jdk14.jar /opt/apache/beam/jars/
 ---> bb36ba45068c
Step 5/9 : ADD target/beam-sdks-java-harness.jar /opt/apache/beam/jars/
 ---> 8ab5f1fee83c
Step 6/9 : ADD target/beam-sdks-java-io-kafka.jar /opt/apache/beam/jars/
 ---> c96dcb42f55d
Step 7/9 : ADD target/kafka-clients.jar /opt/apache/beam/jars/

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5294

2020-01-15 Thread Apache Jenkins Server
cs_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
--
Traceback (most recent call last):
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 812, in run
test(orig)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 133, in run
self.runTest(result)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 58, in test_metrics_fnapi_it
result = self.run_pipeline(experiment='beam_fn_api')
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 39, in run_pipeline
test_pipeline = TestPipeline(is_integration_test=True)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py;,>
 line 108, in __init__
super(TestPipeline, self).__init__(runner, options)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py;,>
 line 184, in __init__
errors = PipelineOptionsValidator(self._options, runner).validate()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py;,>
 line 113, in validate
errors.extend(self.options.view_as(cls).validate(self))
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 328, in view_as
view = cls(self._flags)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 205, in __init__
cls._add_argparse_args(parser)  # type: ignore
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 485, in _add_argparse_args
help='GCS path for saving temporary workflow jobs.')
  File "/usr/lib/python2.7/argparse.py", line 1294, in add_argument
action = action_class(**kwargs)
  File "/usr/lib/python2.7/argparse.py", line 807, in __init__
def __init__(self,
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 904.137s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-214503
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:95c85eef4b2e41dc2b539ac64acd7f5c4502cb0b517d5aec7fee7e5b5f968823
Deleted: sha256:4f99b1154694a5fb16a587d96d8902311c1ed19c6c394474e166abd1a66c5470
Deleted: sha256:7215c7ef685b38bf6367048dbe0b784d32b417fb5834886b0691703462a223b1
Deleted: sha256:86cc148ed5d516bb9beb276043909b3ec8efdd139fba7e7526ff453396f5f3dd
Deleted: sha256:2709eb58344cde11db31d00c17e322e561afb1d2276715b72453d08e105b3170
Deleted: sha256:6db607bdea5a54093a2a6b0f5ffbcee0d74aa17ab0ba1d0ab2072ecc8fe808e0
Deleted: sha256:ec670ca0fc0da5906f37d2ec337e4fbba143a3898687b1cfed6a72c4e4b1863a
Deleted: sha256:52c8412e7286eb19703ba3201f932c79418f5708caa38222ddd766785954afb3
Deleted: sha256:01608e9d4ecbc47339c308f39b796167d9fec3115a8c272dad171326274b2ada
Deleted: sha256:248538c

Jenkins build is back to normal : beam_PostCommit_Python35 #1467

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python2 #1458

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7


--
[...truncated 3.60 MB...]
WARNING:apache_beam.utils.subprocess_server:Starting service with ['java' 
'-jar' 
'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-tempmXEqfE/artifactsDCnAoY' '--job-port' '59063' '--artifact-port' 
'0' '--expansion-port' '0']
20/01/15 20:29:09 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: 
ArtifactStagingService started on localhost:35489
20/01/15 20:29:09 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java 
ExpansionService started on localhost:44701
20/01/15 20:29:09 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService 
started on localhost:59063
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2', '--shutdown_sources_on_final_watermark']
20/01/15 20:29:11 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job BeamApp-jenkins-0115202911-93ee97ce_89a086f3-e97f-4088-9587-9f7ebd07198a
20/01/15 20:29:11 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
BeamApp-jenkins-0115202911-93ee97ce_89a086f3-e97f-4088-9587-9f7ebd07198a
INFO:root:Waiting until the pipeline has finished because the environment 
"LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/01/15 20:29:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/01/15 20:29:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 1 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/01/15 20:29:12 INFO 
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand 
new Spark Context.
20/01/15 20:29:12 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_autocomplete_it 
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
20/01/15 20:29:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
BeamApp-jenkins-0115202911-93ee97ce_89a086f3-e97f-4088-9587-9f7ebd07198a on 
Spark master local[4]
20/01/15 20:29:13 INFO 
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated 
aggregators accumulator: 
20/01/15 20:29:13 INFO 
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics 
accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:39139.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
20/01/15 20:29:15 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:45731.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:33439
20/01/15 20:29:15 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/01/15 20:29:15 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path 
matching: -*-of-%(num_shards)05d
20/01/15 20:29:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-0115202911-93ee97ce_89a086f3-e97f-4088-9587-9f7ebd07198a: 
Pipeline translated successfully. Computing outputs
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with 
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
20/01/15 20:29:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
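
The portableWordCountSparkRunnerBatch log above runs the Python SDK against a
local Spark job server in LOOPBACK mode. A minimal sketch of the corresponding
pipeline options, assuming a job server already listening on the --job-port
printed by JobServerDriver (the endpoint value and the toy word count are
illustrative, not the exact Gradle task):

    # Hedged sketch: point the Python SDK at the Spark job server started above.
    # The port must match the '--job-port' value printed by JobServerDriver.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:59063',  # job-port from the log above
        '--environment_type=LOOPBACK',     # matches the "LOOPBACK" environment above
    ])
    with beam.Pipeline(options=options) as p:
        counts = (p
                  | beam.Create(['to be or not to be'])
                  | beam.FlatMap(str.split)
                  | beam.combiners.Count.PerElement())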

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1464

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[hannahjiang] [BEAM-9084] fix Java spotless

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)


--
[...truncated 5.14 MB...]
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 134: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
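
The BeamAssertException above comes from the assert_that/Match step: the
expected elements ['a: 3', 'b: 1', 'c: 2'] never reached the matcher, so they
were compared against an empty collection. A minimal sketch of the
assert_that/equal_to pattern involved, using a trivial stand-in pipeline
rather than the actual cross-language test:

    # Hedged sketch of the assert_that/equal_to matcher that raised the
    # BeamAssertException above; this toy pipeline passes, while the real test
    # failed because its output PCollection reached the matcher empty.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        formatted = (p
                     | beam.Create([('a', 3), ('b', 1), ('c', 2)])
                     | beam.MapTuple(lambda k, v: '%s: %d' % (k, v)))
        assert_that(formatted, equal_to(['a: 3', 'b: 1', 'c: 2']))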

Build failed in Jenkins: beam_PostCommit_Java_PortabilityApi #3846

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[hannahjiang] [BEAM-9084] fix Java spotless

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)


--
[...truncated 28.91 KB...]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3823

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[hannahjiang] [BEAM-9084] fix Java spotless


--
Started by GitHub push by ibzib
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 003c89135754aac3f7e80f50523c8f7caa4ffcee (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 003c89135754aac3f7e80f50523c8f7caa4ffcee
Commit message: "Merge pull request #10557 from 
Hannah-Jiang/cleaning_up_docker_tags"
 > git rev-list --no-walk 2fd785dedb979a248e63c6385f978fd18fd2fbc4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:processResources
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Spark_Batch #1811

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 68.00 KB...]
Caused by: java.util.concurrent.ExecutionException at 
ReshuffleTest.java:251
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ReshuffleTest > testReshufflePreservesTimestamps 
FAILED
java.lang.RuntimeException at ReshuffleTest.java:148
Caused by: java.lang.RuntimeException at ReshuffleTest.java:148
Caused by: java.util.concurrent.ExecutionException at 
ReshuffleTest.java:148
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ReshuffleTest > 
testReshuffleAfterSlidingWindowsAndGroupByKey FAILED
java.lang.RuntimeException at ReshuffleTest.java:211
Caused by: java.lang.RuntimeException at ReshuffleTest.java:211
Caused by: java.util.concurrent.ExecutionException at 
ReshuffleTest.java:211
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ReshuffleTest > testJustReshuffle FAILED
java.lang.RuntimeException at ReshuffleTest.java:102
Caused by: java.lang.RuntimeException at ReshuffleTest.java:102
Caused by: java.util.concurrent.ExecutionException at 
ReshuffleTest.java:102
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ParDoTest$BasicTests > testParDo FAILED
java.lang.RuntimeException at ParDoTest.java:360

org.apache.beam.sdk.transforms.ParDoTest$BasicTests > 
testPipelineOptionsParameter FAILED
java.lang.RuntimeException at ParDoTest.java:558
Caused by: java.lang.RuntimeException at ParDoTest.java:558
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:558
Caused by: java.lang.RuntimeException
Caused by: 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException

org.apache.beam.sdk.transforms.ParDoTest$BasicTests > testParDoEmpty FAILED
java.lang.RuntimeException at ParDoTest.java:376
Caused by: java.lang.RuntimeException at ParDoTest.java:376
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:376
Caused by: java.lang.RuntimeException
Caused by: 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException

org.apache.beam.sdk.transforms.ParDoTest$BasicTests > 
testParDoInCustomTransform FAILED
java.lang.RuntimeException at ParDoTest.java:422
Caused by: java.lang.RuntimeException at ParDoTest.java:422
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:422
Caused by: java.lang.RuntimeException
Caused by: 
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException
Caused by: 
org.apache.beam.vendor.grpc.v1p21p0.io.netty.channel.ChannelException
Caused by: 
java.lang.reflect.InvocationTargetException
Caused by: 
org.apache.beam.vendor.grpc.v1p21p0.io.netty.channel.ChannelException
Caused by: java.net.SocketException

org.apache.beam.sdk.transforms.ParDoTest$BasicTests > testParDoEmptyOutputs 
FAILED
java.lang.RuntimeException at ParDoTest.java:392
Caused by: java.lang.RuntimeException at ParDoTest.java:392
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:392
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ParDoTest$TimerCoderInferenceTests > 
testValueStateCoderInferenceFromInputCoder FAILED
java.lang.RuntimeException at ParDoTest.java:4022
Caused by: java.lang.RuntimeException at ParDoTest.java:4022
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:4022
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ParDoTest$TimerCoderInferenceTests > 
testValueStateCoderInference FAILED
java.lang.RuntimeException at ParDoTest.java:3952
Caused by: java.lang.RuntimeException at ParDoTest.java:3952
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:3952
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.ParDoTest$LifecycleTests > 
testWindowingInStartAndFinishBundle FAILED
java.lang.RuntimeException at ParDoTest.java:1518
Caused by: java.lang.RuntimeException at ParDoTest.java:1518
Caused by: java.util.concurrent.ExecutionException at 
ParDoTest.java:1518
Caused by: java.io.FileNotFoundException

org.apache.beam.sdk.transforms.WithTimestampsTest > 
withTimestampsShouldApplyTimestamps FAILED
java.lang.RuntimeException at WithTimestampsTest.java:78
Caused by: 

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #3000

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
[...truncated 12.05 KB...]
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:pipeline:shadowJar
> Task :model:job-management:compileJava
> Task :model:job-management:classes
> Task :model:job-management:shadowJar
> Task :model:fn-execution:compileJava
> Task :model:fn-execution:classes
> Task :model:fn-execution:shadowJar

> Task :sdks:java:core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava
> Task :runners:local-java:classes
> Task :runners:local-java:jar

> Task :vendor:sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :vendor:sdks-java-extensions-protobuf:classes
> Task :vendor:sdks-java-extensions-protobuf:shadowJar

> Task :sdks:java:fn-execution:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:fn-execution:classes
> Task :sdks:java:fn-execution:jar

> Task :sdks:java:extensions:google-cloud-platform-core:compileJava
Note:  uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:extensions:google-cloud-platform-core:classes
> Task :sdks:java:extensions:google-cloud-platform-core:jar

> Task :runners:core-construction-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-construction-java:classes
> Task :runners:core-construction-java:jar
> Task :sdks:java:core:compileTestJava

> Task :runners:core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:classes
> Task :runners:core-java:jar

> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar

> Task :sdks:java:core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:testClasses
> Task :sdks:java:core:shadowTestJar
> Task :sdks:java:harness:shadowJar

> Task :runners:core-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:testClasses
> Task :runners:core-java:testJar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task 

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5293

2020-01-15 Thread Apache Jenkins Server
ontainer/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 812, in run
test(orig)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 133, in run
self.runTest(result)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 58, in test_metrics_fnapi_it
result = self.run_pipeline(experiment='beam_fn_api')
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 39, in run_pipeline
test_pipeline = TestPipeline(is_integration_test=True)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py;,>
 line 108, in __init__
super(TestPipeline, self).__init__(runner, options)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py;,>
 line 184, in __init__
errors = PipelineOptionsValidator(self._options, runner).validate()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py;,>
 line 113, in validate
errors.extend(self.options.view_as(cls).validate(self))
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 591, in validate
self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 559, in _get_default_gcp_region
raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py;,>
 line 85, in check_output
out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 568, in check_output
output, unused_err = process.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 792, in communicate
stdout = _eintr_retry_call(self.stdout.read)
  File "/usr/lib/python2.7/subprocess.py", line 476, in _eintr_retry_call
return func(*args)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'
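
The traceback above times out inside _get_default_gcp_region(), which the
options validator falls back to when no --region flag is supplied; that lookup
shells out to gcloud, and the nose multiprocess timeout fires while waiting on
the subprocess. A minimal sketch of supplying the region explicitly so the
validator never reaches that code path (the runner, project and bucket names
below are placeholders, not the job's real options):

    # Hedged sketch: with --region set, GoogleCloudOptions.validate() has no
    # need to call _get_default_gcp_region(), so no gcloud subprocess is spawned.
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',           # placeholder runner name
        '--project=apache-beam-testing',         # project id seen elsewhere in these logs
        '--region=us-central1',                  # explicit region, no gcloud lookup
        '--temp_location=gs://some-bucket/tmp',  # placeholder GCS path
    ])
    assert options.view_as(GoogleCloudOptions).region == 'us-central1'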

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 904.133s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-211421
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:a1f2dddc87449aa929d8b69924c61bab55d09e70235bfafdfb4c0b75ab4c723e
Deleted: sha256:995a0fdc81954c63e3f6f609cad0efa1464f1daac336d222a8621a62a2ad2a29
Deleted: sha256:ca4c898f99f04abb165dae12f378240125aebb2facf698e7ce905185035689ec
Deleted: sha256:5b198c42654dfbc8ce97d600a155087327e807723d761a7de52c1665c6bd6298
Deleted: sha256:7d53533a9ba0a33dce974922fc506586cf4b24acbe65cdd6238d5e0baaa690c8
Deleted: sha256:b4315c706aedee0cd6c9f95d6ce58347466bc350c6dfdbe48c5ac1209833e501
Deleted: sha256:68a094ce151ce1ae5ea15af578bebf8c7c0c5215f930d2d666b84260044571d8
Deleted: sha256:da3bdc88ea905ecfd8797d70751e3272f20b396d8113cb3fc2b57e843ae5a9db
Deleted: sha256:0acb943c05454d277bc3ca8ef858385b33bd0e66308e3f8ad94df68d30d810af
Deleted: sha256:592d0efd29161248dca3fb6c08632587e978a781e65f861ff033722291bf418b
Deleted: sha256:3e9

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #295

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[crites] Changes watermark advance from 1001 to 1000 since Dataflow TestStream


--
[...truncated 28.57 KB...]
Resolving github.com/hashicorp/hcl: 
commit='23c074d0eceb2b8a5bfdbb271ab780cde70f05a8', 
urls=[https://github.com/hashicorp/hcl.git, g...@github.com:hashicorp/hcl.git]
Resolving github.com/ianlancetaylor/demangle: 
commit='4883227f66371e02c4948937d3e2be1664d9be38', 
urls=[https://github.com/ianlancetaylor/demangle.git, 
g...@github.com:ianlancetaylor/demangle.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 

beam_PostCommit_PortableJar_Flink - Build # 1119 - Aborted

2020-01-15 Thread Apache Jenkins Server
The Apache Jenkins build system has built beam_PostCommit_PortableJar_Flink 
(build #1119)

Status: Aborted

Check console output at 
https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/1119/ to view 
the results.

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #3005

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[crites] Changes watermark advance from 1001 to 1000 since Dataflow TestStream


--
[...truncated 684 B...]
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 931108c7a104985b6c182385ddbc6bd767ba0127 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 931108c7a104985b6c182385ddbc6bd767ba0127
Commit message: "Merge pull request #10601 from acrites/window-resolution"
 > git rev-list --no-walk 003c89135754aac3f7e80f50523c8f7caa4ffcee # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use 
--status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Direct #2378

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
Started by GitHub push by asfgit
Started by GitHub push by asfgit
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2e06ca4d0a9249d1bd7f4bfe421c332648230fe3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2e06ca4d0a9249d1bd7f4bfe421c332648230fe3
Commit message: "Moving to 2.20.0-SNAPSHOT on master branch."
 > git rev-list --no-walk 2b07e0efb5db918d462873dcdd0055285fe7bf7a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 :runners:direct-java:validatesRunner
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task 

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5291

2020-01-15 Thread Apache Jenkins Server
lib/python2.7/site-packages/nose/plugins/multiprocess.py>",
 line 812, in run
test(orig)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 133, in run
self.runTest(result)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 58, in test_metrics_fnapi_it
result = self.run_pipeline(experiment='beam_fn_api')
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 39, in run_pipeline
test_pipeline = TestPipeline(is_integration_test=True)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py;,>
 line 108, in __init__
super(TestPipeline, self).__init__(runner, options)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py;,>
 line 184, in __init__
errors = PipelineOptionsValidator(self._options, runner).validate()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py;,>
 line 113, in validate
errors.extend(self.options.view_as(cls).validate(self))
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 591, in validate
self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 559, in _get_default_gcp_region
raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py;,>
 line 85, in check_output
out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 568, in check_output
output, unused_err = process.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 792, in communicate
stdout = _eintr_retry_call(self.stdout.read)
  File "/usr/lib/python2.7/subprocess.py", line 476, in _eintr_retry_call
return func(*args)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 903.663s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-191122
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:29f4f76f5d96f13496446bfaf03a73337a06431d2220c2433b020c378146a815
Deleted: sha256:ed9adfb99658a4edb9bc3273ee163db1a7f66bd21e95c00ace13b8dbdd945f64
Deleted: sha256:fd3b1aa3e68fbc79aca55796a3b6802fbedd898ef93de7004342444d84012311
Deleted: sha256:c7b86f71e5f4c3923064f0a5b18709b8a22cca69e0b2cfae7e3003d70885ff96
Deleted: sha256:424968546b942d924d975de9c6c78ca5891538bb33c7f72a649233cb49c1dd13
Deleted: sha256:b52cd1502dac2b96d1a2d97832ef1ab872aa63be573329bb3d5a741db2df9463
Deleted: sha256:a1cadadbd46eeb9e03070e88fc603e5e5a00cff77e31bbaf61d25880777e59e8
Deleted: sha256:26f90e190a090da3ff9736d522342c526e79fd43e86975362126d628ac9375e6
Deleted: sha256:cf49351007ec1c0f436d6963477632e808b5b181e61e7497122bb1eb63280b6b
Deleted: sha256:1ad46ac39f41fe6a943ac8eb609ff324c70125f6bcbe7b278fdf2a7bcf6f5fc5
Deleted: sha256:b22ed5febd1fb
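
The TimedOutException above is raised while PipelineOptionsValidator falls back to
_get_default_gcp_region(), which appears to shell out to an external command via
processes.check_output and never returns before the nose multiprocess plugin's
timeout fires. As a hedged sketch (not the project's actual fix), one way to keep
option validation from reaching that subprocess call is to pass --region explicitly;
this assumes the validator only performs the lookup when no region is supplied, and
the project, region and bucket values below are placeholders:

    # Sketch under stated assumptions: supplying --region up front should keep
    # PipelineOptionsValidator from calling _get_default_gcp_region(), which is
    # the subprocess call the traceback shows timing out.
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=my-gcp-project',            # placeholder project id
        '--region=us-central1',                # explicit region, no lookup needed
        '--temp_location=gs://my-bucket/tmp',  # placeholder bucket
    ])
    print(options.view_as(GoogleCloudOptions).region)  # prints: us-central1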

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #291

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7


--
[...truncated 28.58 KB...]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, 

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #3002

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-8676] sdks/java: gax and grpc upgrades (#10554)


--
[...truncated 12.03 KB...]
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:pipeline:shadowJar
> Task :model:job-management:compileJava
> Task :model:job-management:classes
> Task :model:job-management:shadowJar
> Task :model:fn-execution:compileJava
> Task :model:fn-execution:classes
> Task :model:fn-execution:shadowJar

> Task :sdks:java:core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava
> Task :runners:local-java:classes
> Task :runners:local-java:jar

> Task :vendor:sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :vendor:sdks-java-extensions-protobuf:classes
> Task :vendor:sdks-java-extensions-protobuf:shadowJar

> Task :sdks:java:fn-execution:compileJava
Note: 

 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:fn-execution:classes
> Task :sdks:java:fn-execution:jar

> Task :sdks:java:extensions:google-cloud-platform-core:compileJava
Note: 

 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:extensions:google-cloud-platform-core:classes
> Task :sdks:java:extensions:google-cloud-platform-core:jar

> Task :runners:core-construction-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-construction-java:classes
> Task :runners:core-construction-java:jar
> Task :sdks:java:core:compileTestJava

> Task :runners:core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:classes
> Task :runners:core-java:jar

> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar

> Task :sdks:java:core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:testClasses
> Task :sdks:java:core:shadowTestJar
> Task :sdks:java:harness:shadowJar

> Task :runners:core-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:core-java:testClasses
> Task :runners:core-java:testJar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> 

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #293

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)


--
[...truncated 480.60 KB...]
  Found link 
https://files.pythonhosted.org/packages/8d/c7/f05c87812fa5d9562ecbc5f4f1fc1570444f53c81c834a7f662af406e3c1/pip-0.5.tar.gz#sha256=328d8412782f22568508a0d0c78a49c9920a82e44c8dfca49954fe525c152b2a
 (from https://pypi.org/simple/pip/), version: 0.5
  Found link 
https://files.pythonhosted.org/packages/9a/aa/f536b6d14fe03343367da2ff44eee28f340ae650cd017ca088b6be13084a/pip-0.5.1.tar.gz#sha256=e27650538c41fe1007a41abd4cfd0f905b822622cbe1f8e7e09d1215af207694
 (from https://pypi.org/simple/pip/), version: 0.5.1
  Found link 
https://files.pythonhosted.org/packages/db/e6/fdf7be8a17b032c533d3f91e91e2c63dd81d3627cbe4113248a00c2d39d8/pip-0.6.tar.gz#sha256=4cf47db6815b2f435d1f44e1f35ff04823043f6161f7df9aec71a123b0c47f0d
 (from https://pypi.org/simple/pip/), version: 0.6
  Found link 
https://files.pythonhosted.org/packages/91/cd/105f4d3c75d0ae18e12623acc96f42168aaba408dd6e43c4505aa21f8e37/pip-0.6.1.tar.gz#sha256=efe47e84ffeb0ea4804f9858b8a94bebd07f5452f907ebed36d03aed06a9f9ec
 (from https://pypi.org/simple/pip/), version: 0.6.1
  Found link 
https://files.pythonhosted.org/packages/1c/c7/c0e1a9413c37828faf290f29a85a4d6034c145cc04bf1622ba8beb662ad8/pip-0.6.2.tar.gz#sha256=1c1a504d7e70d2c24246f95bd16e3d5fcec740fd144df69a407bf65a2ee67586
 (from https://pypi.org/simple/pip/), version: 0.6.2
  Found link 
https://files.pythonhosted.org/packages/3f/af/c4b9d49fb0f286996b28dbc0955c3ad359794697eb98e0e69863908070b0/pip-0.6.3.tar.gz#sha256=1a6df71eb29b98cba11bde6d6a0d8c6dd8b0518e74ceb71fb31ea4fbb42fd313
 (from https://pypi.org/simple/pip/), version: 0.6.3
  Found link 
https://files.pythonhosted.org/packages/ec/7a/6fe91ff0079ad0437830957c459d52f3923e516f5b453218f2a93d09a427/pip-0.7.tar.gz#sha256=ceaea0b9e494d893c8a191895301b79c1db33e41f14d3ad93e3d28a8b4e9bf27
 (from https://pypi.org/simple/pip/), version: 0.7
  Found link 
https://files.pythonhosted.org/packages/a5/63/11303863c2f5e9d9a15d89fcf7513a4b60987007d418862e0fb65c09fff7/pip-0.7.1.tar.gz#sha256=f54f05aa17edd0036de433c44892c8fedb1fd2871c97829838feb995818d24c3
 (from https://pypi.org/simple/pip/), version: 0.7.1
  Found link 
https://files.pythonhosted.org/packages/cd/a9/1debaa96bbc1005c1c8ad3b79fec58c198d35121546ea2e858ce0894268a/pip-0.7.2.tar.gz#sha256=98df2eb779358412bbbae75980171ae85deebc846d87e244d086520b1212da09
 (from https://pypi.org/simple/pip/), version: 0.7.2
  Found link 
https://files.pythonhosted.org/packages/74/54/f785c327fb3d163560a879b36edae5c78ee07806be282c9d4807f6be7dd1/pip-0.8.tar.gz#sha256=9017e4484a212dd4e1a43dd9f039dd7fc8338d4eea1c339d5ae1c80726de5b0f
 (from https://pypi.org/simple/pip/), version: 0.8
  Found link 
https://files.pythonhosted.org/packages/5c/79/5e8381cc3078bae92166f2ba96de8355e8c181926505ba8882f7b099a500/pip-0.8.1.tar.gz#sha256=7176a87f35675f6468341212f3b959bb51d23ea66eb1c3692bf746c45c716fa2
 (from https://pypi.org/simple/pip/), version: 0.8.1
  Found link 
https://files.pythonhosted.org/packages/17/3e/0a98ab032991518741e7e712a719633e6ae160f51b3d3e855194530fd308/pip-0.8.2.tar.gz#sha256=f80a3549c048bc3bbcb47844826e9c7c6fcd87e77b92bef0d9e66d1b397c4962
 (from https://pypi.org/simple/pip/), version: 0.8.2
  Found link 
https://files.pythonhosted.org/packages/f7/9a/943fc6d879ed7220bac2e7e53096bfe78abec88d77f2f516400e0129679e/pip-0.8.3.tar.gz#sha256=1be2e18edd38aa75b5e4ef38a99ec33ba9247177cfcb4a6d2d2b3e73430e3001
 (from https://pypi.org/simple/pip/), version: 0.8.3
  Found link 
https://files.pythonhosted.org/packages/24/33/6eb675fb6db7b71d69d6928b33dea61b8bf5cfe1e5649be70ec84ce2fc09/pip-1.0.tar.gz#sha256=34ba07e2d14ba86d5088ba896ac80bed845a9b276ab8acb279b8d99bc77fec8e
 (from https://pypi.org/simple/pip/), version: 1.0
  Found link 
https://files.pythonhosted.org/packages/10/d9/f584e6107ef98ad7e5d0f756bfee12561fa6a4712ffdb7209e0e1fd4/pip-1.0.1.tar.gz#sha256=37d2f18213d3845d2038dd3686bc71fc12bb41ad66c945a8b0dfec2879f3497b
 (from https://pypi.org/simple/pip/), version: 1.0.1
  Found link 
https://files.pythonhosted.org/packages/16/90/5e6f80364d8a656f60681dfb7330298edef292d43e1499bcb3a4c71ff0b9/pip-1.0.2.tar.gz#sha256=a6ed9b36aac2f121c01a2c9e0307a9e4d9438d100a407db701ac65479a3335d2
 (from https://pypi.org/simple/pip/), version: 1.0.2
  Found link 
https://files.pythonhosted.org/packages/25/57/0d42cf5307d79913a082c5c4397d46f3793bc35e1138a694136d6e31be99/pip-1.1.tar.gz#sha256=993804bb947d18508acee02141281c77d27677f8c14eaa64d6287a1c53ef01c8
 (from https://pypi.org/simple/pip/), version: 1.1
  Found link 
https://files.pythonhosted.org/packages/ba/c3/4e1f892f41aaa217fe0d1f827fa05928783349c69f3cc06fdd68e112678a/pip-1.2.tar.gz#sha256=2b168f1987403f1dc6996a1f22a6f6637b751b7ab6ff27e78380b8d6e70aa314
 (from https://pypi.org/simple/pip/), version: 1.2
  Found link 

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #2056

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[hannahjiang] [BEAM-9084] cleaning up docker image tag

[hannahjiang] [BEAM-9084] fix Java spotless


--
[...truncated 119.41 KB...]
cls._job_endpoint = cls._create_job_endpoint()
  File "apache_beam/runners/portability/portable_runner_test.py", line 154, in 
_create_job_endpoint
return cls._start_local_runner_subprocess_job_service()
  File "apache_beam/runners/portability/portable_runner_test.py", line 129, in 
_start_local_runner_subprocess_job_service
cls._subprocess.returncode)
RuntimeError: Subprocess terminated unexpectedly with exit code 1.

==
ERROR: test_pardo_state_timers (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 374, in 
test_pardo_state_timers
self._run_pardo_state_timers(False)
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 423, in 
_run_pardo_state_timers
with self.create_pipeline() as p:
  File "apache_beam/runners/portability/portable_runner_test.py", line 196, in 
create_pipeline
return beam.Pipeline(self.get_runner(), self.create_options())
  File 
"
 line 91, in create_options
options = super(SparkRunnerTest, self).create_options()
  File "apache_beam/runners/portability/portable_runner_test.py", line 184, in 
create_options
options.view_as(PortableOptions).job_endpoint = self._get_job_endpoint()
  File "apache_beam/runners/portability/portable_runner_test.py", line 148, in 
_get_job_endpoint
cls._job_endpoint = cls._create_job_endpoint()
  File "apache_beam/runners/portability/portable_runner_test.py", line 154, in 
_create_job_endpoint
return cls._start_local_runner_subprocess_job_service()
  File "apache_beam/runners/portability/portable_runner_test.py", line 129, in 
_start_local_runner_subprocess_job_service
cls._subprocess.returncode)
RuntimeError: Subprocess terminated unexpectedly with exit code 1.

==
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in 
test_pardo_state_with_custom_key_coder
with self.create_pipeline() as p:
  File "apache_beam/runners/portability/portable_runner_test.py", line 196, in 
create_pipeline
return beam.Pipeline(self.get_runner(), self.create_options())
  File 
"
 line 91, in create_options
options = super(SparkRunnerTest, self).create_options()
  File "apache_beam/runners/portability/portable_runner_test.py", line 184, in 
create_options
options.view_as(PortableOptions).job_endpoint = self._get_job_endpoint()
  File "apache_beam/runners/portability/portable_runner_test.py", line 148, in 
_get_job_endpoint
cls._job_endpoint = cls._create_job_endpoint()
  File "apache_beam/runners/portability/portable_runner_test.py", line 154, in 
_create_job_endpoint
return cls._start_local_runner_subprocess_job_service()
  File "apache_beam/runners/portability/portable_runner_test.py", line 129, in 
_start_local_runner_subprocess_job_service
cls._subprocess.returncode)
RuntimeError: Subprocess terminated unexpectedly with exit code 1.

==
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in 
test_pardo_timers
with self.create_pipeline() as p:
  File "apache_beam/runners/portability/portable_runner_test.py", line 196, in 
create_pipeline
return beam.Pipeline(self.get_runner(), self.create_options())
  File 
"
 line 91, in create_options
options = super(SparkRunnerTest, self).create_options()
  File "apache_beam/runners/portability/portable_runner_test.py", line 184, in 
create_options
options.view_as(PortableOptions).job_endpoint = self._get_job_endpoint()
  File "apache_beam/runners/portability/portable_runner_test.py", line 148, in 
_get_job_endpoint
cls._job_endpoint = 
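
Every SparkRunnerTest error in this run fails the same way: _start_local_runner_subprocess_job_service()
finds that the job-server child process has already exited and raises RuntimeError
before any pipeline is built. Below is a minimal, self-contained sketch of that
general check-the-child pattern, not the helper's actual code; the python -c child
is a stand-in that exits with code 1, rather than the Spark job server the test
actually launches:

    # Start a child process, wait briefly, and fail fast if it has already
    # terminated, mirroring the RuntimeError seen in the traceback.
    import subprocess
    import sys
    import time

    proc = subprocess.Popen([sys.executable, '-c', 'import sys; sys.exit(1)'])
    time.sleep(1)
    if proc.poll() is not None:
        raise RuntimeError(
            'Subprocess terminated unexpectedly with exit code %d.' % proc.returncode)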

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3822

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-9030] Migrate Beam to use beam-vendor-grpc-1_26_0 (#10578)


--
[...truncated 3.76 KB...]
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copySourceOverrides
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:processResources
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :sdks:java:build-tools:compileJava
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:pipeline:shadowJar
> Task :model:job-management:compileJava
> Task :model:job-management:classes
> Task :model:job-management:shadowJar
> Task :model:fn-execution:compileJava
> Task :model:fn-execution:classes
> Task :model:fn-execution:shadowJar

> Task :sdks:java:core:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:core:jar

> Task :vendor:sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :vendor:sdks-java-extensions-protobuf:classes

> Task :sdks:java:extensions:protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:extensions:protobuf:classes

> Task :sdks:java:fn-execution:compileJava
Note: 

 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with 

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1461

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
[...truncated 5.14 MB...]
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 131: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
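
The root failure the Flink operators report here is a Python-side assertion: the
assert_that/Match step saw an empty output where ['a: 3', 'b: 1', 'c: 2'] was
expected, and BeamAssertException lists those values as missing elements. A small
sketch of the assert_that/equal_to pattern that produces this kind of message is
below; the Create/Map pipeline is a placeholder, not the cross-language test that
actually ran:

    # Sketch of the assert_that/equal_to check behind the
    # "Failed assert: [...] == [], missing elements [...]" message above.
    # The input data here is illustrative only.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        formatted = (
            p
            | beam.Create([('a', 3), ('b', 1), ('c', 2)])
            | beam.Map(lambda kv: '%s: %d' % kv))
        # If `formatted` came back empty, this would raise BeamAssertException
        # and report the expected values as missing, as in the log.
        assert_that(formatted, equal_to(['a: 3', 'b: 1', 'c: 2']))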

Build failed in Jenkins: beam_PostCommit_PortableJar_Spark #290

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[boyuanz] Moving to 2.20.0-SNAPSHOT on master branch.


--
[...truncated 29.56 KB...]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
g...@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: 
commit='0ad87eef1443f64d3d8c50da647e2b1552851124', 
urls=[https://github.com/nightlyone/lockfile, 
g...@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: 
commit='3741243b287094fda649c7f0fa74bd51f37dc122', 
urls=[https://github.com/openzipkin/zipkin-go.git, 
g...@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: 
commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', 
urls=[https://github.com/pelletier/go-toml.git, 
g...@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: 
commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', 
urls=[https://github.com/spf13/afero.git, g...@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: 
commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', 
urls=[https://github.com/spf13/cast.git, g...@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: 
commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', 
urls=[https://github.com/spf13/cobra.git, g...@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: 
commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', 
urls=[https://github.com/spf13/jwalterweatherman.git, 
g...@github.com:spf13/jwalterweatherman.git]

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #3001

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7


--
[...truncated 793 B...]
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1cd8e9b9cf6fe62ae13b334b0960315445e6f3fe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1cd8e9b9cf6fe62ae13b334b0960315445e6f3fe
Commit message: "Merge pull request #9944: [BEAM-8525] Support Const base in 
binary_subscr"
 > git rev-list --no-walk 2e06ca4d0a9249d1bd7f4bfe421c332648230fe3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :model:fn-execution:processResources
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink_Streaming #3820

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7


--
Started by GitHub push by udim
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1cd8e9b9cf6fe62ae13b334b0960315445e6f3fe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1cd8e9b9cf6fe62ae13b334b0960315445e6f3fe
Commit message: "Merge pull request #9944: [BEAM-8525] Support Const base in 
binary_subscr"
 > git rev-list --no-walk 2e06ca4d0a9249d1bd7f4bfe421c332648230fe3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:flink:1.9:job-server:validatesPortableRunnerStreaming
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:portability:java:processTestResources NO-SOURCE
> Task :runners:flink:1.9:copySourceOverrides
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:fn-execution:processResources
> Task :runners:flink:1.9:processResources
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copyTestSourceOverrides
> Task :runners:flink:1.9:processTestResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> 

Build failed in Jenkins: beam_PostCommit_Py_ValCont #5292

2020-01-15 Thread Apache Jenkins Server
ont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py>",
 line 812, in run
test(orig)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 133, in run
self.runTest(result)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/case.py;,>
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
testMethod()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 58, in test_metrics_fnapi_it
result = self.run_pipeline(experiment='beam_fn_api')
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py;,>
 line 39, in run_pipeline
test_pipeline = TestPipeline(is_integration_test=True)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/testing/test_pipeline.py;,>
 line 108, in __init__
super(TestPipeline, self).__init__(runner, options)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py;,>
 line 184, in __init__
errors = PipelineOptionsValidator(self._options, runner).validate()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options_validator.py;,>
 line 113, in validate
errors.extend(self.options.view_as(cls).validate(self))
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 591, in validate
self.view_as(GoogleCloudOptions).region = self._get_default_gcp_region()
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/options/pipeline_options.py;,>
 line 559, in _get_default_gcp_region
raw_output = processes.check_output(cmd, stderr=DEVNULL)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/utils/processes.py;,>
 line 85, in check_output
out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 568, in check_output
output, unused_err = process.communicate()
  File "/usr/lib/python2.7/subprocess.py", line 792, in communicate
stdout = _eintr_retry_call(self.stdout.read)
  File "/usr/lib/python2.7/subprocess.py", line 476, in _eintr_retry_call
return func(*args)
  File 
"<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py;,>
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)'
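The hang happens inside pipeline_options._get_default_gcp_region(), which shells out through processes.check_output() and blocks until the nose multiprocess plugin's per-test timeout raises TimedOutException. Below is a rough Python 3 sketch of that lookup with an explicit subprocess timeout; the gcloud command and the 30-second limit are assumptions, since this log only shows check_output(cmd, stderr=DEVNULL):

import subprocess


def get_default_gcp_region(timeout_secs=30):
    # Assumed command; the traceback only shows processes.check_output(cmd, ...).
    cmd = ['gcloud', 'config', 'get-value', 'compute/region']
    try:
        # Python 3's subprocess accepts a timeout; the Python 2.7 call in the
        # traceback does not, so a stuck gcloud invocation blocks until nose's
        # process-level timeout fires instead.
        raw = subprocess.check_output(cmd, stderr=subprocess.DEVNULL,
                                      timeout=timeout_secs)
        return raw.decode('utf-8').strip() or None
    except (OSError, subprocess.SubprocessError):
        # Covers a missing gcloud binary, a non-zero exit, and TimeoutExpired.
        return None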

--
XML: nosetests-python2.7_sdk.xml
--
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
--
Ran 2 tests in 903.864s

FAILED (errors=2)
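Both errors are per-test timeouts from nose's multiprocess plugin rather than assertion failures. For reference, a hypothetical programmatic nose invocation with a larger limit is sketched below; the module path is taken from the traceback, but the --processes and --process-timeout values are illustrative only:

import nose

# --process-timeout is the multiprocess plugin option whose expiry raises
# TimedOutException; the values here are examples, not the job's settings.
nose.run(argv=[
    'nosetests',
    '--processes=2',
    '--process-timeout=1800',
    'apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test',
])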
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-200349
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:8d2b6bc20510f42a489622fb1f54d0caa207643e4ab66a7fc3527d1f52e8a771
Deleted: sha256:242731f156d9ee0c004457b3269eaa7eecdbcc1671055f63bdb74ef4f5296a1b
Deleted: sha256:4482a7c228465c068e8f98d78891a01cf4f7f100d5c714b09b1dfda44424fdf2
Deleted: sha256:38d1324f42522a40a82dd9576f7f9776212cd4145b77a264b2144733877545a4
Deleted: sha256:4b38c3211577a3f7159df903577d2dc076a88fe5b773d77e4c589cf7b5964f1f
Deleted: sha256:fdefa401b6c0448f91322c6192093c570edae93ba6cb855ee41f9eac29194b67
Deleted: sha256:3df63dcdffd497dc8bd118eb2b287cd3685df5a725509e6a4037cf4703686854
Deleted: sha256:18c7c487e0e137b101fe9865594dc954bba510aa133b7d1f4016f9cea8070050
Deleted: sha256:9cf99fd62b76ed959252e9b1be0d27b6bbe0cfc69cc1d2c4eb4d0d8636359ea5
Deleted: sha256:137b8ae2547e810fb05203886bfc91e21a49053ae0b8f0e24de430b07ddfb306
Deleted: sha256:4fe
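The Untagged:/Deleted: lines are plain docker rmi output from the cleanup_container step. A hedged sketch of that step follows; the helper name matches the log, but the --force flag and the way the tag is passed are assumptions, since the script itself is not shown here:

import subprocess


def cleanup_container(image_tag):
    # docker prints "Untagged: <ref>" for the removed reference and
    # "Deleted: sha256:..." for each layer it garbage-collects.
    subprocess.check_call(['docker', 'rmi', '--force', image_tag])


cleanup_container(
    'us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20200115-200349')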

Build failed in Jenkins: beam_PostCommit_Python35 #1466

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[ehudm] [BEAM-8525] Support Const base in binary_subscr

[ehudm] Do not perform test on Py2.7


--
[...truncated 2.54 MB...]
Traceback (most recent call last):
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579119997.620134827","description":"Error received from peer 
ipv4:127.0.0.1:35113","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
self._target(*self._args, **self._kwargs)
  File 
"
 line 137, in run
for work_request in control_stub.Control(get_responses()):
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579119997.620326676","description":"Error received from peer 
ipv4:127.0.0.1:38921","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
self._target(*self._args, **self._kwargs)
  File 
"
 line 649, in pull_responses
for response in responses:
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 line 703, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = 
"{"created":"@1579119997.620145961","description":"Error received from peer 
ipv4:127.0.0.1:34271","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Socket
 closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
self._target(*self._args, **self._kwargs)
  File 
"
 line 438, in 
target=lambda: self._read_inputs(elements_iterator),
  File 
"
 line 423, in _read_inputs
for elements in elements_iterator:
  File 
"
 line 416, in __next__
return self._next()
  File 
"
 

Jenkins build is back to normal : beam_PostCommit_Python37 #1369

2020-01-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Py_ValCont #5301

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 630.65 KB...]
copying apache_beam/transforms/display.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/display_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/dofn_lifecycle_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/environments.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/environments_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test_it.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test_py3.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/external_test_py37.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/ptransform.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/ptransform_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/py_dataflow_distribution_counter.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/sideinputs.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/sideinputs_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/stats.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/stats_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/timeutil.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/transforms_keyword_only_args_test_py3.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/transforms/write_ptransform_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/transforms
copying apache_beam/typehints/__init__.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators_test_py3.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/schemas.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/schemas_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test_py3.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test_py3.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test_py3.py -> 
apache-beam-2.20.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> 
apache-beam-2.20.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> 
apache-beam-2.20.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> 
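The long run of copying ... -> apache-beam-2.20.0.dev0/... lines is ordinary setuptools sdist output: the SDK sources are staged into a versioned directory before being archived. It is roughly equivalent to the call below; the working directory is inferred from the workspace paths in the log and is an assumption:

import subprocess
import sys

# Build the source distribution; each file staged into the release tree is
# reported with one "copying ..." line like those above.
subprocess.check_call([sys.executable, 'setup.py', 'sdist'],
                      cwd='src/sdks/python')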

Build failed in Jenkins: beam_PostCommit_PortableJar_Flink #1126

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 41.34 KB...]
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 
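The repeated root: Generating grammar tables ... and RefactoringTool: ... lines come from lib2to3, which the Python 2.7 build runs over the sources; the grammar message is logged on the root logger by lib2to3's pgen2 driver when its pickled grammar has to be rebuilt, hence the root: prefix. A small self-contained illustration of the same machinery is below; the fixer selection and sample source are illustrative only:

from lib2to3 import refactor

# Load the standard fixers and refactor a tiny Python 2 snippet. Rebuilding
# the grammar pickles (when they are missing) produces the "Generating
# grammar tables" messages seen above.
fixers = refactor.get_fixers_from_package('lib2to3.fixes')
tool = refactor.RefactoringTool(fixers)
print(tool.refactor_string('print "hello"\n', '<example>'))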

Build failed in Jenkins: beam_PostCommit_XVR_Flink #1471

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 5.14 MB...]
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 941, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 497, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
self.output_processor.process_outputs(
  File "apache_beam/runners/common.py", line 1028, in 
apache_beam.runners.common._OutputProcessor.process_outputs
self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 178, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 657, in 
apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 658, in 
apache_beam.runners.worker.operations.DoOperation.process
delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 878, in 
apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
  File "apache_beam/runners/common.py", line 885, in 
apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 956, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 883, in 
apache_beam.runners.common.DoFnRunner.process
return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 498, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
windowed_value, self.process_method(windowed_value.value))
  File 
"
 line 1437, in 
  File 
"
 line 191, in _equal
BeamAssertException: Failed assert: ['a: 3', 'b: 1', 'c: 2'] == [], missing 
elements ['a: 3', 'b: 1', 'c: 2'] [while running 'assert_that/Match']

at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
at 
org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
at 
org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
at 
org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for 
instruction 135: Traceback (most recent call last):
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 167, in _execute
response = task()
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 223, in 
lambda: self.create_worker().do_instruction(request), request)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 352, in do_instruction
request.instruction_id)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py",
 line 386, in process_bundle
bundle_processor.process_bundle(instruction_id))
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 812, in process_bundle
data.transform_id].process_encoded(data.data)
  File 
"/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py",
 line 205, in process_encoded
self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 302, in 

Build failed in Jenkins: beam_PostCommit_Java_PortabilityApi #3853

2020-01-15 Thread Apache Jenkins Server
See 


Changes:


--
[...truncated 32.31 KB...]
Resolving github.com/pierrec/lz4: 
commit='ed8d4cc3b461464e69798080a0092bd028910298', 
urls=[https://github.com/pierrec/lz4.git, g...@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: 
commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', 
urls=[https://github.com/pierrec/xxHash.git, g...@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: 
commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', 
urls=[https://github.com/pkg/errors.git, g...@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: 
commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', 
urls=[https://github.com/pkg/sftp.git, g...@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: 
commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', 
urls=[https://github.com/prometheus/client_golang.git, 
g...@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: 
commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', 
urls=[https://github.com/prometheus/procfs.git, 
g...@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: 
commit='8732c616f52954686704c8645fe1a9d59e9df7c1', 
urls=[https://github.com/rcrowley/go-metrics.git, 
g...@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: 
commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', 
urls=[https://github.com/cpuguy83/go-md2man.git, 
g...@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: 
commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', 
urls=[https://github.com/spf13/afero.git, g...@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: 
commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', 
urls=[https://github.com/spf13/cast.git, g...@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: 
commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', 
urls=[https://github.com/spf13/cobra.git, g...@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: 
commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', 
urls=[https://github.com/spf13/jwalterweatherman.git, 
g...@github.com:spf13/jwalterweatherman.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/spf13/viper: 
commit='aafc9e6bc7b7bb53ddaa75a5ef49a17d6e654be5', 
urls=[https://github.com/spf13/viper.git, g...@github.com:spf13/viper.git]
Resolving github.com/stathat/go: 
commit='74669b9f388d9d788c97399a0824adbfee78400e', 
urls=[https://github.com/stathat/go.git, g...@github.com:stathat/go.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/xordataexchange/crypt: 
commit='b2862e3d0a775f18c7cfe02273500ae307b61218', 
urls=[https://github.com/xordataexchange/crypt.git, 
g...@github.com:xordataexchange/crypt.git]
Resolving go.opencensus.io: commit='aa2b39d1618ef56ba156f27cfcdae9042f68f0bc', 
urls=[https://github.com/census-instrumentation/opencensus-go]

> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink #6365

2020-01-15 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-6008] Make sure to end stream only after sending all messages and


--
Started by GitHub push by mxm
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4fc64ce9de73d0ebfc41e6aa04a4c6a5428a79dd
Commit message: "Merge pull request #10583: [BEAM-6008] Make sure to end stream 
only after sending all messages and state updates"
 > git rev-list --no-walk 6bc5f6ea1ba117d7e2604ea0a1e83c6caa25fa8e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g :runners:flink:1.9:validatesRunner
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:flink:1.9:copyResourcesOverrides NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :runners:flink:1.9:copySourceOverrides
> Task :model:job-management:processResources
> Task :runners:flink:1.9:copyTestResourcesOverrides NO-SOURCE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.9:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task 
