Build failed in Jenkins: beam_PostCommit_Python_Verify #4766

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[daniel.o.programmer] [BEAM-3513] Removing PrimitiveCombineGroupedValues 
override w/ FnAPI.

--
[...truncated 1.06 MB...]
test_type_check_violation_invalid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_composite_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type_arbitrary_length 
(apache_beam.
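
The [BEAM-3513] change listed above concerns a runner override for Combine applied to already-grouped values. For orientation only, a minimal sketch of the user-facing transform that such overrides replace, written with the Java SDK for illustration (the change itself targets runner-internal code, and every name below is illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class CombineGroupedValuesSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // GroupByKey followed by Combine.groupedValues is the composite shape that
        // runners may replace with a single primitive combine step.
        PCollection<KV<String, Integer>> sums =
            p.apply(Create.of(KV.of("a", 1), KV.of("a", 2), KV.of("b", 3)))
                .apply(GroupByKey.create())
                .apply(Combine.groupedValues(Sum.ofIntegers()));
        p.run().waitUntilFinish();
      }
    }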

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #154

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_Verify #4767

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[geet.kumar75] BEAM-4038: Support Kafka Headers in KafkaIO

[geet.kumar75] BEAM-4038: Update license information

[geet.kumar75] Update code formatting

[geet.kumar75] Remove custom implementations of KafkaHeader, KafkaHeaders, etc.

[geet.kumar75] Changes based on review comments

[geet.kumar75] Support kafka versions 0.10.1.0 and above

[kirpichov] Consistently handle EmptyMatchTreatment

--
[...truncated 1.06 MB...]
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
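
The BEAM-4038 changes above add Kafka header support to KafkaIO for Kafka clients 0.10.1.0 and above. A minimal sketch of how a pipeline might reach those headers, assuming they are exposed through KafkaRecord.getHeaders(); the broker address, topic, and element handling are illustrative, not taken from this build:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.kafka.KafkaRecord;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.kafka.common.header.Header;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class KafkaHeadersSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(KafkaIO.<String, String>read()
                .withBootstrapServers("localhost:9092")   // illustrative broker
                .withTopic("events")                      // illustrative topic
                .withKeyDeserializer(StringDeserializer.class)
                .withValueDeserializer(StringDeserializer.class))
            .apply(ParDo.of(new DoFn<KafkaRecord<String, String>, String>() {
              @ProcessElement
              public void processElement(ProcessContext c) {
                // getHeaders() is the accessor the BEAM-4038 change introduces.
                for (Header h : c.element().getHeaders()) {
                  c.output(h.key());
                }
              }
            }));
        p.run().waitUntilFinish();
      }
    }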

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #185

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

--
Started by GitHub push by iemejia
[EnvInject] - Loading node environment variables.
Building remotely on beam8 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 14525484ac19e295ca2811323c04af4a10a2477e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 14525484ac19e295ca2811323c04af4a10a2477e
Commit message: "Merge pull request #4548: [BEAM-3310] Add metrics pusher"
 > git rev-list --no-walk e6681e8aac364b11304fdd4a85642a89307eeaee # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --info --continue --max-workers=2 :beam-runners-spark:validatesRunner
Initialized native services in: /home/jenkins/.gradle/native
Using 2 worker leases.
Starting Build
Settings evaluated using settings file 
'
Using local directory build cache for the root build (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).
Projects loaded. Root project using build fil
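
The [echauchot] series above (merged as pull request #4548, [BEAM-3310]) introduces a MetricsPusher that periodically hands aggregated metrics to a pluggable sink: per the commit subjects, MetricsSink is an interface with a single writeMetrics method, configured through PipelineOptions and instantiated by reflection. A sketch of what a custom sink could look like under that contract; the interface and method names follow later Beam releases and may differ from this snapshot of the code:

    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsSink;

    // Hypothetical sink that just prints counter values each time the pusher fires.
    public class LoggingMetricsSink implements MetricsSink {
      @Override
      public void writeMetrics(MetricQueryResults metricQueryResults) throws Exception {
        for (MetricResult<Long> counter : metricQueryResults.getCounters()) {
          System.out.println(counter.getName() + " = " + counter.getAttempted());
        }
      }
    }

Wiring it in would then go through the options surface the commits describe, e.g. options.as(MetricsOptions.class).setMetricsSink(LoggingMetricsSink.class), with the push period controlled by a companion option (again, option names assumed from later releases).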

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #206

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

--
Started by GitHub push by iemejia
[EnvInject] - Loading node environment variables.
Building remotely on beam8 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 14525484ac19e295ca2811323c04af4a10a2477e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 14525484ac19e295ca2811323c04af4a10a2477e
Commit message: "Merge pull request #4548: [BEAM-3310] Add metrics pusher"
 > git rev-list --no-walk e6681e8aac364b11304fdd4a85642a89307eeaee # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --info --continue --max-workers=2 :beam-runners-flink_2.11:validatesRunner
Initialized native services in: /home/jenkins/.gradle/native
Using 2 worker leases.
Starting Build
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/4.6/fileHashes/fileHashes.bin
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/4.6/fileHashes/resourceHashesCache.bin
Settings evaluated using settings file 
'
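
The [kirpichov] entry above ("Consistently handle EmptyMatchTreatment") refers to the knob controlling whether a filepattern that matches zero files is treated as an error. A minimal sketch of the user-facing surface it applies to, with an illustrative filepattern:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.io.fs.EmptyMatchTreatment;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class EmptyMatchSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // ALLOW yields an empty PCollection on an empty match instead of failing.
        p.apply(TextIO.read()
            .from("gs://example-bucket/input-*.txt")   // illustrative pattern
            .withEmptyMatchTreatment(EmptyMatchTreatment.ALLOW));
        p.run().waitUntilFinish();
      }
    }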

Build failed in Jenkins: beam_PostCommit_Python_Verify #4768

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

--
[...truncated 1.06 MB...]
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDe

Build failed in Jenkins: beam_PostCommit_Python_Verify #4769

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[rangadi] Add 10 millis sleep when there are no elements left in a partition.

--
[...truncated 1.06 MB...]
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_types 
(apache_beam.typehints.typehints_test

Build failed in Jenkins: beam_PerformanceTests_Python #1185

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[geet.kumar75] BEAM-4038: Support Kafka Headers in KafkaIO

[geet.kumar75] BEAM-4038: Update license information

[geet.kumar75] Update code formatting

[geet.kumar75] Remove custom implementations of KafkaHeader, KafkaHeaders, etc.

[geet.kumar75] Changes based on review comments

[daniel.o.programmer] [BEAM-3513] Removing PrimitiveCombineGroupedValues 
override w/ FnAPI.

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[geet.kumar75] Support kafka versions 0.10.1.0 and above

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[tgroh] Use Existing Matchers in WatermarkManagerTest

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

[rangadi] Add 10 millis sleep when there are no elements left in a partition.

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam3 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0f2ba71e1b6db88ed79744e363586a8ff16dcb08 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0f2ba71e1b6db88ed79744e363586a8ff16dcb08
Commit message: "Merge pull request #5195: Use Existing Matchers in 
WatermarkManagerTest"
 > git rev-list --no-walk 247a62ff1d4368f1e7c2ade6bed5dec71d8d2bcc # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5809426347260041593.sh
+ rm -rf PerfKitBenchmarker
[beam

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #208

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Use Existing Matchers in WatermarkManagerTest

--
[...truncated 99.43 MB...]
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Combine.perKey(Concatenate)
 -> 
View.AsSingleton/Combine.GloballyAsSingletonView/View.CreatePCollectionView/Combine.globally(Concatenate)/Values/Values/Map/ParMultiDo(Anonymous)
 -> (Map, Map) (1/1) (89d6b16107dcde928321886e1b0e6430) [FINISHED]
Apr 24, 2018 12:05:19 AM grizzled.slf4j.Logger info
INFO: Un-registering task and sending final execution state FINISHED to 
JobManager for task PAssert$291/GroupGlobally/GroupDummyAndContents -> 
PAssert$291/GroupGlobally/Values/Values/Map/ParMultiDo(Anonymous) -> 
PAssert$291/GroupGlobally/ParDo(Concat)/ParMultiDo(Concat) -> 
PAssert$291/GetPane/Map/ParMultiDo(Anonymous) -> 
PAssert$291/RunChecks/ParMultiDo(GroupedValuesChecker) -> 
PAssert$291/VerifyAssertions/ParDo(DefaultConclude)/ParMultiDo(DefaultConclude) 
(79730c6bdd21b511d181618629a87cd8)
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: 
Combine.globally(TestCombineFnWithContext)/Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
Combine.globally(TestCombineFnWithContext)/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$293/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$293/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$293/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$293/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (62ef92a0802202a1b4df811e62bf7a9d) switched from 
RUNNING to FINISHED.
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
Combine.globally(TestCombineFnWithContext)/Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
Combine.globally(TestCombineFnWithContext)/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$293/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$293/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$293/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$293/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (62ef92a0802202a1b4df811e62bf7a9d).
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
Combine.globally(TestCombineFnWithContext)/Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
Combine.globally(TestCombineFnWithContext)/Values/Values/Map/ParMultiDo(Anonymous)
 -> PAssert$293/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$293/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$293/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$293/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (62ef92a0802202a1b4df811e62bf7a9d) [FINISHED]
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: 
Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> PAssert$292/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$292/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$292/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$292/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (bafcc738b0f5f4ab76efea5f4f998fc4) switched from 
RUNNING to FINISHED.
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Freeing task resources for 
Combine.perKey(TestCombineFnWithContext)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> PAssert$292/GroupGlobally/Window.Into()/Window.Assign.out -> 
PAssert$292/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
PAssert$292/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)
 -> PAssert$292/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign.out 
-> ToKeyedWorkItem (1/1) (bafcc738b0f5f4ab76efea5f4f998fc4).
Apr 24, 2018 12:05:19 AM org.apache.flink.runtime.taskmanager.Task run
INFO: Ensuring all FileSystem streams are closed for task 
Combine.perKey(TestCombineFnWithContext)/
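
The Flink task names above are fused chains of Beam transforms: a Combine.globally (expanded into Combine.perKey plus Combine.GroupedValues) followed by PAssert's checking machinery (GroupGlobally, RunChecks, VerifyAssertions). A stripped-down sketch of the pipeline shape those names come from, using a plain Sum rather than the suite's TestCombineFnWithContext:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollection;

    public class CombineGloballySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<Integer> sum =
            p.apply(Create.of(1, 2, 3))
                .apply(Combine.globally(Sum.ofIntegers()));
        // PAssert expands into the GroupGlobally/RunChecks steps seen in the log.
        PAssert.thatSingleton(sum).isEqualTo(6);
        p.run().waitUntilFinish();
      }
    }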

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #187

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Use Existing Matchers in WatermarkManagerTest

--
[...truncated 523.19 KB...]
Gradle Test Executor 236 started executing tests.
Gradle Test Executor 236 finished executing tests.
Starting process 'Gradle Test Executor 237'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 237'
Successfully started process 'Gradle Test Executor 237'
Gradle Test Executor 237 started executing tests.
Gradle Test Executor 237 finished executing tests.
Starting process 'Gradle Test Executor 238'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 238'
Successfully started process 'Gradle Test Executor 238'
Gradle Test Executor 238 started executing tests.
Gradle Test Executor 238 finished executing tests.
Starting process 'Gradle Test Executor 239'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 239'
Successfully started process 'Gradle Test Executor 239'
Gradle Test Executor 239 started executing tests.
Gradle Test Executor 239 finished executing tests.
Starting process 'Gradle Test Executor 240'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 240'
Successfully started process 'Gradle Test Executor 240'
Gradle Test Executor 240 started executing tests.
Gradle Test Executor 240 finished executing tests.
Starting process 'Gradle Test Executor 241'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 241'
Su
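
Each Gradle test executor above is launched with -DbeamTestPipelineOptions=[...], the system property Beam's TestPipeline reads to decide which runner a ValidatesRunner test executes against (here --runner=TestSparkRunner). A minimal sketch of a test consuming those options; the test body itself is illustrative:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class ValidatesRunnerSketchTest {
      // TestPipeline picks the runner up from the beamTestPipelineOptions property,
      // matching the executor command lines above.
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void testCreate() {
        PCollection<Integer> out = p.apply(Create.of(1, 2, 3));
        PAssert.that(out).containsInAnyOrder(1, 2, 3);
        p.run().waitUntilFinish();
      }
    }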

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #87

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[geet.kumar75] BEAM-4038: Support Kafka Headers in KafkaIO

[geet.kumar75] BEAM-4038: Update license information

[geet.kumar75] Update code formatting

[geet.kumar75] Remove custom implementations of KafkaHeader, KafkaHeaders, etc.

[geet.kumar75] Changes based on review comments

[daniel.o.programmer] [BEAM-3513] Removing PrimitiveCombineGroupedValues 
override w/ FnAPI.

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[geet.kumar75] Support kafka versions 0.10.1.0 and above

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[tgroh] Use Existing Matchers in WatermarkManagerTest

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

[rangadi] Add 10 millis sleep when there are no elements left in a partition.

--
[...truncated 1.03 MB...]
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=104.197.104.146:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java
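
The MongoTimeoutException above comes from MongoDbIO's write path failing to reach the test cluster. For reference, a minimal sketch of the write transform involved; the URI, database, and collection are illustrative placeholders, not this job's configuration:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.mongodb.MongoDbIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.bson.Document;

    public class MongoWriteSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(Create.of(new Document("scientist", "Einstein")))
            .apply(MongoDbIO.write()
                .withUri("mongodb://localhost:27017")   // illustrative URI
                .withDatabase("beam")
                .withCollection("test"));
        p.run().waitUntilFinish();
      }
    }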

Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT #414

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #92

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[geet.kumar75] BEAM-4038: Support Kafka Headers in KafkaIO

[geet.kumar75] BEAM-4038: Update license information

[geet.kumar75] Update code formatting

[geet.kumar75] Remove custom implementations of KafkaHeader, KafkaHeaders, etc.

[geet.kumar75] Changes based on review comments

[daniel.o.programmer] [BEAM-3513] Removing PrimitiveCombineGroupedValues 
override w/ FnAPI.

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[geet.kumar75] Support kafka versions 0.10.1.0 and above

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[tgroh] Use Existing Matchers in WatermarkManagerTest

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

[rangadi] Add 10 millis sleep when there are no elements left in a partition.

--
[...truncated 229.48 KB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:235)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
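
The "Connection refused" above means the Beam HadoopFileSystem could not reach the HDFS namenode the suite points at. For orientation, a sketch of how a pipeline wires in an HDFS configuration through HadoopFileSystemOptions; the namenode address is an illustrative placeholder:

    import java.util.Collections;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.hadoop.conf.Configuration;

    public class HdfsOptionsSketch {
      public static void main(String[] args) {
        HadoopFileSystemOptions options =
            PipelineOptionsFactory.fromArgs(args).as(HadoopFileSystemOptions.class);
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020"); // placeholder
        options.setHdfsConfiguration(Collections.singletonList(conf));
        Pipeline p = Pipeline.create(options);
        p.run().waitUntilFinish();
      }
    }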
   

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #85

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[geet.kumar75] BEAM-4038: Support Kafka Headers in KafkaIO

[geet.kumar75] BEAM-4038: Update license information

[geet.kumar75] Update code formatting

[geet.kumar75] Remove custom implementations of KafkaHeader, KafkaHeaders, etc.

[geet.kumar75] Changes based on review comments

[daniel.o.programmer] [BEAM-3513] Removing PrimitiveCombineGroupedValues 
override w/ FnAPI.

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instanciate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, 
improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve 
the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming 
tests

[echauchot] Add a MetricksPusher test to runner-core (batch and streaming are 
done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list a MetricsContainerStepMaps because 
there

[echauchot] Fix thread synchronisation and replace usages of instance variable 
by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a 
single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and 
deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] convert MetricsSink to an interface with a single writeMetrics 
method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start 
polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitely start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact 
and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that 
they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[geet.kumar75] Support kafka versions 0.10.1.0 and above

[echauchot] Move build to gradle

[echauchot] MetricsSink no more needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not 
run

[tgroh] Use Existing Matchers in WatermarkManagerTest

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

[rangadi] Add 10 millis sleep when there are no elements left in a partition.

--
[...truncated 93.52 KB...]
at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2447)
at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2335)
at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:623)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

at org.apache.hadoop.ipc.Client.call(Client.java:1475)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy61.create(Unknown Source)

Jenkins build is back to normal : beam_PerformanceTests_JDBC #489

2018-04-23 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_HadoopInputFormat #178

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1627

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[geet.kumar75] BEAM-4038: Support Kafka Headers in KafkaIO

[geet.kumar75] BEAM-4038: Update license information

[geet.kumar75] Update code formatting

[geet.kumar75] Remove custom implementations of KafkaHeader, KafkaHeaders, etc.

[geet.kumar75] Changes based on review comments

[daniel.o.programmer] [BEAM-3513] Removing PrimitiveCombineGroupedValues override w/ FnAPI.

[echauchot] Introduce MetricsPusher in runner core to regularly push aggregated

[echauchot] Instantiate MetricsPusher in runner-specific code because we need

[echauchot] Improve MetricsPusher: do not aggregate metrics when not needed, improve

[echauchot] Create JsonMetricsSerializer

[echauchot] Stop MetricsPusher thread by observing pipeline state and improve the

[echauchot] Make metrics sink configurable through PipelineOptions, pass

[echauchot] Add MetricsPusher tests specific to Spark (because Spark streaming tests

[echauchot] Add a MetricsPusher test to runner-core (batch and streaming are done

[echauchot] Push metrics at the end of a batch pipeline in flink runner

[echauchot] Improve MetricsPusher lifecycle and thread safety

[echauchot] Make MetricsPusher merge a list of MetricsContainerStepMaps because there

[echauchot] Fix thread synchronisation and replace usages of instance variable by

[echauchot] Clear dummyMetricsSink before test

[echauchot] Push metrics at the end of a batch pipeline in spark runner

[echauchot] Improve MetricsPusher teardown to enable multiple pipelines in a single

[echauchot] Manually generate json and remove jackson

[echauchot] Replace use of http client by use of java.net.HttpUrlConnection and deal

[echauchot] Remove DEFAULT_PERIOD constant in favor of already existing

[echauchot] Remove unneeded null check, format

[echauchot] Convert MetricsSink to an interface with a single writeMetrics method

[echauchot] Remove MetricsSerializer base class and inline serialization in

[echauchot] Dynamically create the sinks by reflection

[echauchot] Split DummyMetricsSink into NoOpMetricsSink (default sink) and

[echauchot] Reduce overhead when no metricsSink is provided, do not start polling

[echauchot] Make MetricsPusher a regular object instead of a singleton to allow

[echauchot] Explicitly start MetricsPusher from the runners

[echauchot] Separate MetricsHttpSink POC to a new runners-extensions artifact and

[echauchot] Fix cycle bug between teardown() and pushmetrics()

[echauchot] Update MetricsPusher and TestMetricsSink to new serializable

[echauchot] Use regular jackson object mapper to serialize metrics now that they are

[echauchot] Give MetricsPusher a bit of time to push before assert in test

[echauchot] Make MetricsPusher thread a daemon

[echauchot] Fix build and clean: dependencies, rat, checkstyle, findbugs, remove

[geet.kumar75] Support kafka versions 0.10.1.0 and above

[echauchot] Move build to gradle

[echauchot] MetricsSink no longer needs to be generic

[echauchot] SparkRunnerDebugger does not need to export metrics as it does not run

[tgroh] Use Existing Matchers in WatermarkManagerTest

[echauchot] Move MetricsHttpSink and related classes to a new sub-module

[kirpichov] Consistently handle EmptyMatchTreatment

[rangadi] Add 10 millis sleep when there are no elements left in a partition.

--
[...truncated 69.74 KB...]
2018-04-24 00:17:50,212 50988614 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 00:18:15,592 50988614 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 00:18:19,180 50988614 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r7575b9ff5d12d2be_0162f503e5e3_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r7575b9ff5d12d2be_0162f503e5e3_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r7575b9ff5d12d2be_0162f503e5e3_1 ... (0s) Current status: DONE   
2018-04-24 00:18:19,180 50988614 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 00:18:49,107 50988614 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 
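
The load keeps failing because "bq load --autodetect" re-infers a schema from each batch of result rows and here infers FLOAT for the timestamp field, which conflicts with the TIMESTAMP column already on beam_performance.pkb_results; BigQuery then rejects the implied schema update on every retry. Pinning an explicit schema on the load job avoids the drift. A minimal sketch with the google-cloud-bigquery Python client (the file name and the fields other than timestamp are illustrative assumptions, not the real pkb_results layout):

from google.cloud import bigquery

client = bigquery.Client(project="apache-beam-testing")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    # Pinning the schema disables autodetect's per-batch inference.
    schema=[
        bigquery.SchemaField("timestamp", "TIMESTAMP"),
        # Remaining fields are placeholders, not the actual
        # beam_performance.pkb_results columns.
        bigquery.SchemaField("metric", "STRING"),
        bigquery.SchemaField("value", "FLOAT"),
    ],
)

with open("perfkit_results.json", "rb") as f:
    load_job = client.load_table_from_file(
        f, "beam_performance.pkb_results", job_config=job_config)
load_job.result()  # raises on a genuine type mismatch

With a pinned schema, a row that really does carry the wrong type fails the job with a clear error instead of triggering the "Invalid schema update" retry loop seen above.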


Build failed in Jenkins: beam_PostCommit_Python_Verify #4770

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Use Existing Matchers in WatermarkManagerTest

--
[...truncated 1.06 MB...]
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_types 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #181

2018-04-23 Thread Apache Jenkins Server
See 


--
[...truncated 14.74 MB...]
INFO: Deploy request: 
[OperatorDeployInfo[id=9,name=PAssert$171/GroupGlobally/GatherAllOutputs/GroupByKey,type=GENERIC,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream2,sourceNodeId=8,sourcePortName=outputPort,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream12,bufferServer=localhost
Apr 24, 2018 12:52:25 AM 
com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, 
identifier=2.output.2, windowId=}
Apr 24, 2018 12:52:25 AM 
com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, 
identifier=tcp://localhost:36073/1.output.1, windowId=, 
type=stream7/2.input, upstreamIdentifier=1.output.1, mask=0, partitions=null, 
bufferSize=1024}
Apr 24, 2018 12:52:25 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=14,name=PAssert$171/GroupGlobally/WindowIntoDummy/Window.Assign,type=GENERIC,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream8,sourceNodeId=13,sourcePortName=output,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream9,bufferServer=localhost
Apr 24, 2018 12:52:25 AM 
com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, 
identifier=9.output.9, windowId=}
Apr 24, 2018 12:52:25 AM 
com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, 
identifier=tcp://localhost:36073/8.outputPort.8, windowId=, 
type=stream2/9.input, upstreamIdentifier=8.outputPort.8, mask=0, 
partitions=null, bufferSize=1024}
Apr 24, 2018 12:52:25 AM 
com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received subscriber request: SubscribeRequestTuple{version=1.0, 
identifier=tcp://localhost:36073/13.output.12, windowId=, 
type=stream8/14.inputPort, upstreamIdentifier=13.output.12, mask=0, 
partitions=null, bufferSize=1024}
Apr 24, 2018 12:52:25 AM 
com.datatorrent.bufferserver.server.Server$UnidentifiedClient onMessage
INFO: Received publisher request: PublishRequestTuple{version=1.0, 
identifier=14.outputPort.14, windowId=}
Apr 24, 2018 12:52:25 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 1 sending EndOfStream
Apr 24, 2018 12:52:25 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 13 sending EndOfStream
Apr 24, 2018 12:52:25 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=15,name=PAssert$171/GroupGlobally/FlattenDummyAndContents,type=GENERIC,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=data1,streamId=stream0,sourceNodeId=12,sourcePortName=output,locality=,partitionMask=0,partitionKeys=],
 
OperatorDeployInfo.InputDeployInfo[portName=data2,streamId=stream9,sourceNodeId=14,sourcePortName=outputPort,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=stream19,bufferServer=localhost
Apr 24, 2018 12:52:25 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=16,name=PAssert$171/GroupGlobally/GroupDummyAndContents,type=GENERIC,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream19,sourceNodeId=15,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream6,bufferServer=localhost
Apr 24, 2018 12:52:25 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=10,name=PAssert$171/GroupGlobally/GatherAllOutputs/Values/Values/Map/ParMultiDo(Anonymous),type=GENERIC,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream12,sourceNodeId=9,sourcePortName=output,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream14,bufferServer=]]],
 
OperatorDeployInfo[id=12,name=PAssert$171/GroupGlobally/KeyForDummy/AddKeys/Map/ParMultiDo(Anonymous),type=OIO,checkpoint={,
 0, 
0},inputs=[Oper

Build failed in Jenkins: beam_PostCommit_Python_Verify #4771

2018-04-23 Thread Apache Jenkins Server
See 


--
[...truncated 1.06 MB...]
test_type_check_violation_valid_simple_type 
(apache_beam.typehints.typehints_test.IterableHintTestCase) ... ok
test_enforce_kv_type_constraint 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_types 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_must_be_tuple 
(apache_beam.typehints.typehints_test.TupleH

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #212

2018-04-23 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #182

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_Verify #4772

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

--
[...truncated 1.06 MB...]
test_getitem_param_must_be_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_param_must_have_length_2 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_getitem_proxy_to_tuple 
(apache_beam.typehints.typehints_test.KVHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_invalid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_composite_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_enforce_list_type_constraint_valid_simple_type 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_constraint_compatibility 
(apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_list_repr (apache_beam.typehints.typehints_test.ListHintTestCase) ... ok
test_getitem_proxy_to_union 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_getitem_sequence_not_allowed 
(apache_beam.typehints.typehints_test.OptionalHintTestCase) ... ok
test_any_return_type_hint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_primitive_type_or_type_constraint 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_must_be_single_return_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_no_kwargs_accepted 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_composite_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_simple_type 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_type_check_violation 
(apache_beam.typehints.typehints_test.ReturnsDecoratorTestCase) ... ok
test_compatibility (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_getitem_invalid_composite_type_param 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_invalid_elem_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_must_be_set 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_composite_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_type_check_valid_elem_simple_type 
(apache_beam.typehints.typehints_test.SetHintTestCase) ... ok
test_any_argument_type_hint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_basic_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments 
(apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_functions_as_regular_generator 
(apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... 
ok
test_compatibility_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_type_arbitrary_length 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_types 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_must_be_tuple 
(apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_must_hav

Jenkins build is back to normal : beam_PostCommit_Python_Verify #4773

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #179

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam2 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision caf9d6404d25c3e3040edbd0703ad21066a36295 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f caf9d6404d25c3e3040edbd0703ad21066a36295
Commit message: "Merge pull request #5205 Fix python lint error"
 > git rev-list --no-walk 0f2ba71e1b6db88ed79744e363586a8ff16dcb08 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins5794839528639777961.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running gcloud.container.clusters.get-credentials with 
Namespace(__calliope_internal_deepest_parser=ArgumentParser(prog='gcloud.container.clusters.get-credentials',
 usage=None, description='See 
https://cloud.google.com/container-engine/docs/kubectl for\nkubectl 
documentation.', version=None, formatter_class=, conflict_handler='error', add_help=False), 
account=None, api_version=None, authority_selector=None, 
authorization_token_file=None, cmd_func=>, 
command_path=['gcloud', 'container', 'clusters', 'get-credentials'], 
configuration=None, credential_file_override=None, document=None, format=None, 
h=None, help=None, http_timeout=None, log_http=None, name='io-datastores', 
project=None, quiet=None, trace_email=None, trace_log=None, trace_token=None, 
user_output_enabled=None, verbosity='debug', version=None, 
zone='us-central1-a').
WARNING: Accessing a Container Engine cluster requires the kubernetes 
commandline
client [kubectl]. To install, run
  $ gcloud components install kubectl

Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins3821885580679605052.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins6748059615823201583.sh
+ kubectl 
--kubeconfig=
 create namespace hadoopinputformatioit-1524546085809
namespace "hadoopinputformatioit-1524546085809" created
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins613008953493198993.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=hadoopinputformatioit-1524546085809
error: open /home/jenkins/.kube/config.lock: file exists
Build step 'Execute shell' marked build as failure
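
The failure is not the namespace setup itself: kubectl aborts because a stale ~/.kube/config.lock survived an earlier crashed or concurrent run on the same executor, and kubectl refuses to modify the kubeconfig while the lock file exists. A hedged sketch of a retry wrapper that clears the stale lock between attempts (the wrapper, paths, and retry policy are assumptions for illustration, not part of the actual Jenkins job):

import os
import subprocess
import time

KUBECONFIG_LOCK = os.path.expanduser("~/.kube/config.lock")

def kubectl(*args, retries=3):
    """Run kubectl, clearing a stale kubeconfig lock between attempts."""
    for attempt in range(retries):
        if subprocess.run(["kubectl", *args]).returncode == 0:
            return
        if os.path.exists(KUBECONFIG_LOCK):
            # Assumed stale: only safe if no other job holds the lock.
            os.remove(KUBECONFIG_LOCK)
        time.sleep(2 ** attempt)
    raise RuntimeError("kubectl %r failed after %d attempts" % (args, retries))

kubectl("config", "set-context",
        "gke_apache-beam-testing_us-central1-a_io-datastores",
        "--namespace=hadoopinputformatioit-1524546085809")

Deleting the lock blindly can race a genuinely live kubectl process, so this is only safe where a single job owns the kubeconfig at a time.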


Build failed in Jenkins: beam_PerformanceTests_JDBC #490

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam7 (beam) in workspace 

Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # 
 > timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision caf9d6404d25c3e3040edbd0703ad21066a36295 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f caf9d6404d25c3e3040edbd0703ad21066a36295
Commit message: "Merge pull request #5205 Fix python lint error"
 > git rev-list --no-walk 0f2ba71e1b6db88ed79744e363586a8ff16dcb08 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins6949273945394657010.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4963278871881494230.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins2998858072292067768.sh
+ kubectl 
--kubeconfig=
 create namespace jdbcioit-1524546092323
namespace "jdbcioit-1524546092323" created
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4980222838293174664.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=jdbcioit-1524546092323
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4864705631565302316.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4517336094340283130.sh
+ rm -rf .env
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4876090899299410933.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins7021530130634548446.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://files.pythonhosted.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#sha256=8010754433e3211b9cdbbf784b50f30e80bf40fc6b05eb5f865fab83300599b8
Downloading/unpacking pip from 
https://files.pythonhosted.org/packages/0f/74/ecd13431bcc456ed390b44c8a6e917c1820365cbebcb6a8974d1cd045ab4/pip-10.0.1-py2.py3-none-any.whl#sha256=717cdffb2833be8409433a93746744b59505f42146e8d37de6c62b430e25d6d7
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins3683323884888611

Build failed in Jenkins: beam_PerformanceTests_Python #1186

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
[...truncated 30.28 KB...]
[INFO] --- maven-surefire-plugin:2.21.0:test (default-test) @ 
beam-sdks-java-build-tools ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:regex-properties 
(render-artifact-id) @ beam-sdks-java-build-tools ---
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:jar (default-jar) @ 
beam-sdks-java-build-tools ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-site-plugin:3.7:attach-descriptor (attach-descriptor) @ 
beam-sdks-java-build-tools ---
[INFO] Skipping because packaging 'jar' is not pom.
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:test-jar (default-test-jar) @ 
beam-sdks-java-build-tools ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-shade-plugin:3.1.0:shade (bundle-and-repackage) @ 
beam-sdks-java-build-tools ---
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ 
beam-sdks-java-build-tools ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ 
beam-sdks-java-build-tools ---
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/beam-sdks-java-build-tools-2.5.0-SNAPSHOT.jar
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/beam-sdks-java-build-tools-2.5.0-SNAPSHOT.pom
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  9.385 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . FAILURE [  5.965 s]
[INFO] Apache Beam :: Model ... SKIPPED
[INFO] Apache Beam :: Model :: Pipeline ... SKIPPED
[INFO] Apache Beam :: Model :: Job Management . SKIPPED
[INFO] Apache Beam :: Model :: Fn Execution ... SKIPPED
[INFO] Apache Beam :: SDKs  SKIPPED
[INFO] Apache Beam :: SDKs :: Go .. SKIPPED
[INFO] Apache Beam :: SDKs :: Go :: Container . SKIPPED
[INFO] Apache Beam :: SDKs :: Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Core  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core 
SKIPPED
[INFO] Apache Beam :: Runners . SKIPPED
[INFO] Apache Beam :: Runners :: Core Construction Java ... SKIPPED
[INFO] Apache Beam :: Runners :: Core Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Harness . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Container ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services SKIPPED
[INFO] Apache Beam :: Runners :: Local Java Core .. SKIPPED
[INFO] Apache Beam :: Runners :: Direct Java .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: AMQP .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Common  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #413

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
[...truncated 83.11 KB...]
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.23.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-pubsub:jar:v1-rev382-1.23.0 from the shaded 
jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding commons-logging:commons-logging:jar:1.2 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the 
shaded jar.
[INFO] Excluding io.opencensus:op

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #86

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #88

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
[...truncated 99.62 KB...]
at 
com.mongodb.operation.MixedBulkWriteOperation$Run$2.executeWriteCommandProtocol(MixedBulkWriteOperation.java:455)
at 
com.mongodb.operation.MixedBulkWriteOperation$Run$RunExecutor.execute(MixedBulkWriteOperation.java:646)
at 
com.mongodb.operation.MixedBulkWriteOperation$Run.execute(MixedBulkWriteOperation.java:401)
at 
com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:179)
at 
com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:230)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:221)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=104.197.171.217:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=104.197.171.217:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(Mo
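
All of the timeouts above report the same cluster view: the client never got past CONNECTING to 104.197.171.217:27017, so server selection for WritableServerSelector expired before any write was attempted. Probing the endpoint directly separates a dead or unprovisioned MongoDB pod from a problem in MongoDbIO itself. A sketch with pymongo (the endpoint is taken from the log; the timeout value is an illustrative override of the default server-selection budget):

from pymongo import MongoClient
from pymongo.errors import ServerSelectionTimeoutError

client = MongoClient("mongodb://104.197.171.217:27017",
                     serverSelectionTimeoutMS=5000)  # fail fast
try:
    client.admin.command("ping")  # forces a server-selection round
    print("MongoDB endpoint reachable")
except ServerSelectionTimeoutError as err:
    print("server selection failed:", err)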

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #93

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
[...truncated 128.70 KB...]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.copy(HadoopFileSystem.java:131)
at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:301)
at 
org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:801)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.getFileInfo(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.copy(HadoopFileSystem.java:131)
at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:301)
at 
org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:801)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOpera

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #87

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
[...truncated 91.50 KB...]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy62.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.copy(HadoopFileSystem.java:131)
at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:300)
at 
org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:801)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy61.getFileInfo(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy62.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.copy(HadoopFileSystem.java:131)
at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:300)
at 
org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:801)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
at 
com.google.cloud.dataflow.worker.util.common.worker.
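
Both HDFS suites fail with the same root java.net.ConnectException: the NameNode RPC endpoint refuses TCP connections, so HadoopFileSystem.copy cannot move temp files to their final output locations. A quick reachability probe distinguishes a torn-down or unprovisioned HDFS test cluster from a Beam-side regression; a sketch follows (the hostname and port are assumptions, since the truncated log does not show them, and 8020 is only the conventional NameNode RPC port):

import socket

def port_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError and timeouts
        return False

# Host and port are illustrative assumptions for the HDFS test cluster.
if not port_open("hdfs-namenode", 8020):
    raise SystemExit("NameNode unreachable (Connection refused): "
                     "verify the HDFS test cluster is provisioned")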

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #192

2018-04-23 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1628

2018-04-23 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[sidhom] Fix python lint error

--
[...truncated 95.80 KB...]
'apache-beam-testing:bqjob_r383c741ee672f7a4_0162f64e92b8_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.
Waiting on bqjob_r383c741ee672f7a4_0162f64e92b8_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r383c741ee672f7a4_0162f64e92b8_1 ... (0s) Current status: DONE
2018-04-24 06:19:28,980 0aee8bbf MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 06:19:51,719 0aee8bbf MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 06:19:53,961 0aee8bbf MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r1a0016f46c3f58f7_0162f64ef376_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.
Waiting on bqjob_r1a0016f46c3f58f7_0162f64ef376_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r1a0016f46c3f58f7_0162f64ef376_1 ... (0s) Current status: DONE
2018-04-24 06:19:53,962 0aee8bbf MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 06:20:14,662 0aee8bbf MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 06:20:16,763 0aee8bbf MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r7deecd960b6c824_0162f64f4cfa_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.
Waiting on bqjob_r7deecd960b6c824_0162f64f4cfa_1 ... (0s) Current status: RUNNING
Waiting on bqjob_r7deecd960b6c824_0162f64f4cfa_1 ... (0s) Current status: DONE
2018-04-24 06:20:16,763 0aee8bbf MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 06:20:46,279 0aee8bbf MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 06:20:48,419 0aee8bbf MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-be
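Every retry above fails for the same reason: with --autodetect, bq infers the timestamp field of the JSON results as FLOAT (epoch seconds), while the existing beam_performance.pkb_results table already declares that column as TIMESTAMP, and BigQuery rejects the implied schema change. The usual fix is to drop --autodetect and pass an explicit schema; a sketch, where the columns other than timestamp and the file name are purely illustrative (the table's full schema is not shown in this log):

  # Pin the field types instead of re-inferring them on every load.
  bq load --source_format=NEWLINE_DELIMITED_JSON \
    beam_performance.pkb_results \
    results.json \
    timestamp:TIMESTAMP,test:STRING,metric:STRING,value:FLOAT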

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #164

2018-04-24 Thread Apache Jenkins Server
See 


--
[...truncated 18.21 MB...]
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Sample 
keys/GroupByKey as step s16
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Sample 
keys/Combine.GroupedValues as step s17
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow) as step s18
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s19
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParMultiDo(ToIsmRecordForMapLike) as step s20
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForSize as step s21
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForSize) as step s22
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForKeys as step s23
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForKey) as step s24
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/Flatten.PCollections as step s25
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/CreateDataflowView as step s26
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition 
input as step s27
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by 
partition as step s28
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch 
mutations together as step s29
Apr 24, 2018 7:09:19 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 24, 2018 7:09:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0424070911-8d490c31/output/results/staging/
Apr 24, 2018 7:09:19 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80362 bytes, hash qsuQy3EHPdShQT9OkX9NUw> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0424070911-8d490c31/output/results/staging/pipeline-qsuQy3EHPdShQT9OkX9NUw.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 24, 2018 7:09:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-24_00_09_20-7485403376942861152?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-24_00_09_20-7485403376942861152

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 24, 2018 7:09:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow j
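The cancel command is cut off above; reconstructed from the job id printed earlier in this log and standard gcloud syntax (a hedged reconstruction, not the literal truncated text), it would look like:

  # Cancel the Dataflow job submitted by SpannerWriteIT.testWrite:
  gcloud dataflow jobs cancel 2018-04-24_00_09_20-7485403376942861152 \
      --project=apache-beam-testing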

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #165

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Analysis #6

2018-04-24 Thread Apache Jenkins Server
See 


--
GitHub pull request #5180 of commit a37f5e8e376216faaaf481b91e09b15182865efa, 
no merge conflicts.
Setting status of a37f5e8e376216faaaf481b91e09b15182865efa to PENDING with url 
https://builds.apache.org/job/beam_PerformanceTests_Analysis/6/ and message: 
'Build started sha1 is merged.'
Using context: Jenkins: Performance Tests Analysis
[EnvInject] - Loading node environment variables.
Building remotely on beam6 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/5180/*:refs/remotes/origin/pr/5180/*
 > git rev-parse refs/remotes/origin/pr/5180/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/5180/merge^{commit} # timeout=10
Checking out Revision 11975ed957c1b4b23165547c567a5c8955144215 
(refs/remotes/origin/pr/5180/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 11975ed957c1b4b23165547c567a5c8955144215
Commit message: "Merge a37f5e8e376216faaaf481b91e09b15182865efa into 
07f2a45c686ef0ae829849a77bd0622be6dd7ec8"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins2611686208244579334.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins4311147685906551666.sh
+ rm -rf .env
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins9042333626816366100.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins2193950406693621331.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.0.1)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins7979391923458059247.sh
+ .env/bin/pip install requests google.cloud.bigquery
Requirement already satisfied: requests in 
/home/jenkins/.local/lib/python2.7/site-packages (2.18.1)
Collecting google.cloud.bigquery
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/4d/7e/d47392a7449411b7e4f8c95a32c29f5c9808fa7a7111ab302fec773fa86d/google_cloud_bigquery-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: idna<2.6,>=2.5 in 
/home/jenkins/.local/lib/python2.7/site-packages (from requests) (2.5)
Requirement already satisfied: urllib3<1.22,>=1.21.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from requests) (1.21.1)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from requests) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from requests) (2017.4.17)
Collecting google-cloud-core<0.29dev,>=0.28.0 (from google.cloud.bigquery)
  Using cached 
https://files.pythonhosted.org/packages/0f/4
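The SNIMissingWarning/InsecurePlatformWarning pair is urllib3's stock complaint on Pythons older than 2.7.9, where the ssl module lacks SNI support and a real SSLContext. If upgrading the interpreter on the Jenkins workers is not an option, the remedy the urllib3 documentation suggests is to back it with pyOpenSSL:

  # Gives urllib3 a pyOpenSSL-based SSLContext on old Python 2.7:
  .env/bin/pip install pyopenssl ndg-httpsclient pyasn1

The earlier "cheetah 2.4.4 requires Markdown>=2.0.1" message is a dependency-check warning inherited via --system-site-packages and appears unrelated to this build's failure.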

Jenkins build is back to normal : beam_PerformanceTests_Spark #1629

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1630

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[Pablo] Python Metrics now rely on StateSampler state.

[robertwb] [BEAM-4097] Set environment for Python sdk function specs.

[robertwb] Logging around default docker image environment.

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam5 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 07f2a45c686ef0ae829849a77bd0622be6dd7ec8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 07f2a45c686ef0ae829849a77bd0622be6dd7ec8
Commit message: "Merge pull request #5191 [BEAM-4097] Set environment for 
Python sdk function specs."
 > git rev-list --no-walk caf9d6404d25c3e3040edbd0703ad21066a36295 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4360265091079575283.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins682408481641096296.sh
+ rm -rf .env
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7382137751780386066.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins451453892311774086.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://files.pythonhosted.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl#sha256=8010754433e3211b9cdbbf784b50f30e80bf40fc6b05eb5f865fab83300599b8
Downloading/unpacking pip from 
https://files.pythonhosted.org/packages/0f/74/ecd13431bcc456ed390b44c8a6e917c1820365cbebcb6a8974d1cd045ab4/pip-10.0.1-py2.py3-none-any.whl#sha256=717cdffb2833be8409433a93746744b59505f42146e8d37de6c62b430e25d6d7
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5114782393718917001.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3641637412170014018.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.0.1)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requ

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #180

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[Pablo] Python Metrics now rely on StateSampler state.

[robertwb] [BEAM-4097] Set environment for Python sdk function specs.

[robertwb] Logging around default docker image environment.

--
[...truncated 163.20 KB...]
[INFO] Excluding com.google.cloud.bigdataoss:gcsio:jar:1.4.5 from the shaded 
jar.
[INFO] Excluding 
com.google.apis:google-api-services-cloudresourcemanager:jar:v1-rev477-1.23.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-pubsub:jar:v1-rev382-1.23.0 from the shaded 
jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the 
shaded jar.
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1

Build failed in Jenkins: beam_PerformanceTests_Python #1187

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[Pablo] Python Metrics now rely on StateSampler state.

[robertwb] [BEAM-4097] Set environment for Python sdk function specs.

[robertwb] Logging around default docker image environment.

--
[...truncated 251.72 KB...]
[INFO] --- maven-surefire-plugin:2.21.0:test (default-test) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:regex-properties 
(render-artifact-id) @ beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:jar (default-jar) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-site-plugin:3.7:attach-descriptor (attach-descriptor) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] Skipping because packaging 'jar' is not pom.
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:test-jar (default-test-jar) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-shade-plugin:3.1.0:shade (bundle-and-repackage) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ 
beam-sdks-java-io-elasticsearch-tests-5 ---
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/beam/beam-sdks-java-io-elasticsearch-tests-5/2.5.0-SNAPSHOT/beam-sdks-java-io-elasticsearch-tests-5-2.5.0-SNAPSHOT.jar
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/beam/beam-sdks-java-io-elasticsearch-tests-5/2.5.0-SNAPSHOT/beam-sdks-java-io-elasticsearch-tests-5-2.5.0-SNAPSHOT.pom
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  9.881 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  6.198 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.165 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [ 18.446 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  7.539 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  8.611 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.370 s]
[INFO] Apache Beam :: SDKs :: Go .. SUCCESS [ 54.686 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . SUCCESS [ 31.697 s]
[INFO] Apache Beam :: SDKs :: Java  SUCCESS [  0.116 s]
[INFO] Apache Beam :: SDKs :: Java :: Core  SUCCESS [ 34.721 s]
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SUCCESS [  3.633 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SUCCESS [  0.039 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core 
SUCCESS [  3.716 s]
[INFO] Apache Beam :: Runners . SUCCESS [  0.067 s]
[INFO] Apache Beam :: Runners :: Core Con

Jenkins build is back to normal : beam_PerformanceTests_Compressed_TextIOIT #414

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_Compressed_TextIOIT_HDFS #88

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #94

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #87

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[Pablo] Python Metrics now rely on StateSampler state.

[robertwb] [BEAM-4097] Set environment for Python sdk function specs.

[robertwb] Logging around default docker image environment.

--
[...truncated 431.39 KB...]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #88

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[Pablo] Python Metrics now rely on StateSampler state.

[robertwb] [BEAM-4097] Set environment for Python sdk function specs.

[robertwb] Logging around default docker image environment.

--
[...truncated 243.15 KB...]
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:118)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoO

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #216

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_JDBC #491

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #194

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_MongoDBIO_IT #89

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_Analysis #7

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #218

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

--
[...truncated 63.30 MB...]
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.taskexecutor.TaskManagerServices checkTempDirs
INFO: Temporary file directory '/tmp': total 492 GB, usable 374 GB (76.02% 
usable)
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.clusterframework.FlinkResourceManager 
triggerConnectingToJobManager
INFO: Trying to associate with JobManager leader 
akka://flink/user/jobmanager_1
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.clusterframework.FlinkResourceManager 
jobManagerLeaderConnected
INFO: Resource Manager associating with leading JobManager 
Actor[akka://flink/user/jobmanager_1#1359063283] - leader session 
dd6eb48e-2ad8-4103-b25b-afdc16b560b9
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.io.network.buffer.NetworkBufferPool 
INFO: Allocated 332 MB for network buffer pool (number of memory segments: 
10632, bytes per segment: 32768).
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.query.QueryableStateUtils 
createKvStateClientProxy
WARNING: Could not load Queryable State Client Proxy. Probable reason: 
flink-queryable-state-runtime is not in the classpath. Please put the 
corresponding jar from the opt to the lib folder.
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.query.QueryableStateUtils 
createKvStateServer
WARNING: Could not load Queryable State Server. Probable reason: 
flink-queryable-state-runtime is not in the classpath. Please put the 
corresponding jar from the opt to the lib folder.
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.io.network.NetworkEnvironment start
INFO: Starting the network environment and its components.
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.taskexecutor.TaskManagerServices createMemoryManager
INFO: Limiting managed memory to 1046 MB, memory will be allocated lazily.
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.io.disk.iomanager.IOManager 
INFO: I/O manager uses directory 
/tmp/flink-io-99481433-e889-4b7d-a9e3-86fce1abbb5f for spill files.
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.filecache.FileCache 
INFO: User file cache uses directory 
/tmp/flink-dist-cache-b29ee499-9c57-49c8-bf7b-486e3e09f353
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.filecache.FileCache 
INFO: User file cache uses directory 
/tmp/flink-dist-cache-3ac52ff9-4a85-46c3-9262-76e1304144bb
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: Starting TaskManager actor at 
akka://flink/user/taskmanager_1#1402864746.
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: TaskManager data connection information: 
65bbeb315b0552855adc3bef5dea09a6 @ localhost (dataPort=-1)
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: TaskManager has 1 task slot(s).
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: Memory usage stats: [HEAP: 370/1348/3342 MB, NON HEAP: 105/109/-1 MB 
(used/committed/max)]
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: Trying to register at JobManager akka://flink/user/jobmanager_1 
(attempt 1, timeout: 500 milliseconds)
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.instance.InstanceManager 
registerTaskManager
INFO: Registered TaskManager at localhost (akka://flink/user/taskmanager_1) 
as db4d7a55be13798b23c75447d1b769a7. Current number of registered hosts is 1. 
Current number of alive task slots is 1.
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.clusterframework.FlinkResourceManager 
handleResourceStarted
INFO: TaskManager 65bbeb315b0552855adc3bef5dea09a6 has started.
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: Successful registration at JobManager 
(akka://flink/user/jobmanager_1), starting network stack and library cache.
Apr 24, 2018 3:51:01 PM grizzled.slf4j.Logger info
INFO: Determined BLOB server address to be localhost/127.0.0.1:37558. 
Starting BLOB cache.
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.blob.AbstractBlobCache 

INFO: Created BLOB cache storage directory 
/tmp/blobStore-5104c259-53b5-43dc-ba13-05fa366d2da6
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.blob.AbstractBlobCache 

INFO: Created BLOB cache storage directory 
/tmp/blobStore-2b2aac6b-2e91-4fbc-b1d3-d0b82e51c1d8
Apr 24, 2018 3:51:01 PM org.apache.flink.runtime.client.JobClientActor 
handleMessage
INFO: Received SubmitJobAndWait(JobGraph(jobId: 
e0d2dc948aa15f41784f0e86217c7308)) but there is no connection to a JobManager 
yet.
Apr 24, 2018 3:51:01 PM 
org.apache.flink.runtime.client.JobSubmissionClientActor handleCusto
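Unrelated to the failure, the two Queryable State warnings near the top of this log spell out their own fix; assuming a standard Flink distribution layout rooted at a hypothetical $FLINK_HOME, it amounts to:

  # Copy the queryable-state runtime from opt/ to lib/, as the warning asks:
  cp "$FLINK_HOME"/opt/flink-queryable-state-runtime_*.jar "$FLINK_HOME"/lib/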

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #196

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

--
[...truncated 519.44 KB...]
Starting process 'Gradle Test Executor 234'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 234'
Successfully started process 'Gradle Test Executor 234'
Gradle Test Executor 234 started executing tests.
Gradle Test Executor 234 finished executing tests.
Starting process 'Gradle Test Executor 235'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 235'
Successfully started process 'Gradle Test Executor 235'
Gradle Test Executor 235 started executing tests.
Gradle Test Executor 235 finished executing tests.
Starting process 'Gradle Test Executor 236'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 236'
Successfully started process 'Gradle Test Executor 236'
Gradle Test Executor 236 started executing tests.
Gradle Test Executor 236 finished executing tests.
Starting process 'Gradle Test Executor 237'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 237'
Successfully started process 'Gradle Test Executor 237'
Gradle Test Executor 237 started executing tests.
Gradle Test Executor 237 finished executing tests.
Starting process 'Gradle Test Executor 238'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_152/bin/java 
-Dbeam.spark.test.reuseSparkContext=true 
-DbeamTestPipelineOptions=["--runner=TestSparkRunner","--streaming=false","--enableSparkMetricSinks=false"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dspark.ui.enabled=false 
-Dspark.ui.showConsoleProgress=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.6/workerMain/gradle-worker.jar 
worker.org.gradle.process.inter

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #168

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #197

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #95

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Stop implementing EvaluatorFactory in Registry

[aromanenko.dev] [BEAM-4066] Moved anonymous classes into inner ones

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

[iemejia] [BEAM-4018] Add a ByteKeyRangeTracker based on RestrictionTracker for

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam11 (beam) in workspace 

Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d3cd6116517cce04cf5c35bbd2d05a494dd8fed3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d3cd6116517cce04cf5c35bbd2d05a494dd8fed3
Commit message: "Merge pull request #5129: Stop implementing EvaluatorFactory 
in Registry"
 > git rev-list --no-walk 07f2a45c686ef0ae829849a77bd0622be6dd7ec8 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7945808508708897226.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: unable to load default kubeconfig: unable to load kubeconfig for 
/home/jenkins/.kube/config: [Errno 2] No such file or directory: 
u'/home/jenkins/.kube/config'; recreating /home/jenkins/.kube/config
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6612866003387524944.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5529996779164781756.sh
+ kubectl --kubeconfig= create namespace filebasedioithdfs-1524592067864
namespace "filebasedioithdfs-1524592067864" created
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6774864324579705846.sh
++ kubectl config current-context
+ kubectl --kubeconfig= config set-context gke_apache-beam-testing_us-central1-a_io-datastores --namespace=filebasedioithdfs-1524592067864
error: open /home/jenkins/.kube/config.lock: file exists
Build step 'Execute shell' marked build as failure
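The build dies on kubectl's lock over the shared /home/jenkins/.kube/config, which suggests two jobs on this executor were writing credentials concurrently. One way to sidestep the race, sketched using Jenkins' standard WORKSPACE and BUILD_ID variables, is to give every build a private kubeconfig:

  # Per-build kubeconfig: gcloud and kubectl both honor KUBECONFIG,
  # so concurrent jobs never contend for ~/.kube/config.lock.
  export KUBECONFIG="$WORKSPACE/.kube/config-$BUILD_ID"
  gcloud container clusters get-credentials io-datastores --zone=us-central1-a
  kubectl create namespace "filebasedioithdfs-$BUILD_ID"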


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #220

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_JDBC #492

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Stop implementing EvaluatorFactory in Registry

[aromanenko.dev] [BEAM-4066] Moved anonymous classes into inner ones

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

[iemejia] [BEAM-4018] Add a ByteKeyRangeTracker based on RestrictionTracker for

--
[...truncated 46.36 KB...]
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded 
jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.23.0 from 
the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.23.0 from the 
shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.23.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.23.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev233-1.23.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev124-1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.23.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.23.0 
from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:integration-test (default) @ 
beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0 s <<< 
FAILURE! - in org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] org.apache.beam.sdk.io.jdbc.JdbcIOIT  Time elapsed: 0 s  <<< ERROR!
org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverMana
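JdbcIOIT fails before any pipeline code runs: the JDBC connection to the test's PostgreSQL instance is refused. A quick reachability check, with host and credentials as placeholders since the real ones are not in this log:

  # pg_isready ships with the PostgreSQL client tools.
  pg_isready -h <postgres-host> -p 5432
  # Or attempt an actual login the way the test's DataSource would:
  psql -h <postgres-host> -p 5432 -U <user> -c 'SELECT 1'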

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #88

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT_HDFS #89

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_HadoopInputFormat #181

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT #184

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Python #1188

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Stop implementing EvaluatorFactory in Registry

[aromanenko.dev] [BEAM-4066] Moved anonymous classes into inner ones

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

[iemejia] [BEAM-4018] Add a ByteKeyRangeTracker based on RestrictionTracker for

--
[...truncated 13.13 KB...]
  Downloading 
https://files.pythonhosted.org/packages/0d/4d/4e5985d075d241d686a1663fa1f88b61d544658d08c1375c7c6aac32afc3/typing-3.6.4-py2-none-any.whl
Requirement already satisfied: futures<4.0.0,>=3.1.1 in 
./.env/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (3.2.0)
Collecting google-apitools<=0.5.20,>=0.5.18 (from apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/1d/0c/64f84f91643f775fdb64c6c10f4a4f0d827f8b0d98a2ba2b4bb9dc2f8646/google_apitools-0.5.20-py2-none-any.whl
 (330kB)
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from 
apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/2a/1f/4124f15e1132a2eeeaf616d825990bb1d395b4c2c37362654ea5cd89bb42/proto-google-cloud-datastore-v1-0.90.4.tar.gz
Collecting googledatastore==7.0.1 (from apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/73/d0/17ce873331aaf529ab238464a15fd7bdc1ba8d2c684789970a7fa8b505a8/googledatastore-7.0.1.tar.gz
Collecting google-cloud-pubsub==0.26.0 (from apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/37/92/c74a643126d58505daec9addf872dfaffea3305981b90cc435f4b9213cdd/google_cloud_pubsub-0.26.0-py2.py3-none-any.whl
Collecting proto-google-cloud-pubsub-v1==0.15.4 (from apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/c0/a2/2eeffa0069830f00016196dfdd69491cf562372b5353f2e8e378b3c2cb0a/proto-google-cloud-pubsub-v1-0.15.4.tar.gz
Collecting google-cloud-bigquery==0.25.0 (from apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/76/67/6165c516ff6ceaa62eb61f11d8451e1b0acc4d3775e181630aba9652babb/google_cloud_bigquery-0.25.0-py2.py3-none-any.whl
 (41kB)
Collecting nose>=1.3.7 (from apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/99/4f/13fb671119e65c4dce97c60e67d3fd9e6f7f809f2b307e2611f4701205cb/nose-1.3.7-py2-none-any.whl
 (154kB)
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.5.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Requirement already satisfied: enum34>=1.0.4 in 
/usr/local/lib/python2.7/dist-packages (from 
grpcio<2,>=1.8->apache-beam==2.5.0.dev0) (1.1.6)
Requirement already satisfied: docopt in /usr/local/lib/python2.7/dist-packages 
(from hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in 
/usr/local/lib/python2.7/dist-packages (from 
hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (2.18.4)
Requirement already satisfied: pbr>=0.11 in /usr/lib/python2.7/dist-packages 
(from mock<3.0.0,>=1.0.1->apache-beam==2.5.0.dev0) (1.8.0)
Collecting funcsigs>=1; python_version < "3.3" (from 
mock<3.0.0,>=1.0.1->apache-beam==2.5.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Requirement already satisfied: pyasn1>=0.1.7 in 
/usr/local/lib/python2.7/dist-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.5.0.dev0) (0.4.2)
Collecting pyasn1-modules>=0.0.5 (from 
oauth2client<5,>=2.0.1->apache-beam==2.5.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e9/51/bcd96bf6231d4b2cc5e023c511bee86637ba375c44a6f9d1b4b7ad1ce4b9/pyasn1_modules-0.2.1-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<5,>=2.0.1->apache-beam==2.5.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e1/ae/baedc9cb175552e95f3395c43055a6a5e125ae4d48a1d7a924baca83e92e/rsa-3.4.2-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from protobuf<4,>=3.5.0.post1->apache-beam==2.5.0.dev0) (39.0.1)
Collecting fasteners>=0.14 (from 
google-apitools<=0.5.20,>=0.5.18->apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from 
proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.5.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/00/03/d25bed04ec8d930bcfa488ba81a2ecbf7eb36ae3ffd7e8f5be0d036a89c9/googleapis-common-protos-1.5.3.tar.gz
Collecting gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0 (

Build failed in Jenkins: beam_PerformanceTests_Spark #1631

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Stop implementing EvaluatorFactory in Registry

[aromanenko.dev] [BEAM-4066] Moved anonymous classes into inner ones

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

[iemejia] [BEAM-4018] Add a ByteKeyRangeTracker based on RestrictionTracker for

--
[...truncated 76.55 KB...]
2018-04-24 18:17:33,324 185b061d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 18:18:01,744 185b061d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 18:18:05,161 185b061d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r19168b6a12aa958c_0162f8e074f0_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r19168b6a12aa958c_0162f8e074f0_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r19168b6a12aa958c_0162f8e074f0_1 ... (0s) Current status: DONE   
2018-04-24 18:18:05,162 185b061d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 18:18:22,717 185b061d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 18:18:26,278 185b061d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r50573da36ae8226b_0162f8e0c6ee_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r50573da36ae8226b_0162f8e0c6ee_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r50573da36ae8226b_0162f8e0c6ee_1 ... (0s) Current status: DONE   
2018-04-24 18:18:26,278 185b061d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 18:18:43,799 185b061d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 18:18:47,513 185b061d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r4ffc45d3677a4848_0162f8e11949_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r4ffc45d3677a4848_0162f8e11949_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r4ffc45d3677a4848_0162f8e11949_1 ... (0s) Current status: DONE   
2018-04-24 18:18:47,513 185b061d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-24 18:19:10,540 185b061d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-24 18:19:14,151 185b061d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r72ad4a900aaba168_0162f8e181b8_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upl
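
Each retry above fails identically: with --autodetect, bq re-infers the schema 
from the JSON payload, and an epoch-style numeric timestamp value makes it 
propose FLOAT, which conflicts with the existing TIMESTAMP column in 
beam_performance.pkb_results, so repeating the same command cannot succeed. A 
hedged sketch of the usual fix is to pin the schema explicitly instead of 
autodetecting; results.json and the field list below are illustrative, not 
taken from this job:

    # Sketch only: supply the table's real schema in place of this field list.
    bq load \
      --source_format=NEWLINE_DELIMITED_JSON \
      --schema='timestamp:TIMESTAMP,value:FLOAT,metric:STRING' \
      beam_performance.pkb_results \
      results.json

Alternatively, emitting the timestamp as an ISO-8601 string in the JSON 
typically lets autodetection keep the TIMESTAMP type.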

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #191

2018-04-24 Thread Apache Jenkins Server
See 


--
[...truncated 26.75 MB...]
INFO: container-55 msg: [container-55] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-55 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-54
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-54 msg: [container-54] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-54 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-53
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-53 msg: [container-53] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-53 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-50
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-49
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-50 msg: [container-50] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-50 terminating.
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-49 msg: [container-49] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-49 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-48
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-48 msg: [container-48] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-48 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-44
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-44 msg: [container-44] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-44 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-40
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-40 msg: [container-40] Exiting heartbeat loop..
Apr 24, 2018 6:40:10 PM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-40 terminating.
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-39
Apr 24, 2018 6:40:10 PM com.datatorrent.stram.engine.StreamingContainer 
p

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #171

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1442

2018-04-24 Thread Apache Jenkins Server
See 


--
GitHub pull request #5213 of commit f809cc089c13a571c3ada9d0a88456110ac4db76, 
no merge conflicts.
[EnvInject] - Loading node environment variables.
Building remotely on beam23 (beam) in workspace 

Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/5213/*:refs/remotes/origin/pr/5213/*
 > git rev-parse refs/remotes/origin/pr/5213/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/5213/merge^{commit} # timeout=10
Checking out Revision 0730e5221f1f710fb5c936b893bfe2b164263ae3 
(refs/remotes/origin/pr/5213/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0730e5221f1f710fb5c936b893bfe2b164263ae3
Commit message: "Merge f809cc089c13a571c3ada9d0a88456110ac4db76 into 
d3cd6116517cce04cf5c35bbd2d05a494dd8fed3"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe 
/tmp/jenkins415255159412528903.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in /usr/lib/python2.7/dist-packages 
(15.0.1)

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
sdks/python/run_validatesrunner.sh: line 38: 
/home/jenkins/.local/bin//virtualenv: No such file or directory
Build step 'Execute shell' marked build as failure
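
The root cause is visible two steps earlier: `pip install virtualenv --user` 
reports the requirement already satisfied by the system package in 
/usr/lib/python2.7/dist-packages, so nothing is installed into 
$HOME/.local/bin/, and the script's hard-coded ${LOCAL_PATH}/virtualenv 
therefore does not exist. A hedged sketch of a more defensive lookup, with the 
same intent as the script's line but resolving whichever virtualenv is 
actually available:

    # Sketch: prefer the PATH-resolved binary, fall back to the module form.
    if command -v virtualenv >/dev/null 2>&1; then
      virtualenv sdks/python
    else
      python -m virtualenv sdks/python
    fi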


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #192

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Python_Verify #4780

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #174

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

--
[...truncated 19.67 MB...]
INFO: 2018-04-24T22:59:51.766Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Values/Values/Map into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.780Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Read
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.798Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify into 
PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.822Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Write into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.839Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign into 
PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.863Z: Unzipping flatten s18-u63 for input 
s19.output-c61
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.882Z: Fusing unzipped copy of 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous), through 
flatten s18-u63, into producer 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.909Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous) into 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.930Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.960Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:51.974Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:52.000Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:52.027Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 24, 2018 11:00:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-24T22:59:52.053Z: Fusing consumer 
PAssert$3/CreateActual/RewindowActuals/Window.Assign into 
PAssert$3/CreateActual/Flatten.Iterables/FlattenIterables/FlatMap
Apr 24, 2018 11:0

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #175

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

--
[...truncated 18.96 MB...]

org.apache.beam.examples.cookbook.MaxPerKeyExamplesTest > testExtractTempFn 
STANDARD_ERROR
Apr 24, 2018 11:31:21 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testFormatCounts 
STANDARD_ERROR
Apr 24, 2018 11:31:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testExtractTornadoes 
STANDARD_ERROR
Apr 24, 2018 11:31:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testNoTornadoes 
STANDARD_ERROR
Apr 24, 2018 11:31:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.TriggerExampleTest > testExtractTotalFlow 
STANDARD_ERROR
Apr 24, 2018 11:31:23 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > 
testFilterSingleMonthDataFn STANDARD_ERROR
Apr 24, 2018 11:31:23 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testProjectionFn 
STANDARD_ERROR
Apr 24, 2018 11:31:23 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractCountryInfoFn 
STANDARD_ERROR
Apr 24, 2018 11:31:23 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractEventDataFn 
STANDARD_ERROR
Apr 24, 2018 11:31:24 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.DebuggingWordCountTest > testDebuggingWordCount 
STANDARD_ERROR
Apr 24, 2018 11:31:24 PM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern 
/tmp/junit5219256489574877044/junit5278720245691410365.tmp matched 1 files with 
total size 54
Apr 24, 2018 11:31:24 PM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern 
/tmp/junit5219256489574877044/junit5278720245691410365.tmp into bundles of size 
3 took 1 ms and produced 1 files and 18 bundles

org.apache.beam.examples.WordCountTest > testExtractWordsFn STANDARD_ERROR
Apr 24, 2018 11:31:24 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.subprocess.ExampleEchoPipelineTest > 
testExampleEchoPipeline STANDARD_ERROR
Apr 24, 2018 11:31:26 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-Echo8566447910782988931.sh 
Apr 24, 2018 11:31:26 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:31:26 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-Echo8566447910782988931.sh to 
/tmp/test-Echoo508107964237318519/test-Echo8566447910782988931.sh 
Apr 24, 2018 11:31:26 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-EchoAgain6057674206472611287.sh 
Apr 24, 2018 11:31:26 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:31:26 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-EchoAgain6057674206472611287.sh to 
/tmp/test-Echoo508107964237318519/test-EchoAgain6057674206472611287.sh 

org.apache.beam.examples.complete.game.HourlyTeamScoreTest > 
testUserScoresFilter STANDARD_OUT
GOT user2_AmberCockatoo

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #176

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

--
[...truncated 19.49 MB...]
org.apache.beam.examples.cookbook.CombinePerKeyExamplesTest > 
testExtractLargeWordsFn STANDARD_ERROR
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.metrics.MetricsEnvironment 
getCurrentContainer
WARNING: Reporting metrics are not supported in the current execution 
environment.

org.apache.beam.examples.cookbook.MaxPerKeyExamplesTest > testFormatMaxesFn 
STANDARD_ERROR
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.MaxPerKeyExamplesTest > testExtractTempFn 
STANDARD_ERROR
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testFormatCounts 
STANDARD_ERROR
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testExtractTornadoes 
STANDARD_ERROR
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testNoTornadoes 
STANDARD_ERROR
Apr 24, 2018 11:32:49 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.TriggerExampleTest > testExtractTotalFlow 
STANDARD_ERROR
Apr 24, 2018 11:32:50 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > 
testFilterSingleMonthDataFn STANDARD_ERROR
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testProjectionFn 
STANDARD_ERROR
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractCountryInfoFn 
STANDARD_ERROR
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractEventDataFn 
STANDARD_ERROR
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.DebuggingWordCountTest > testDebuggingWordCount 
STANDARD_ERROR
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern 
/tmp/junit6674066865889181197/junit8041641849536197142.tmp matched 1 files with 
total size 54
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern 
/tmp/junit6674066865889181197/junit8041641849536197142.tmp into bundles of size 
3 took 0 ms and produced 1 files and 18 bundles

org.apache.beam.examples.WordCountTest > testExtractWordsFn STANDARD_ERROR
Apr 24, 2018 11:32:51 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.subprocess.ExampleEchoPipelineTest > 
testExampleEchoPipeline STANDARD_ERROR
Apr 24, 2018 11:32:53 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-Echo706128473611446487.sh 
Apr 24, 2018 11:32:53 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:32:53 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-Echo706128473611446487.sh to 
/tmp/test-Echoo9003518107771992924/test-Echo70612847361144

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #178

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

--
[...truncated 18.67 MB...]
org.apache.beam.examples.cookbook.FilterExamplesTest > 
testFilterSingleMonthDataFn STANDARD_ERROR
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testProjectionFn 
STANDARD_ERROR
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractCountryInfoFn 
STANDARD_ERROR
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractEventDataFn 
STANDARD_ERROR
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.DebuggingWordCountTest > testDebuggingWordCount 
STANDARD_ERROR
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern 
/tmp/junit7290404604861474461/junit7856709705613336601.tmp matched 1 files with 
total size 54
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern 
/tmp/junit7290404604861474461/junit7856709705613336601.tmp into bundles of size 
3 took 1 ms and produced 1 files and 18 bundles

org.apache.beam.examples.WordCountTest > testExtractWordsFn STANDARD_ERROR
Apr 24, 2018 11:40:22 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.subprocess.ExampleEchoPipelineTest > 
testExampleEchoPipeline STANDARD_ERROR
Apr 24, 2018 11:40:24 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-Echo5729650519303680627.sh 
Apr 24, 2018 11:40:24 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:40:24 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-Echo5729650519303680627.sh to 
/tmp/test-Echoo1092152855298419818/test-Echo5729650519303680627.sh 
Apr 24, 2018 11:40:24 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-EchoAgain3964862310941665987.sh 
Apr 24, 2018 11:40:24 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:40:24 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-EchoAgain3964862310941665987.sh to 
/tmp/test-Echoo1092152855298419818/test-EchoAgain3964862310941665987.sh 

org.apache.beam.examples.complete.game.HourlyTeamScoreTest > 
testUserScoresFilter STANDARD_OUT
GOT user13_ApricotQuokka,ApricotQuokka,15,144795563,2015-11-19 
09:53:53.444
GOT user6_AmberNumbat,AmberNumbat,11,144795563,2015-11-19 09:53:53.444
GOT user0_MagentaKangaroo,MagentaKangaroo,3,144795563,2015-11-19 
09:53:53.444
GOT user7_AlmondWallaby,AlmondWallaby,15,144795563,2015-11-19 
09:53:53.444
GOT 
user7_AndroidGreenKookaburra,AndroidGreenKookaburra,12,144795563,2015-11-19 
09:53:53.444
GOT 
user7_AndroidGreenKookaburra,AndroidGreenKookaburra,11,144795563,2015-11-19 
09:53:53.444
GOT user19_BisqueBilby,BisqueBilby,6,144795563,2015-11-19 09:53:53.444
GOT user0_MagentaKangaroo,MagentaKangaroo,4,144796569,2015-11-19 
12:41:31.053
GOT user18_BananaEmu,BananaEmu,7,144796569,2015-11-19 12:41:31.053
GOT user2_AmberCockatoo,AmberCockatoo,13,144796569,2015-11-19 
12:41:31.053
GOT user18_BananaEmu,BananaEmu,1,144796569,2015-11-19 12:41:31.053
GOT 
user0_AndroidGreenEchidna,AndroidGreenEchidna,0,144796569,2015-11-19 
12:41:31.053
GOT user19_BisqueBilby,BisqueBilby,8,144795563,2015-11-19 09:53:53.444
GOT user18_ApricotCaneToad,ApricotCaneToad,14,144796569,2015-11-19 
12:41:31.053
GOT user3_BananaEmu,BananaEmu,17,144796569,2015-11-19 12:41:31.053

org.apache.beam.examples.complete.game.UserScoreTest > testTeamScoreSums 
STANDARD_OUT
GOT user0_MagentaKangaroo,MagentaKangaroo,3,144795563,2015-11-19 
09:53:53

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #177

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add primitive java types support to Row generation logic, add example

--
[...truncated 18.68 MB...]
org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testNoTornadoes 
STANDARD_ERROR
Apr 24, 2018 11:40:09 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.TriggerExampleTest > testExtractTotalFlow 
STANDARD_ERROR
Apr 24, 2018 11:40:10 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > 
testFilterSingleMonthDataFn STANDARD_ERROR
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testProjectionFn 
STANDARD_ERROR
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractCountryInfoFn 
STANDARD_ERROR
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractEventDataFn 
STANDARD_ERROR
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.DebuggingWordCountTest > testDebuggingWordCount 
STANDARD_ERROR
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern 
/tmp/junit2245267671230643969/junit2476733885584945703.tmp matched 1 files with 
total size 54
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern 
/tmp/junit2245267671230643969/junit2476733885584945703.tmp into bundles of size 
3 took 0 ms and produced 1 files and 18 bundles

org.apache.beam.examples.WordCountTest > testExtractWordsFn STANDARD_ERROR
Apr 24, 2018 11:40:11 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.subprocess.ExampleEchoPipelineTest > 
testExampleEchoPipeline STANDARD_ERROR
Apr 24, 2018 11:40:13 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-Echo3173891149910928532.sh 
Apr 24, 2018 11:40:13 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:40:13 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-Echo3173891149910928532.sh to 
/tmp/test-Echoo802144664658894864/test-Echo3173891149910928532.sh 
Apr 24, 2018 11:40:13 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-EchoAgain3022186494051844904.sh 
Apr 24, 2018 11:40:13 PM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 24, 2018 11:40:13 PM 
org.apache.beam.examples.subprocess.utils.FileUtils copyFileFromGCSToWorker
INFO: Moving File /tmp/test-EchoAgain3022186494051844904.sh to 
/tmp/test-Echoo802144664658894864/test-EchoAgain3022186494051844904.sh 

org.apache.beam.examples.complete.game.HourlyTeamScoreTest > 
testUserScoresFilter STANDARD_OUT
GOT user18_BananaEmu,BananaEmu,7,144796569,2015-11-19 12:41:31.053
GOT user6_AmberNumbat,AmberNumbat,11,144795563,2015-11-19 09:53:53.444
GOT user18_BananaEmu,BananaEmu,1,144796569,2015-11-19 12:41:31.053
GOT user0_MagentaKangaroo,MagentaKangaroo,4,144796569,2015-11-19 
12:41:31.053
GOT user3_BananaEmu,BananaEmu,17,144796569,2015-11-19 12:41:31.053
GOT user0_MagentaKangaroo,MagentaKangaroo,3,144795563,2015-11-19 
09:53:53.444
GOT user2_AmberCockatoo,AmberCockatoo,13,144796569,2015-11-19 
12:41:31.053
GOT user19_BisqueBilby,BisqueBilby,6,144795563,2015-11-19 09:53:53.444
GOT user13_ApricotQuokka,ApricotQuokka,15,144795563,2015-11-19 
09:53:53.444
GOT 
user7_AndroidGreenKookaburra,AndroidGreenKookaburra,12,144795563,2015-11-19 
09:53:53.444
GOT user7_AlmondWallaby,AlmondWallaby,

Build failed in Jenkins: beam_PerformanceTests_Python #1189

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 4.30 KB...]
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15)) (1.0)
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18.4)
Collecting xmltodict (from pywinrm->-r PerfKitBenchmarker/requirements.txt 
(line 25))
  Using cached 
https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Requirement already satisfied: cryptography>=1.3 in 
/usr/local/lib/python2.7/dist-packages (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.2.2)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/69/bc/230987c0dc22c763529330b2e669dbdba374d6a10c1f61232274184731be/ntlm_auth-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: certifi>=2017.4.17 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2018.4.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.22)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != 
"PyPy" in /usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.11.5)
Requirement already satisfied: enum34; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (0.24.0)
Requirement already satisfied: ipaddress; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.0.22)
Requirement already satisfied: pycparser in 
/usr/local/lib/python2.7/dist-packages (from cffi>=1.7; 
platform_python_implementation != 
"PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
Installing collected packages: absl-py, colorama, colorlog, blinker, futures, 
pint, numpy, contextlib2, ntlm-auth, requests-ntlm, xmltodict, pywinr

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #182

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 48.14 KB...]
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the 
shaded jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.23.0 from 
the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.23.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.23.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev233-1.23.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev124-1.23.0 from the 
shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.23.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.23.0 
from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:integration-test (default) @ 
beam-sdks-java-io-hadoop-input-format ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0 s <<< 
FAILURE! - in org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] org.apache.beam.sdk.io.hadoop.inputformat.Hadoo

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #96

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 238.39 KB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hd
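
Underneath the wrapped Beam and HDFS frames, the decisive line is `Caused by: 
java.net.ConnectException: Connection refused`: the DFS client cannot open the 
namenode's RPC socket, so HadoopFileSystem.create() fails before any IO test 
logic runs. A minimal sketch for confirming this from the worker, assuming a 
namenode host and the common default RPC port 8020 (both placeholders; the 
actual cluster address is not in this truncated log):

    # Hedged check; NAMENODE_HOST is a placeholder, not a value from this log.
    nc -z -w 5 "$NAMENODE_HOST" 8020 || echo "namenode RPC unreachable"
    # With a configured client (HADOOP_CONF_DIR set), a fuller health probe:
    hdfs dfsadmin -report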

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #180

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[wcn] Allow request and init hooks to update the context.

--
[...truncated 19.13 MB...]
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.412Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous) into 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.437Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.461Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.482Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.509Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.537Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.572Z: Fusing consumer 
PAssert$3/CreateActual/RewindowActuals/Window.Assign into 
PAssert$3/CreateActual/Flatten.Iterables/FlattenIterables/FlatMap
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.604Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.624Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.645Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
Apr 25, 2018 12:15:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:15:26.671Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+PAssert$3/CreateActual/

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #179

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[wcn] Drain source when user function processing fails.

--
[...truncated 22.03 MB...]
INFO: 2018-04-25T00:14:48.904Z: Unzipping flatten s18-u63 for input 
s19.output-c61
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:48.947Z: Fusing unzipped copy of 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous), through 
flatten s18-u63, into producer 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:48.986Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous) into 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.011Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.040Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.065Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.097Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.133Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.176Z: Fusing consumer 
PAssert$3/CreateActual/RewindowActuals/Window.Assign into 
PAssert$3/CreateActual/Flatten.Iterables/FlattenIterables/FlatMap
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.211Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.251Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Apr 25, 2018 12:14:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:14:49.279Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.Glo

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #90

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 77.37 KB...]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.copy(HadoopFileSystem.java:131)
at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:301)
at 
org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:801)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.getFileInfo(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.copy(HadoopFileSystem.java:131)
at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:301)
at 
org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:801)
at 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn$DoFnInvoker.invokeProcessEle
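
Every retry in the trace above bottoms out in java.net.ConnectException:
Connection refused raised by Hadoop's IPC client, meaning nothing was
listening at the configured namenode address; the Beam frames
(HadoopFileSystem.copy, FileSystems.copy, FileBasedSink.moveToOutputFiles)
are only the callers. A minimal sketch of how a pipeline is pointed at
HDFS through HadoopFileSystemOptions, with a hypothetical namenode
address (the Jenkins job gets its cluster from Kubernetes):

import java.util.Collections;
import org.apache.beam.sdk.io.hdfs.HadoopFileSystemOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.hadoop.conf.Configuration;

public class HdfsConfigSketch {
  public static void main(String[] args) {
    // Hypothetical address; "Connection refused" here means no namenode
    // is reachable at fs.defaultFS, not that the Beam code is wrong.
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

    HadoopFileSystemOptions options =
        PipelineOptionsFactory.as(HadoopFileSystemOptions.class);
    options.setHdfsConfiguration(Collections.singletonList(conf));
    // With this set, FileSystems.copy()/create() on hdfs:// paths go
    // through HadoopFileSystem, as in the stack frames above.
  }
}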

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #91

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 37.72 KB...]
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-pubsub:jar:v1-rev382-1.23.0 from the shaded 
jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 
from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding commons-logging:commons-logging:jar:1.2 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengin

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #89

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 258.16 KB...]
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:

Jenkins build is back to normal : beam_PerformanceTests_JDBC #493

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1632

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[wcn] Allow request and init hooks to update the context.

--
[...truncated 85.37 KB...]
2018-04-25 00:17:04,088 073d9fd4 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 00:17:29,143 073d9fd4 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 00:17:32,717 073d9fd4 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r3f5510e62ff105ad_0162fa298c8b_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete. Waiting on bqjob_r3f5510e62ff105ad_0162fa298c8b_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r3f5510e62ff105ad_0162fa298c8b_1 ... (0s) Current status: DONE   
2018-04-25 00:17:32,718 073d9fd4 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 00:17:58,647 073d9fd4 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 00:18:02,175 073d9fd4 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r31e14ef016565f2c_0162fa29ffed_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete. Waiting on bqjob_r31e14ef016565f2c_0162fa29ffed_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r31e14ef016565f2c_0162fa29ffed_1 ... (0s) Current status: DONE   
2018-04-25 00:18:02,176 073d9fd4 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 00:18:22,590 073d9fd4 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 00:18:26,212 073d9fd4 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r548fbc5e2bd27b9e_0162fa2a5d57_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete. Waiting on bqjob_r548fbc5e2bd27b9e_0162fa2a5d57_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r548fbc5e2bd27b9e_0162fa2a5d57_1 ... (0s) Current status: DONE   
2018-04-25 00:18:26,213 073d9fd4 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 00:18:41,674 073d9fd4 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 00:18:45,238 073d9fd4 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testi
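
Every retry fails identically because the error is deterministic, not
transient: beam_performance.pkb_results already declares the timestamp
column as TIMESTAMP, bq load --autodetect re-infers the field as FLOAT
from the numeric JSON values, and BigQuery rejects that as an invalid
schema update, so IssueRetryableCommand can never succeed. Supplying an
explicit schema instead of autodetecting avoids the re-inference. A
hedged Java sketch of such a schema (PerfKit itself shells out to the bq
CLI, and the second field here is hypothetical):

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Arrays;

public class PkbResultsSchemaSketch {
  public static void main(String[] args) {
    // Pin the column to TIMESTAMP up front so an epoch-style numeric
    // value in the JSON cannot be re-inferred as FLOAT during the load.
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("timestamp").setType("TIMESTAMP"),
        new TableFieldSchema().setName("value").setType("FLOAT")));
    System.out.println(schema);
  }
}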

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #181

2018-04-24 Thread Apache Jenkins Server
See 


--
[...truncated 19.62 MB...]
INFO: 2018-04-25T00:47:14.516Z: Fusing consumer 
PAssert$3/CreateActual/Flatten.Iterables/FlattenIterables/FlatMap into 
PAssert$3/CreateActual/ExtractPane/Map
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.566Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Values/Values/Map into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.593Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Read
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.617Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify into 
PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.651Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Write into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.671Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign into 
PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.697Z: Unzipping flatten s18-u63 for input 
s19.output-c61
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.726Z: Fusing unzipped copy of 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous), through 
flatten s18-u63, into producer 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.757Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous) into 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.777Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.810Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.836Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.872Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T00:47:14.898Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 25, 2018 12:47:17 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-2

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #182

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

--
[...truncated 19.37 MB...]

org.apache.beam.examples.cookbook.TriggerExampleTest > testExtractTotalFlow 
STANDARD_ERROR
Apr 25, 2018 2:40:23 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > 
testFilterSingleMonthDataFn STANDARD_ERROR
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testProjectionFn 
STANDARD_ERROR
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractCountryInfoFn 
STANDARD_ERROR
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractEventDataFn 
STANDARD_ERROR
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.

org.apache.beam.examples.DebuggingWordCountTest > testDebuggingWordCount 
STANDARD_ERROR
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern /tmp/junit120444639599036259/junit192480798379155287.tmp 
matched 1 files with total size 54
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.io.FileBasedSource split
INFO: Splitting filepattern 
/tmp/junit120444639599036259/junit192480798379155287.tmp into bundles of size 3 
took 1 ms and produced 1 files and 18 bundles
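
The bundle count follows directly from the two numbers in this entry: a
54-byte file split with a desired bundle size of 3 bytes yields
54 / 3 = 18 bundles. As a worked check:

public class BundleMathSketch {
  public static void main(String[] args) {
    long totalSizeBytes = 54;   // "total size 54" above
    long bundleSizeBytes = 3;   // "bundles of size 3" above
    // Effectively ceil(totalSize / bundleSize) for a splittable source.
    long bundles = (totalSizeBytes + bundleSizeBytes - 1) / bundleSizeBytes;
    System.out.println(bundles); // prints 18, matching the log
  }
}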

org.apache.beam.examples.WordCountTest > testExtractWordsFn STANDARD_ERROR
Apr 25, 2018 2:40:24 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.
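
The warning, repeated for each DoFnTester-based test above, names the
supported alternative: TestPipeline runs the code under test through a
real runner, exercising the setup, bundling, and teardown paths that
DoFnTester may skip. A minimal sketch of the recommended pattern, with a
hypothetical transform under test:

import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.junit.Rule;
import org.junit.Test;

public class WordLengthTest {
  @Rule public final transient TestPipeline p = TestPipeline.create();

  @Test
  public void lengthsViaTestPipeline() {
    // Drive the transform through a real (direct-runner) pipeline
    // instead of DoFnTester, as the warning recommends.
    PCollection<Integer> lengths =
        p.apply(Create.of("a", "bb", "ccc"))
            .apply(MapElements.into(TypeDescriptors.integers())
                .via(String::length));
    PAssert.that(lengths).containsInAnyOrder(1, 2, 3);
    p.run().waitUntilFinish();
  }
}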

org.apache.beam.examples.subprocess.ExampleEchoPipelineTest > 
testExampleEchoPipeline STANDARD_ERROR
Apr 25, 2018 2:40:26 AM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-Echo7195164603381926811.sh 
Apr 25, 2018 2:40:26 AM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 25, 2018 2:40:26 AM org.apache.beam.examples.subprocess.utils.FileUtils 
copyFileFromGCSToWorker
INFO: Moving File /tmp/test-Echo7195164603381926811.sh to 
/tmp/test-Echoo8917452202130660514/test-Echo7195164603381926811.sh 
Apr 25, 2018 2:40:26 AM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils initSemaphore
INFO: Initialized Semaphore for binary test-EchoAgain591204249350566431.sh 
Apr 25, 2018 2:40:26 AM 
org.apache.beam.examples.subprocess.utils.CallingSubProcessUtils setUp
INFO: Calling filesetup to move Executables to worker.
Apr 25, 2018 2:40:26 AM org.apache.beam.examples.subprocess.utils.FileUtils 
copyFileFromGCSToWorker
INFO: Moving File /tmp/test-EchoAgain591204249350566431.sh to 
/tmp/test-Echoo8917452202130660514/test-EchoAgain591204249350566431.sh 

org.apache.beam.examples.complete.game.HourlyTeamScoreTest > 
testUserScoresFilter STANDARD_OUT
GOT user3_BananaEmu,BananaEmu,17,144796569,2015-11-19 12:41:31.053
GOT user0_MagentaKangaroo,MagentaKangaroo,4,144796569,2015-11-19 
12:41:31.053
GOT user18_BananaEmu,BananaEmu,1,144796569,2015-11-19 12:41:31.053
GOT user2_AmberCockatoo,AmberCockatoo,13,144796569,2015-11-19 
12:41:31.053
GOT user0_MagentaKangaroo,MagentaKangaroo,3,144795563,2015-11-19 
09:53:53.444
GOT user18_ApricotCaneToad,ApricotCaneToad,14,144796569,2015-11-19 
12:41:31.053
GOT user19_BisqueBilby,BisqueBilby,6,144795563,2015-11-19 09:53:53.444
GOT user18_BananaEmu,BananaEmu,7,144796569,2015-11-19 12:41:31.053
GOT 
user7_AndroidGreenKookaburra,AndroidGreenKookaburra,11,144795563,2015-11-19 
09:53:53.444
GOT 
user7_AndroidGreenKookaburra,AndroidGreenKookaburra,12,144795563,2015-11-19 
09:53:53.444
GOT user19_BisqueBilby,BisqueBilby,8,144795563,2015-11-19 09:53:53.444
GOT user13_ApricotQuokka,ApricotQuokka,15,144795563,2015-11-19 
09:53:53.444
GOT user7_AlmondWallaby,AlmondWallaby,15,144795563,2015-11-19 
09:53:53.444
GOT 
user0_AndroidGreenEchidna,AndroidGreenEchi

Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #114

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Stop implementing EvaluatorFactory in Registry

[Pablo] Python Metrics now rely on StateSampler state.

[tgroh] Cleanups in GroupByKeyOnlyEvaluatorFactory

[tgroh] Add a CollectionT to Bundle

[tgroh] Use Bundle in WatermarkManager

[apilloud] [BEAM-3983] Add utils for converting to BigQuery types

[apilloud] [SQL] Embed BeamSqlTable in BeamCalciteTable

[owenzhang1990] [BEAM-4129] Run WordCount example on Gearpump runner with Gradle

[aromanenko.dev] [BEAM-4066] Moved anonymous classes into inner ones

[sidhom] [BEAM-4149] Set worker id to "" if it is not set in the request header

[sidhom] [BEAM-3327] Refactor ControlClientPool to allow client multiplexing

[sidhom] [BEAM-3327] Basic Docker environment factory

[sidhom] Fix python lint error

[robertwb] [BEAM-4097] Set environment for Python sdk function specs.

[robertwb] Logging around default docker image environment.

[iemejia] [BEAM-4018] Add a ByteKeyRangeTracker based on RestrictionTracker for

[rangadi] Disable flaky unbounded pipeline test

[wcn] Drain source when user function processing fails.

[kedin] Add primitive java types support to Row generation logic, add example

[aaltay] Unpinning Python jobs from Jenkins machines. (#5214)

[kirpichov] [BEAM-4166] Invoke @Setup in FnApiDoFnRunner

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

[wcn] Allow request and init hooks to update the context.

--
[...truncated 1.03 KB...]
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9c2b43227e1ddac39676f6c09aca1af82a9d4cdb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9c2b43227e1ddac39676f6c09aca1af82a9d4cdb
Commit message: "Merge pull request #5120: [BEAM-4160] Add JsonToRow transform"
 > git rev-list --no-walk 0f2ba71e1b6db88ed79744e363586a8ff16dcb08 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesContainer_Dataflow] $ /bin/bash -xe 
/tmp/jenkins8141324131341984009.sh
+ cd src
+ bash sdks/python/container/run_validatescontainer.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Where to store integration test outputs.
GCS_LOCATION=gs://temp-storage-for-end-to-end-tests

# Project for the container and integration test
PROJECT=apache-beam-testing

# Verify in the root of the repository
test -d sdks/python/container

# Verify docker and gcloud commands exist
command -v docker
/usr/bin/docker
command -v gcloud
/usr/bin/gcloud
docker -v
Docker version 17.05.0-ce, build 89658be
gcloud -v
Google Cloud SDK 191.0.0
alpha 2018.02.23
beta 2018.02.23
bq 2.0.29
core 2018.02.23
gsutil 4.28

# ensure gcloud is version 186 or above
TMPDIR=$(mktemp -d)
mktemp -d
gcloud_ver=$(gcloud -v | head -1 | awk '{print $4}')
gcloud -v | head -1 | awk '{print $4}'
if [[ "$gcloud_ver" < "186" ]]
then
  pushd $TMPDIR
  curl 
https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-186.0.0-linux-x86_64.tar.gz
 --output gcloud.tar.gz
  tar xf gcloud.tar.gz
  ./google-cloud-sdk/install.sh --quiet
  . ./google-cloud-sdk/path.bash.inc
  popd
  gcloud components update --quiet || echo 'gcloud components update failed'
  gcloud -v
fi

# Build the container
TAG=$(date +%Y%m%d-%H%M%S)
date +%Y%m%d-%H%M%S
CONTAINER=us.gcr.io/$PROJECT/$USER/python
echo "Using container $CONTAINER"
Using container us.gcr.io/apache-beam-testing/jenkins/python
./gradlew :beam-sdks-python-container:docker 
-Pdocker-repository-root=us.gcr.io/$PROJECT/$USER -Pdocker-tag=$TAG
Parallel execution with configuration on demand is an incubating feature.
Applying build_rules.gradle to beam
createPerformanceTestHarness with default configuration for project beam
Adding 48 .gitignore exclusions to Apache Rat
Applying build_rules.gradle to beam-sdks-python-container
applyGoNature with default configuration for project beam-sdks-python-container
applyDockerNature with default configuration for project 
beam-sdks-python-container
containerImageName with [name:python] for project beam-sdks-python-container
Applying build_rules.gradle to beam-sdks-go
applyGoNature with default configuration for project beam-sdks-go
:beam-sdks-go:prepare
:beam-sdks-python-container:prepare
Use project G

Build failed in Jenkins: beam_PerformanceTests_Python #1190

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

--
[...truncated 4.29 KB...]
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15)) (1.0)
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18.4)
Collecting xmltodict (from pywinrm->-r PerfKitBenchmarker/requirements.txt 
(line 25))
  Using cached 
https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Requirement already satisfied: cryptography>=1.3 in 
/usr/local/lib/python2.7/dist-packages (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.2.2)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/69/bc/230987c0dc22c763529330b2e669dbdba374d6a10c1f61232274184731be/ntlm_auth-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: certifi>=2017.4.17 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2018.4.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.22)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != 
"PyPy" in /usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.11.5)
Requirement already satisfied: enum34; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (0.24.0)
Requirement already satisfied: ipaddress; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.0.22)
Requirement already satisfied: pycparser in 
/usr/local/lib/python2.7/dist-packages (from cffi>=1.7; 
platform_python_implementation != 
"PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
Installing collected packages: absl-py, colorama, colorlog, blinker, futures, 
pint, numpy, contextlib2, ntlm-auth, requests-ntlm, xmltodict, pywinrm
Successfully installed absl-py-0.2.0 blinker-1.4 colorama-0.3.9 colorlog-2.6.0 
contextlib2-0.5.5 futures-3.2.0 ntlm-auth-1.1.0 numpy-1.13.3 pint-0.8.1 
pywinrm-0.3.0 requests-ntlm-1.1.0 xmltodict-0.11.0
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins62554953466333455.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #97

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

--
[...truncated 1.83 KB...]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins1046106401685166210.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3456340541571188019.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1524632470474
namespace "filebasedioithdfs-1524632470474" created
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins155845714309385347.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1524632470474
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3882401479101503587.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2576143224903875620.sh
+ rm -rf .env
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2443205291596115845.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5532319485673771361.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.0.1)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7834433788017224899.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7772899530875752107.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.0.1)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from abs

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #90

2018-04-24 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_HadoopInputFormat #183

2018-04-24 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #92

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

--
[...truncated 53.10 KB...]
at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
at 
com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
at com.mongodb.Mongo.execute(Mongo.java:772)
at com.mongodb.Mongo$2.execute(Mongo.java:759)
at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
at 
com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches ReadPreferenceServerSelector{readPreference=primary}. 
Client view of cluster state is {type=UNKNOWN, 
servers=[{address=104.198.233.11:27017, type=UNKNOWN, state=CONNECTING, 
exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, 
caused by {java.net.SocketTimeoutException: connect timed out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
at 
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
at 
com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
at com.mongodb.Mongo.execute(Mongo.java:772)
at com.mongodb.Mongo$2.execute(Mongo.java:759)
at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
at 
com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
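
Both stack traces above begin in MongoDbIO$BoundedMongoDbSource.split(),
which issues a server command via runCommand while splitting the source;
with no mongod reachable at 104.198.233.11:27017, the driver's server
selection gives up and surfaces the MongoTimeoutException. A minimal
sketch of the read that exercises this path; the database and collection
names are hypothetical:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.mongodb.MongoDbIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class MongoReadSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // split() runs at execution time, so an unreachable server fails the
    // job with the timeout seen above rather than at graph construction.
    p.apply(MongoDbIO.read()
        .withUri("mongodb://104.198.233.11:27017") // address from the log
        .withDatabase("beam")                      // hypothetical
        .withCollection("perf_results"));          // hypothetical
    p.run().waitUntilFinish();
  }
}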

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #91

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

--
[...truncated 482.76 KB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpilli

Build failed in Jenkins: beam_PerformanceTests_Spark #1633

2018-04-24 Thread Apache Jenkins Server
See 


Changes:

[kedin] Add JsonToRow Transform

[kedin] Convert JsonToRow from MapElements.via() to ParDo

--
[...truncated 68.17 KB...]
2018-04-25 06:16:30,026 fbf3e0cc MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 06:16:52,676 fbf3e0cc MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 06:16:56,244 fbf3e0cc MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r5219cec03b123d0f_0162fb729528_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete. Waiting on bqjob_r5219cec03b123d0f_0162fb729528_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r5219cec03b123d0f_0162fb729528_1 ... (0s) Current status: DONE   
2018-04-25 06:16:56,244 fbf3e0cc MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 06:17:14,663 fbf3e0cc MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 06:17:18,093 fbf3e0cc MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r364a703291b655c0_0162fb72eb18_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r364a703291b655c0_0162fb72eb18_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r364a703291b655c0_0162fb72eb18_1 ... (0s) Current status: DONE   
2018-04-25 06:17:18,093 fbf3e0cc MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 06:17:34,833 fbf3e0cc MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 06:17:38,235 fbf3e0cc MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r5e6febca615b74ab_0162fb7339c6_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r5e6febca615b74ab_0162fb7339c6_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r5e6febca615b74ab_0162fb7339c6_1 ... (0s) Current status: DONE   
2018-04-25 06:17:38,236 fbf3e0cc MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-25 06:18:04,308 fbf3e0cc MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-25 06:18:07,752 fbf3e0cc MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r60942179a480a175_0162fb73ad0b_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r60942179a480a175_0162fb73ad0b_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r60942179a480a175_0162fb73ad0b_1 ... (0s) Current status: DONE   
2018-04-25 06:18:07,752 fbf3e0cc MainThread INFO R
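
All four load attempts above fail identically: with --autodetect, bq re-infers the schema from the JSON payload, and a numeric timestamp value makes autodetect propose FLOAT where the existing beam_performance.pkb_results column is TIMESTAMP, which BigQuery rejects as an invalid schema update. The usual fix is to pass the table's existing schema explicitly instead of autodetecting. A sketch assuming the bq CLI; results.json stands in for the results file whose path is elided in the log:

  # Capture the table's current schema, then load against it instead of --autodetect.
  bq show --schema --format=prettyjson beam_performance.pkb_results > schema.json
  bq load --source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results results.json schema.json

With the schema pinned, a numeric epoch value in the timestamp field is interpreted against the declared TIMESTAMP type rather than re-typing the column.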

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #183

2018-04-24 Thread Apache Jenkins Server
See 


--
[...truncated 19.82 MB...]
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/org/tukaani/xz/1.5/xz-1.5.jar 
to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/xz-1.5-UQUOWVswjErsisMU9m4YvA.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/junit/junit/4.12/junit-4.12.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/junit-4.12-WzjEDJf70K3uKfkeYEBVhA.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.9/jackson-databind-2.8.9.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/jackson-databind-2.8.9-LY9EwV_rjXYnHufFJYsgcg.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/jackson-core-asl-1.9.13-MZxJpDBOP6n-PNjc_ACdNw.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/error_prone_annotations-2.0.15-npnuyJrjs2dF75GdvUBIvQ.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/beam-runners-direct-java-2.5.0-SNAPSHOT-shaded-8Wx_RGDnr32LHdCGbTpUbQ.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/paranamer-2.7-VweilzYySf_-OOgYnNb5yw.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/beam-sdks-java-core-2.5.0-SNAPSHOT-shaded-9Ma709ZquatgS7CyJSnItw.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/3.2.0/protobuf-java-3.2.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/protobuf-java-3.2.0-fh30Gescj5k_IhxJOGxriw.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/mockito/mockito-core/1.9.5/mockito-core-1.9.5.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/mockito-core-1.9.5-b3PPBKVutgqqmWUG58EPxw.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/jackson-mapper-asl-1.9.13-F1D5wzk1L8S3KNYbVxcWEw.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-handler-proxy/4.1.8.Final/netty-handler-proxy-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/netty-handler-proxy-4.1.8.Final-Zey48Fj4mlWtgpwIc7Osgg.jar
Apr 25, 2018 6:42:21 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-context/1.9.0/28b0836f48c9705abf73829bbc536dba29a1329a/grpc-context-1.9.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline
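
When staging stalls like this or a later step cannot find a dependency, the staged artifacts can be checked directly, since each upload above logs its destination. A sketch assuming gsutil is available, with the prefix copied from the log:

  gsutil ls gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425064219-5ddebc87/output/results/staging/ | head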

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #184

2018-04-25 Thread Apache Jenkins Server
See 


Changes:

[szewinho] [BEAM-4153] Fixing performance test of spark, added option to trigger

--
[...truncated 19.33 MB...]
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:24.969Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Values/Values/Map into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:24.999Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Read
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.032Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify into 
PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.063Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Write into 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.095Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign into 
PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.123Z: Unzipping flatten s18-u63 for input 
s19.output-c61
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.147Z: Fusing unzipped copy of 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous), through 
flatten s18-u63, into producer 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.170Z: Fusing consumer 
PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous) into 
PAssert$3/CreateActual/FilterActuals/Window.Assign
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.195Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.225Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.257Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.288Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.311Z: Fusing consumer 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
Apr 25, 2018 8:03:30 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T08:03:25.343Z: Fusing consumer 
PAssert$3/CreateActual/RewindowActuals/

Jenkins build is back to normal : beam_PerformanceTests_Spark #1634

2018-04-25 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Python #1191

2018-04-25 Thread Apache Jenkins Server
See 


Changes:

[szewinho] [BEAM-4153] Fixing performance test of spark, added option to trigger

--
[...truncated 4.27 KB...]
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15)) (1.0)
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18.4)
Collecting xmltodict (from pywinrm->-r PerfKitBenchmarker/requirements.txt 
(line 25))
  Using cached 
https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Requirement already satisfied: cryptography>=1.3 in 
/usr/local/lib/python2.7/dist-packages (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.2.2)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/69/bc/230987c0dc22c763529330b2e669dbdba374d6a10c1f61232274184731be/ntlm_auth-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: certifi>=2017.4.17 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2018.4.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.22)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != 
"PyPy" in /usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.11.5)
Requirement already satisfied: enum34; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (0.24.0)
Requirement already satisfied: ipaddress; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.0.22)
Requirement already satisfied: pycparser in 
/usr/local/lib/python2.7/dist-packages (from cffi>=1.7; 
platform_python_implementation != 
"PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
Installing collected packages: absl-py, colorama, colorlog, blinker, futures, 
pint, numpy, contextlib2, ntlm-auth, requests-ntlm, xmltodict, pywinrm
Successfully installed absl-py-0.2.0 blinker-1.4 colorama-0.3.9 colorlog-2.6.0 
contextlib2-0.5.5 futures-3.2.0 ntlm-auth-1.1.0 numpy-1.13.3 pint-0.8.1 
pywinrm-0.3.0 requests-ntlm-1.1.0 xmltodict-0.11.0
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3539505579878232616.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from a

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #91

2018-04-25 Thread Apache Jenkins Server
See 


Changes:

[szewinho] [BEAM-4153] Fixing performance test of spark, added option to trigger

--
[...truncated 1.82 KB...]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6858490278701700698.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6207107033845527333.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1524650483796
namespace "filebasedioithdfs-1524650483796" created
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8734289686146092525.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1524650483796
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins23258637882795824.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins419977072506853107.sh
+ rm -rf .env
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins515334786514270.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins1827607456067705615.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.0.1)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7790167048729912815.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins751187617529303451.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.0.1)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/
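
The setup this job performs before handing off to PerfKitBenchmarker can be replayed by hand from the commands visible in the transcript; a consolidated sketch, with $KUBECONFIG_PATH standing in for the kubeconfig path that is elided in the log:

  # Point kubectl at the io-datastores GKE cluster and the fresh test namespace.
  kubectl --kubeconfig="$KUBECONFIG_PATH" create namespace filebasedioithdfs-1524650483796
  kubectl --kubeconfig="$KUBECONFIG_PATH" config set-context gke_apache-beam-testing_us-central1-a_io-datastores --namespace=filebasedioithdfs-1524650483796
  # Build the Python environment PerfKitBenchmarker runs in.
  virtualenv .env --system-site-packages
  .env/bin/pip install --upgrade setuptools pip
  git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
  .env/bin/pip install -r PerfKitBenchmarker/requirements.txt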

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #98

2018-04-25 Thread Apache Jenkins Server
See 



