See <https://builds.apache.org/job/beam_PerformanceTests_Dataflow/339/display/redirect?page=changes>

Changes:

[wtanaka] Rename openWindows => windowsThatAreOpen

[chamikara] Adding validatesrunner test for sources

[lcwik] Update pom.xml

[lcwik] [BEAM-2060] Allow to specify charset in XmlIO

[lcwik] [BEAM-2060] Use withCharset(Charset) for the user facing API

[dhalperi] Remove unnecessary semicolons

[tgroh] Use Structural Value when comparing Window(AndTrigger) Namespaces

[altay] Add ValueProvider tests to json_value_test.py

[bchambers] [BEAM-1148] Port PAssert away from aggregators

[sisk] TextIO & AvroIO no longer validate schemas against IOChannelFactory

[altay] Increase post commit timeout from 600 seconds to 900

[robertwb] Add instructions to regenerate Python proto wrappers.

[robertwb] Generate python proto wrappers for runner and fn API.

[robertwb] Add apache licence to generated files.

[robertwb] Adapt to PR #2505 changes to protos.

[lcwik] [BEAM-1871] Move ProtoCoder to new sdks/java/extensions/protobuf

[tgroh] Properly Declare Protobuf extension dependencies

[tgroh] Stop Extending AtomicCoder

[altay] Pin apitools to 0.5.8

[iemejia] Fix error with the artifactId on Flink runner after refactor

[iemejia] [BEAM-2075] Update flink runner to use flink version 1.2.1

[aljoscha.krettek] [BEAM-1772] Support merging WindowFn other than IntervalWindow on Flink

[aljoscha.krettek] [BEAM-1812] Add externalized checkpoint configuration to

[dhalperi] ByteKey: remove ByteString from public API, replace with ByteBuffer

[lcwik] [BEAM-1877] Use Iterables.isEmpty in GroupIntoBatches

[tgroh] Add `Filter#equal`

[tgroh] Add HadoopResourceId

[tgroh] Add Cloud Object Translators for Coders

[tgroh] Revert "Add HadoopResourceId"

[altay] Fix in dataflow metrics

[bchambers] Removing Aggregators from Examples

[bchambers] Removing Aggregators from runner-specific examples and tests

[bchambers] Removing Aggregators from PipelineResults and subclasses

[sourabhbajaj] [BEAM-2068] Update to latest version of apitools

[lcwik] [BEAM-1871] Move Bigquery/Pubsub options to

[tgroh] Make Most StandardCoders CustomCoders

[chamikara] [BEAM-1988] Add FileSystems Interface for accessing underlying FS

[altay] [BEAM-1989] Fix the syntax warning from import star

[altay] [BEAM-1749] Upgrade to pycodestyle

[dhalperi] Datastore: fix use of deprecated function

[wtanaka] Remove unused private fields

[chamikara] Updates DoFn invocation logic to be more extensible.

[lcwik] [BEAM-1871] Move ByteStringCoder to sdks/java/extensions/protobuf

[dhalperi] Update maven surefire and failsafe plugins to version 2.20

[lcwik] [BEAM-1871] Move functionality from AvroUtils to AvroSource hiding

[dhalperi] Add surefire-plugin.version property to archetype pom.xml

[tgroh] Remove AtomicCoder

[aviemzur] [BEAM-1958] Standard IO Metrics in Java SDK

[tgroh] Fix small typo in comment in CoderRegistry

[lcwik] [BEAM-1871] Remove deprecated methods from AvroCoder

[lcwik] [BEAM-2060] Add charset support in XmlSink

[lcwik] [BEAM-1871] Move RetryHttpRequestInitializer and remove deprecated

[dhalperi] LocalResourceId: make toString end in '/' for directories

[dhalperi] FileSystems: make tolerant of and more efficient for empty lists

[dhalperi] LocalFileSystem: create parent directories if needed

[lcwik] Add getElementCoders to UnionCoder

[dhalperi] Make it possible to test runners that don't support all metrics

[lcwik] Make WindowedValueCoder an Interface

[lcwik] [BEAM-1871] Drop usage of Jackson/CloudObject from CoGbkResultSchema

[kirpichov] [BEAM-1573] Use Kafka serializers instead of coders in KafkaIO

[bchambers] Remove aggregators from Dataflow runner

[klk] Replace OutputTimeFn UDF with TimestampCombiner enum

[lcwik] [BEAM-1871] Clean-up org.apache.beam.sdk.util, move BitSetCoder from

[lcwik] [BEAM-1871] Re-add BitSetCoder to sdk/util for Dataflow worker.

[altay] Do not depend on message id in DataflowRunner

[altay] [BEAM-1316] start bundle should not output any elements

[robertwb] Per-transform runner api dispatch.

[robertwb] Translate flatten to Runner API.

[robertwb] Translate WindowInto through the Runner API.

[robertwb] Factor out URN registration logic.

[kirpichov] update JavaDoc for Count.globally

[tgroh] Add Registrars for Coder Cloud Object Translators

[dhalperi] FileBasedSink: fix typos

[klk] Revert "Replace OutputTimeFn UDF with TimestampCombiner enum"

[tgroh] Revert "Make WindowedValueCoder an Interface"

[iemejia] Use parent pom version of hadoop, move common deps into the hadoop

[iemejia] Change to commons-lang dependency to commons-lang3 to be consistent with

[jbonofre] [BEAM-2091] Typo in build instructions in Apex Runner's README.md

[klk] Do not rely on metrics in streaming TestDataflowRunner

[thw] BEAM-1766 Remove Aggregators from Apex runner

[lcwik] XmlIO and XmlSource now take an optional validationEventHandler to throw

[tgroh] Mark Pipeline#getPipelineOptions Experimental

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam6 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Dataflow/ws/>
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PerformanceTests_Dataflow/ws/> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Pruning obsolete local branches
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/*:refs/remotes/origin/pr/* --prune
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1babed250404d4c1b5ce82034e1cda0bcfa71c26 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1babed250404d4c1b5ce82034e1cda0bcfa71c26
 > git rev-list a67019739dc7f09a8336b9606c3726ad5d546f51 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Dataflow] $ /bin/bash -xe /tmp/hudson8481344159617934558.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Dataflow] $ /bin/bash -xe /tmp/hudson5395040492010499161.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Dataflow] $ /bin/bash -xe /tmp/hudson1482093818285122542.sh
+ pip install --user -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied (use --upgrade to upgrade): python-gflags==3.1.1 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied (use --upgrade to upgrade): jinja2>=2.7 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied (use --upgrade to upgrade): setuptools in /usr/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied (use --upgrade to upgrade): colorlog[windows]==2.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 17))
  Installing extra requirements: 'windows'
Requirement already satisfied (use --upgrade to upgrade): blinker>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied (use --upgrade to upgrade): futures>=3.0.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied (use --upgrade to upgrade): PyYAML==3.11 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied (use --upgrade to upgrade): pint>=0.7 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied (use --upgrade to upgrade): numpy in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied (use --upgrade to upgrade): functools32 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied (use --upgrade to upgrade): contextlib2>=0.5.1 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 24))
Cleaning up...
[beam_PerformanceTests_Dataflow] $ /bin/bash -xe /tmp/hudson7652032562621552715.sh
+ python PerfKitBenchmarker/pkb.py --project=apache-beam-testing --dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn --bigquery_table=beam_performance.pkb_results --official=true --benchmarks=dpb_wordcount_benchmark --dpb_dataflow_staging_location=gs://temp-storage-for-perf-tests/staging --dpb_wordcount_input=dataflow-samples/shakespeare/kinglear.txt --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataflow
WARNING:root:File resource loader root perfkitbenchmarker/data/ycsb is not a directory.
2017-04-27 17:24:40,325 c2085155 MainThread INFO     Verbose logging to: /tmp/perfkitbenchmarker/runs/c2085155/pkb.log
2017-04-27 17:24:40,325 c2085155 MainThread INFO     PerfKitBenchmarker version: v1.11.0-45-g6f31bb3
2017-04-27 17:24:40,326 c2085155 MainThread INFO     Flag values:
--maven_binary=/home/jenkins/tools/maven/latest/bin/mvn
--project=apache-beam-testing
--bigquery_table=beam_performance.pkb_results
--dpb_wordcount_input=dataflow-samples/shakespeare/kinglear.txt
--dpb_log_level=INFO
--official
--benchmarks=dpb_wordcount_benchmark
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataflow
--dpb_dataflow_staging_location=gs://temp-storage-for-perf-tests/staging
2017-04-27 17:24:40,901 c2085155 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2017-04-27 17:24:41,043 c2085155 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2017-04-27 17:24:41,045 c2085155 MainThread dpb_wordcount_benchmark(1/1) INFO     Preparing benchmark dpb_wordcount_benchmark
2017-04-27 17:24:41,046 c2085155 MainThread dpb_wordcount_benchmark(1/1) INFO     Running benchmark dpb_wordcount_benchmark
2017-04-27 17:24:41,051 c2085155 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Dataflow/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 510, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Dataflow/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/dpb_wordcount_benchmark.py>", line 162, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Dataflow/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 115, in SubmitJob
    stdout, _, _ = vm_util.IssueCommand(cmd)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Dataflow/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 287, in IssueCommand
    full_cmd = ' '.join(cmd)
TypeError: sequence item 2: expected string, NoneType found
2017-04-27 17:24:41,053 c2085155 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2017-04-27 17:24:41,108 c2085155 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------
Name                     UID                       Status
---------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED
---------------------------------------------------------
Success rate: 0.00% (0/1)
2017-04-27 17:24:41,108 c2085155 MainThread INFO     Complete logs can be found at: /tmp/perfkitbenchmarker/runs/c2085155/pkb.log
Build step 'Execute shell' marked build as failure
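
Note on the failure: the traceback above ends inside PerfKitBenchmarker, not inside Beam. vm_util.IssueCommand joins the command list into a display string, and item 2 of that list is None, so ' '.join(cmd) raises before any process is launched. Below is a minimal sketch of this failure mode only; the concrete argument names are hypothetical and do not come from the PerfKitBenchmarker sources, they simply stand in for whatever Dataflow job argument was left unset by the benchmark config.

    # Illustration of the TypeError seen at vm_util.py line 287 (hypothetical args).
    cmd = ['mvn', 'exec:java', None]      # item 2 was never filled in, e.g. an unset flag
    try:
        full_cmd = ' '.join(cmd)          # str.join() requires every item to be a string
    except TypeError as err:
        # On Python 2.7 this prints: sequence item 2: expected string, NoneType found
        print(err)

Validating the command list before joining (for example, raising a descriptive error naming the None element) would point at the unset flag instead of surfacing this opaque TypeError.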
