Jenkins build is back to normal : beam_PostCommit_Python2 #237

2019-08-16 Thread Apache Jenkins Server
See 


-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4311

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Python36 #239

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Java_PortabilityApi #2619

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #674

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

--
[...truncated 357.84 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565951713.06_5f2ec1cd-4da6-496f-a65a-6b0cf4dab0b7
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder
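Every failure in this run surfaces through the same path: `wait_until_finish` in `portable_runner.py` raises a `RuntimeError` once the job service reports a terminal FAILED state, carrying the Java-side error (here, the `LengthPrefixCoder`-to-`KvCoder` cast failure). A minimal sketch of that pattern — a simplified, hypothetical class, not Beam's actual implementation:

```python
class PipelineResult:
    """Simplified sketch of a portable-runner job handle."""

    def __init__(self, job_id, state, last_error):
        self._job_id = job_id
        self._state = state
        self._last_error = last_error

    def wait_until_finish(self):
        # The real runner polls the job service until a terminal state;
        # here the terminal state is already known.
        if self._state == 'FAILED':
            raise RuntimeError(
                'Pipeline %s failed in state %s: %s' % (
                    self._job_id, self._state, self._last_error))
        return self._state


result = PipelineResult(
    'test_pardo_...', 'FAILED',
    'java.lang.ClassCastException: LengthPrefixCoder cannot be cast to KvCoder')
try:
    result.wait_until_finish()
except RuntimeError as e:
    print(e)
```

This is why every one of the otherwise-unrelated test cases below reports the same traceback frames: the test bodies differ, but each exits the pipeline context manager, which calls `run().wait_until_finish()` and re-raises the runner-side error.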

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565951717.68_a669c583-141b-41eb-b2f1-b6a0d83342c8 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565951718.25_47b903b3-5c97-4676-822f-662c843fcfb9
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565951719.4_ef3d16f9-ed48-433f-8f31-2e4933ad8b09 failed 
in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565951719.86_1ec9a9f8-53f4-4aed-af1e-5f1a14c17d1d 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
 

Jenkins build is back to normal : beam_PostCommit_XVR_Flink #102

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4312

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

--
[...truncated 156.25 KB...]
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: nosetests-validatesRunnerStreamingTests-df.xml
--
XML: 

--
Ran 15 tests in 1001.484s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_56_58-6837380816642301252?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_04_30-7587972956465062864?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_57_01-13694389494864063103?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_05_22-17955689101394840653?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_56_59-7318406388214944412?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_05_16-6137632294341403539?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_57_00-16822353459251412225?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_56_59-2104690058007080292?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_05_21-4498141867375268197?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_56_59-11266853876411017337?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_04_51-24584139048734983?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_56_59-13715595535280892146?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_05_14-9196910485557992927?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_03_56_58-17718921450534288134?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_04_05_25-15759197722473910767?project=apache-beam-testing.
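Each `Found:` line above is a Dataflow console URL that embeds the region and job ID. They can be pulled out with a quick regex — a convenience sketch, not part of the test harness:

```python
import re

# Matches the region and job-ID segments of a Dataflow console URL.
URL_RE = re.compile(
    r'jobsDetail/locations/(?P<region>[^/]+)/jobs/(?P<job_id>[^?]+)')

line = ('https://console.cloud.google.com/dataflow/jobsDetail/locations/'
        'us-central1/jobs/2019-08-16_03_56_58-6837380816642301252'
        '?project=apache-beam-testing.')
m = URL_RE.search(line)
print(m.group('region'), m.group('job_id'))
# us-central1 2019-08-16_03_56_58-6837380816642301252
```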

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ERROR
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_singleton_with_different_de

Build failed in Jenkins: beam_PostCommit_Python35 #239

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

--
[...truncated 84.67 KB...]
Traceback (most recent call last):
  File 
"
 line 269, in test_big_query_write_without_schema
write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
  File 
"
 line 426, in __exit__
self.run().wait_until_finish()
  File 
"
 line 406, in run
self._options).run(False)
  File 
"
 line 419, in run
return self.runner.run_pipeline(self, self._options)
  File 
"
 line 51, in run_pipeline
hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Expected data is [(b'xyw', datetime.date(2011, 1, 1), 
datetime.time(23, 59, 59, 99)), (b'abc', datetime.date(2000, 1, 1), 
datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 
31), datetime.time(23, 59, 59)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), 
datetime.time(0, 0))])
 but: Expected data is [(b'xyw', datetime.date(2011, 1, 1), 
datetime.time(23, 59, 59, 99)), (b'abc', datetime.date(2000, 1, 1), 
datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 
31), datetime.time(23, 59, 59)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), 
datetime.time(0, 0))] Actual data is []
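The assertion above is a hamcrest-style matcher mismatch: the post-pipeline verifier expected four BigQuery rows but the query returned none, so the "Expected" and "but:" descriptions differ only in the trailing "Actual data is []". A rough sketch of how such a matcher reports expected versus actual — plain Python, not Beam's actual `BigqueryFullResultMatcher`:

```python
class EqualToRows:
    """Toy matcher: passes when actual rows equal expected rows,
    ignoring order (rows are hashable tuples)."""

    def __init__(self, expected):
        self.expected = expected

    def matches(self, actual):
        return sorted(actual) == sorted(self.expected)

    def describe_mismatch(self, actual):
        # Mirrors the log format: restate the expectation, then the actual.
        return 'Expected data is %r Actual data is %r' % (self.expected, actual)


matcher = EqualToRows([('abc', 1), ('xyw', 2)])
assert matcher.matches([('xyw', 2), ('abc', 1)])   # order-insensitive pass
print(matcher.describe_mismatch([]))               # empty actual, as in the log
```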

 >> begin captured logging << 
root: INFO: Created dataset python_write_to_table_15659516078576 in project 
apache-beam-testing
root: INFO:   
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO:   
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO:   

root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO:   

root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO:   

root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO: =

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #710

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Map page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

[iemejia] Update build plugins

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

[markliu] Fix command format in Release Guide

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

[kedin] [SQL] Support complex identifiers in DataCatalog

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 3.52 MB...]
grep Foundation counts*
counts-0-of-3:Foundation: 1
Verified Foundation: 1
[SUCCESS]
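The quickstart verification above greps the sharded `counts-*` output for a known word and checks its total. An equivalent check sketched in Python (file and shard names are illustrative, not the release script's actual logic):

```python
import glob
import os
import tempfile

def verify_count(pattern, word):
    """Sum the counts recorded for `word` across sharded count files.

    Each shard holds lines like 'Foundation: 1', mirroring the
    `grep Foundation counts*` check in the log above.
    """
    prefix = word + ': '
    total = 0
    for path in sorted(glob.glob(pattern)):
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line.startswith(prefix) and line[len(prefix):].isdigit():
                    total += int(line[len(prefix):])
    return total

# Demo against a throwaway shard (hypothetical directory and file name).
d = tempfile.mkdtemp()
with open(os.path.join(d, 'counts-0-of-3'), 'w') as f:
    f.write('Foundation: 1\n')
print(verify_count(os.path.join(d, 'counts-*'), 'Foundation'))  # 1
```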

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
Aug 16, 2019 11:44:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/tmp/groovy-generated-4244791717803362928-tmpdir/word-count-beam/target/classes 
to 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/tmp/staging/classes-jbW4E3GTvru4_wgqadZ5Zw.jar
Aug 16, 2019 11:44:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 131 files cached, 1 files newly uploaded in 1 
seconds
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ReadLines/Read as step s1
Aug 16, 2019 11:44:41 AM org.apache.beam.sdk.io.FileBasedSource 
getEstimatedSizeBytes
INFO: Filepattern gs://apache-beam-samples/shakespeare/* matched 44 files with 
total size 5443510
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WordCount.CountWords/ParDo(ExtractWords) as step s2
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WordCount.CountWords/Count.PerElement/Init/Map as step s3
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey as step 
s4
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
 as step s5
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding MapElements/Map as step s6
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign as step s7
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles 
as step s8
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten as step 
s9
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten as step 
s10
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum as step s11
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections as 
step s12
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
 as step s13
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView as 
step s14
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)
 as step s15
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)
 as step s16
Aug 16, 2019 11:44:41 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
WriteCounts/WriteFiles/GatherTempFil

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #1589

2019-08-16 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b134e67bf3effd89de0def275666cf8c57acc908 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b134e67bf3effd89de0def275666cf8c57acc908
Commit message: "Merge pull request #9348: [BEAM-7936] Update portable 
WordCount Gradle task on portability page"
 > git rev-list --no-walk b134e67bf3effd89de0def275666cf8c57acc908 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Tas

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #675

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 357.81 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565957233.84_4770e658-f1c9-4523-9959-42811be1d7d7
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565957238.6_a7640298-7ac3-44cf-8fb9-c09340f223c7 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565957239.2_3af2b478-a169-4765-b911-0e382a2b22e9
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565957240.49_a9c73c1a-d6bf-48ce-b614-b90c24e2679d 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565957240.99_fa428585-a9d4-42ac-90d1-3a64a8bbd8f9 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 4

Build failed in Jenkins: beam_PreCommit_Portable_Python_Cron #1031

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

--
[...truncated 899.92 KB...]
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(08b1fda44df7a5ec80adc1627b55913d) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2) (08b1fda44df7a5ec80adc1627b55913d).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (1/2) 
(c24e0cb8445189886e87536fcaf59bd8) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-14] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
write/Write/WriteImpl/GroupByKey -> [1]write/Write/WriteImpl/Extract -> (Map -> 
ToKeyedWorkItem, Map -> ToKeyedWorkItem) (1/2) 
(b8ce130fbb1b625cfd1406d5f6b16792) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-14] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[2]write/Write/WriteImpl/DoOnce/{FlatMap(), 
Map(decode)} -> [1]write/Write/WriteImpl/InitializeWrite -> (Map -> 
ToKeyedWorkItem, Map -> ToKeyedWorkItem, Map -> ToKeyedWorkItem) (2/2) 
(1ecc2049b1c1d68a735f3f37ada8434f) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(b2cc2fde48d041eaf18a5f5e10657d91).
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(b2cc2fde48d041eaf18a5f5e10657d91) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(08b1fda44df7a5ec80adc1627b55913d) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2) (08b1fda44df7a5ec80adc1627b55913d).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(08b1fda44df7a5ec80adc1627b55913d) [CANCELED]
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (b2cc2fde48d041eaf18a5f5e10657d91).
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(452084a10e3d04588dade529f1137724).
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(452084a10e3d04588dade529f1137724) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2) (452084a10e3d04588dade529f1137724).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(b2cc2fde48d041eaf18a5f5e10657d91) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (b2cc2fde48d041eaf18a5f5e10657d91).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(b2cc2fde48d041eaf18a5f5e10657d91) [CANCELED]
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_Ap

Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT_HDFS #1995

2019-08-16 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PreCommit_Java_Cron #1679

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

--
[...truncated 437.95 KB...]
> Task :runners:samza:job-server:checkstyleTest NO-SOURCE
> Task :runners:samza:job-server:javadoc NO-SOURCE
> Task :runners:samza:job-server:spotbugsMain NO-SOURCE
> Task :runners:samza:job-server:test NO-SOURCE
> Task :runners:samza:job-server:check UP-TO-DATE
> Task :runners:samza:job-server:build
> Task :runners:samza:job-server:buildDependents
> Task :sdks:java:io:elasticsearch:test NO-SOURCE
> Task :sdks:java:io:elasticsearch:check
> Task :sdks:java:io:elasticsearch:build
> Task :sdks:java:extensions:sql:datacatalog:javadoc
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native
> Task :sdks:java:core:javadoc
> Task :sdks:java:extensions:sql:hcatalog:javadoc

> Task :sdks:java:io:hadoop-common:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:extensions:google-cloud-platform-core:test
> Task :sdks:java:extensions:google-cloud-platform-core:check
> Task :sdks:java:extensions:google-cloud-platform-core:build
> Task :sdks:java:fn-execution:test
> Task :sdks:java:fn-execution:check
> Task :sdks:java:fn-execution:build
> Task :sdks:java:extensions:sql:validateShadedJarDoesntLeakNonProjectClasses
> Task :sdks:java:extensions:sql:check
> Task :sdks:java:extensions:sql:build
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :sdks:java:core:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:io:hadoop-common:test
> Task :sdks:java:io:hadoop-common:check
> Task :sdks:java:io:hadoop-common:build
> Task :runners:flink:1.7:check
> Task :runners:flink:1.7:build
> Task :runners:flink:1.7:buildDependents
> Task :sdks:java:extensions:sql:jdbc:shadowJarTest
> Task :sdks:java:extensions:sql:jdbc:preCommit
> Task :runners:flink:1.8:test

> Task :sdks:java:core:spotbugsMain
The following classes needed for analysis were missing:
  com.google.auto.value.AutoValue$Builder
  com.google.auto.value.AutoValue

> Task :sdks:java:core:test
> Task :runners:flink:1.8:check
> Task :runners:flink:1.8:build
> Task :runners:flink:1.8:buildDependents
> Task :sdks:java:core:validateShadedJarDoesntLeakNonProjectClasses
> Task :sdks:java:core:check
> Task :sdks:java:core:build
> Task :sdks:java:core:buildNeeded
> Task :runners:direct-java:jar
> Task :runners:direct-java:packageTests
> Task :runners:direct-java:assemble
> Task :runners:direct-java:analyzeClassesDependencies SKIPPED
> Task :runners:direct-java:analyzeTestClassesDependencies SKIPPED
> Task :runners:direct-java:analyzeDependencies SKIPPED
> Task :runners:direct-java:checkstyleMain
> Task :runners:direct-java:checkstyleTest
> Task :sdks:java:io:amazon-web-services2:test
> Task :sdks:java:io:amazon-web-services2:check
> Task :sdks:java:io:amazon-web-services2:build
> Task :sdks:java:io:amazon-web-services2:buildDependents
> Task :sdks:java:extensions:kryo:test
> Task :sdks:java:extensions:kryo:check
> Task :sdks:java:extensions:kryo:build
> Task :runners:direct-java:javadoc
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :runners:direct-java:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:extensions:jackson:test
> Task :sdks:java:extensions:jackson:check
> Task :sdks:java:extensions:jackson:build
> Task :sdks:java:extensions:jackson:buildDependents
> Task :sdks:java:io:hadoop-format:test
> Task :sdks:java:extensions:euphoria:test
> Task :sdks:java:extensions:join-library:test
> Task :sdks:java:extensions:sketching:test
> Task :sdks:java:extensions:sorter:test
> Task :sdks:java:io:amazon-web-services:test
> Task :sdks:java:io:amqp:test
> Task :sdks:java:io:cassandra:test
> Task :sdks:java:extensions:join-library:check
> Task :sdks:java:extensions:join-library:build
> Task :sdks:java:io:clickhouse:test
> Task :sdks:java:extensions:sketching:check
> Task :sdks:java:extensions:sketching:build
> Task :sdks:java:extensions:sketching:buildDependents
> Task :sdks:java:io:amqp:check
> Task :sdks:java:io:amqp:build
> Task :sdks:java:io:amqp:buildDependents
> Task :sdks:java:io:hadoop-file-system:te

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4313

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 210.16 KB...]
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_05_13_31-1414030185370669?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_05_02_16-12616865958693412928?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_05_13_37-15589976852713795828?project=apache-beam-testing.
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: nosetests-validatesRunnerBatchTests-df-py35.xml
--
XML: 

--
Ran 17 tests in 1729.389s

OK

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: Some syntactic constructs of Python 3 are not yet 
fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'


Jenkins build is back to normal : beam_PostCommit_Python35 #240

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py36 #347

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

--
[...truncated 54.25 KB...]
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/6d/27/f30b90f40054948b32df04a8e6355946874d084ac73755986b28d3003578/pymongo-3.9.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 

 (from apache-beam==2.16.0.dev0) (3.9.1)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.16.0.dev0)
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.16.0.dev0)
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/6f/df/33d8d6b682750d16baa9f45db825f33e0e0feb479fac1da9758f7ac8fd4b/pyarrow-0.14.1-cp36-cp36m-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.16.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<0.40.0,>=0.39.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/c0/9a/4455b1c1450e9b912855b58ca6eee7a27ff1e9b52e4d98c243d93256f469/google_cloud_pubsub-0.39.1-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d7/72/e88edd9a0b3c16a7b2c4107b1a9d3ff182b84a29f051ae15293e1375d7fe/google_cloud_bigquery-1.17.0-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<0.33.0,>=0.31.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/08/77/b468e209dbb0a6f614e6781f06a4894299a4c6167c2c525cc086caa7c075/google_cloud_bigtable-0.32.2-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.16.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/19/b9/bda9781f0a74b90ebd2e046fde1196182900bd4a8e1ea503d3ffebc50e7c/numpy-1.17.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e1/d8/feeb346d41f181e83fba45224ab14a8d8af019b48af742e047f3845d8cff/pandas-0.23.4-cp36-cp36m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.16.0.dev0)
  Downloading 
https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in 

 (from grpcio<2

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #65

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Map page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

[iemejia] Update build plugins

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

[markliu] Fix command format in Release Guide

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

[kedin] [SQL] Support complex identifiers in DataCatalog

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 92.66 KB...]
Collecting configparser>=3.5; python_version < "3" (from 
importlib-metadata>=0.12->pluggy<1,>=0.3.0->tox==3.11.1)
  Using cached 
https://files.pythonhosted.org/packages/ab/1a/ec151e5e703ac80041eaccef923611bbcec2b667c20383655a06962732e9/configparser-3.8.1-py2.py3-none-any.whl
Collecting scandir; python_version < "3.5" (from pathlib2; python_version == 
"3.4.*" or python_version < 
"3"->importlib-metadata>=0.12->pluggy<1,>=0.3.0->tox==3.11.1)
Installing collected packages: six, contextlib2, zipp, scandir, pathlib2, 
configparser, importlib-metadata, pluggy, toml, virtualenv, py, filelock, tox, 
futures, enum34, grpcio, protobuf, grpcio-tools
Successfully installed configparser-3.8.1 contextlib2-0.5.5 enum34-1.1.6 
filelock-3.0.12 futures-3.3.0 grpcio-1.23.0 grpcio-tools-1.3.5 
importlib-metadata-0.19 pathlib2-2.3.4 pluggy-0.12.0 protobuf-3.9.1 py-1.8.0 
scandir-1.10.0 six-1.12.0 toml-0.10.0 tox-3.11.1 virtualenv-16.7.3 zipp-0.5.2

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining 
file://
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.16.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.16.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/15/e3/5956c75f68906b119191ef30d9acff661b422cf918a29a03ee0c3ba774be/fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: grpcio<2,>=1.8 in 

 (from apache-beam==2.16.0.dev0) (1.23.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.16.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.16.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/00/5c/5379d5b8167a5938918d9ee147f865f6f8a64b93947d402cfdca5c1416d2/pymongo-3.9.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 

 (from apache-beam==2.16.0.dev0) (3.9.1)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.16.0.dev0)
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in 

 (from apache-beam==2.16.0.dev0) (3.3.0)
Collecting pyvcf<0.7.0,>=0.6.8 (from apache-beam==2.16.0.dev0)
Collecting typing<3.7.0,>=3.6.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https

Jenkins build is back to normal : beam_PostCommit_Java11_ValidatesRunner_Direct #1590

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Go #4459

2019-08-16 Thread Apache Jenkins Server
tL2FwYWNoZS9iZWFtL3Nka3MvZ28vcGtnL2JlYW0vY29yZS9ydW50aW1lL2NvZGVyeC5kZWNWYXJJbnRaEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
}
  ]
},
{
  "@type": "kind:stream",
  "component_encodings": [
{
  "@type": "kind:length_prefix",
  "component_encodings": [
{
  "@type": 
"Cgd2YXJpbnR6EgIIAhqFAQpxZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL2dvL3Rlc3QvdmVuZG9yL2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy9nby9wa2cvYmVhbS9jb3JlL3J1bnRpbWUvY29kZXJ4LmVuY1ZhckludFoSEAgWIgQIGUAPKgYIFBICCAgikQEKcWdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy9nby90ZXN0L3ZlbmRvci9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvZ28vcGtnL2JlYW0vY29yZS9ydW50aW1lL2NvZGVyeC5kZWNWYXJJbnRaEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
}
  ]
}
  ],
  "is_stream_like": true
}
  ],
  "is_pair_like": true
},
{
  "@type": "kind:global_window"
}
  ],
  "is_wrapper": true
}
  }
],
"parallel_input": {
  "@type": "OutputReference",
  "step_name": "e8",
  "output_name": "i0"
},
"serialized_fn": 
"%0A%27%22%25%0A%02c1%12%1F%0A%1D%0A%1Bbeam:coder:global_window:v1j9%0A%25%0A%23%0A%21beam:windowfn:global_windows:v0.1%10%01%1A%02c1%22%02:%00%28%010%018%02H%01"
  }
},
{
  "kind": "ParallelDo",
  "name": "e10",
  "properties": {
"user_name": "passert.Sum(flat)/passert.sumFn",
"output_info": [
  {
"user_name": "bogus",
"output_name": "bogus",
"encoding": {
  "@type": "kind:windowed_value",
  "component_encodings": [
{
  "@type": "kind:bytes"
},
{
  "@type": "kind:global_window"
}
  ],
  "is_wrapper": true
}
  }
],
"parallel_input": {
  "@type": "OutputReference",
  "step_name": "e9",
  "output_name": "i0"
},
"serialized_fn": "e10"
  }
}
  ],
  "type": "JOB_TYPE_BATCH"
}
2019/08/16 14:35:34 Test flatten:flatten failed: googleapi: Error 429: Quota 
exceeded for quota metric 'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'., rateLimitExceeded
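
The 429 above is Dataflow's per-minute create-request quota being exceeded; a common way for a test harness to ride out such transient quota errors is exponential backoff with jitter. A minimal generic sketch — `RateLimitError` and all parameter names here are illustrative, not part of the Beam or Dataflow client APIs:

```python
import random
import time


class RateLimitError(Exception):
    """Stand-in for a 429 'rateLimitExceeded' error (illustrative only)."""


def call_with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Retry fn() on RateLimitError with exponential backoff plus jitter.

    Sleeps base_delay * 2**attempt (plus random jitter) between attempts,
    and re-raises the error once max_attempts is exhausted.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # out of retries; propagate the quota error
            # Exponential delay with jitter to avoid synchronized retries.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

With this in place, the submission above would retry a handful of times instead of failing the test run outright on the first quota rejection.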
2019/08/16 14:36:04 Job still running ...
2019/08/16 14:36:04 Job still running ...
2019/08/16 14:36:34 Job still running ...
2019/08/16 14:36:34 Job still running ...
2019/08/16 14:37:04 Job still running ...
2019/08/16 14:37:04 Job still running ...
2019/08/16 14:37:34 Job still running ...
2019/08/16 14:37:34 Job still running ...
2019/08/16 14:38:04 Job still running ...
2019/08/16 14:38:04 Job still running ...
2019/08/16 14:38:34 Job still running ...
2019/08/16 14:38:34 Job still running ...
2019/08/16 14:39:04 Job still running ...
2019/08/16 14:39:04 Job still running ...
2019/08/16 14:39:34 Job still running ...
2019/08/16 14:39:35 Job still running ...
2019/08/16 14:40:04 Job still running ...
2019/08/16 14:40:05 Job still running ...
2019/08/16 14:40:34 Job succeeded!
2019/08/16 14:40:34 Test wordcount:kinglear completed
2019/08/16 14:40:35 Job succeeded!
2019/08/16 14:40:35 Test cogbk:cogbk completed
2019/08/16 14:40:35 Result: 5 tests failed

if [[ ! -z "$JOB_PORT" ]]; then
  # Shut down the job server
  kill %1 || echo "Failed to shut down job server"
fi

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG || echo "Failed to remove container"
Untagged: us.gcr.io/apache-beam-testing/jenkins/go:20190816-143439
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/go@sha256:05828b273693f8aa6f28390496904cd30653208e41db3900220c77f8dce2dd4e
Deleted: sha256:490292f0b614d2afdaa5f8e5ce167eec01fd22d94ba7e05e1fda42f108fb0315
Deleted: sha256:34609b5c97337b4b6e34ce7c28479a2250b0d39e45707d4134fd406406b80a62
Deleted: sha256:6167b3bd4ac7a2a28d75f5f35c78c8499eb050d27ba8b53db2acee5e3da72b47
Deleted: sh

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #676

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

--
[...truncated 357.75 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565966508.33_37b17df7-39c7-4a90-bc5a-85f7b52caca3
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder
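
The failing assertion checks that a multimap side input groups values per key. Independent of Beam's coders (where the ClassCastException actually occurs), the grouping the test expects can be illustrated in plain Python — `as_multimap` is a hypothetical helper, not Beam API:

```python
from collections import defaultdict


def as_multimap(pairs):
    """Group (key, value) pairs into key -> list of values,
    the shape a multimap side input presents to a DoFn."""
    out = defaultdict(list)
    for k, v in pairs:
        out[k].append(v)
    return dict(out)


# Mirrors the test's expectation: equal_to([('a', [1, 3]), ('b', [2])])
as_multimap([('a', 1), ('b', 2), ('a', 3)])  # -> {'a': [1, 3], 'b': [2]}
```

The Spark runner failure happens before this view is ever materialized: the runner hands the side-input coder to the harness as a `LengthPrefixCoder` where a `KvCoder` is expected.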

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565966512.9_412f091d-a4a8-4ff6-92e2-07cd6aa42fb2 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565966513.53_937b26b0-50e7-447f-b28a-e53ccba8a53e
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565966514.76_f3e0504c-9cb7-49af-a54f-dcca96bff3d5 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565966515.26_b1718f69-c6e0-4a82-8330-64691224af23 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark #4936

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

--
[...truncated 2.10 KB...]
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g :runners:spark:validatesRunner
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :examples:java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:spark:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:io:hadoop-common:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:io:hadoop-format:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:jdbc:processResources NO-SOURCE
> Task :sdks:java:io:common:compileJava NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processTestResources NO-SOURCE
> Task :sdks:java:io:hadoop-format:processTestResources
> Task :model:job-management:processResources
> Task :sdks:java:io:common:processResources NO-SOURCE
> Task :examples:java:processTestResources
> Task :sdks:java:io:common:classes UP-TO-DATE
> Task :runners:spark:processTestResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:io:common:jar
> Task :sdks:java:io:common:processTestResources NO-SOURCE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:c

Jenkins build is back to normal : beam_PostCommit_Go #4460

2019-08-16 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4314

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_LoadTests_Python_Combine_Flink_Batch #15

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Map page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

[iemejia] Update build plugins

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

[markliu] Fix command format in Release Guide

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

[kedin] [SQL] Support complex identifiers in DataCatalog

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 132.82 KB...]

OK

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 38s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/5jvcxggxr7b2s

[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins3421073839722720325.sh
+ echo Changing number of workers to 6
Changing number of workers to 6
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins5646161324430817930.sh
+ gcloud dataproc clusters update beam-loadtests-python-combine-flink-batch-15 
--num-workers=6 --quiet
Waiting on operation 
[projects/apache-beam-testing/regions/global/operations/161c5b01-8195-30f4-988b-361f417813fa].
Waiting for cluster update operation...
.WARNING: Cluster has active YARN containers. If any container is actively 
writing to HDFS then the downsize operation may block until all writers are 
stopped.
..done.
Updated 
[https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-combine-flink-batch-15].
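The step above resizes the load-test Flink cluster with `gcloud dataproc clusters update ... --num-workers=6 --quiet`. As a small illustration only (the `dataproc_resize_cmd` helper is hypothetical, not part of the Beam or gcloud tooling), the invocation the Jenkins shell step runs can be assembled like this:

```python
import shlex

def dataproc_resize_cmd(cluster, num_workers):
    # Build (without executing) the same resize invocation the Jenkins
    # step issues via the gcloud CLI. Helper name and structure are
    # illustrative assumptions, not Beam build code.
    return shlex.split(
        "gcloud dataproc clusters update {} --num-workers={} --quiet"
        .format(cluster, num_workers))

print(dataproc_resize_cmd(
    "beam-loadtests-python-combine-flink-batch-15", 6))
```

Keeping the command as an argument list (rather than a shell string) avoids quoting issues if it is later passed to `subprocess.run`.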
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins334733113629933985.sh
+ echo src Combine Python Load test: 2GB 10 byte records src
src Combine Python Load test: 2GB 10 byte records src
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
-PloadTest.mainClass=apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally
 -Prunner=PortableRunner 
'-PloadTest.args=--job_name=load-tests-python-flink-batch-combine-1-0816150147 
--project=apache-beam-testing --publish_to_big_query=true 
--metrics_dataset=load_test --metrics_table=python_flink_batch_combine_1 
--input_options='{"num_records": 2,"key_size": 1,"value_size": 9}' 
--parallelism=5 --job_endpoint=localhost:8099 
--environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest 
--environment_type=DOCKER --top_count=20 --runner=PortableRunner' 
:sdks:python:apache_beam:testing:load_tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining 
file://

Build failed in Jenkins: beam_PostCommit_Java_PVR_Spark_Batch #586

2019-08-16 Thread Apache Jenkins Server
See 


--
Started by GitHub push by lukecwik
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-5 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 >  
 > # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 544604c32eb1d8873ef3f40860df7dec518cce71 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 544604c32eb1d8873ef3f40860df7dec518cce71
Commit message: "Merge pull request #9329 from 
amaliujia/add_retraction_in_java_sdk"
 > git rev-list --no-walk 544604c32eb1d8873ef3f40860df7dec518cce71 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:spark:job-server:validatesPortableRunnerBatch
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:spark:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:reference:java:processResources NO-SOURCE
> Task :runners:spark:processTestResources
> Task :runners:reference:java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #677

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 357.55 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565977121.18_ba24e427-199b-49b4-a98c-378f41e67ce3
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder
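The failing tests above all reach Beam's `assert_that(..., equal_to(...))` check before the Java-side coder cast aborts the pipeline. As a rough sketch of what that matcher verifies (a minimal order-insensitive stand-in, not Beam's actual `apache_beam.testing.util` implementation), `equal_to` compares the produced elements against the expected ones as multisets:

```python
def equal_to(expected):
    # Minimal stand-in for apache_beam.testing.util.equal_to:
    # returns a matcher that checks multiset equality, i.e. the same
    # elements in any order. Simplified assumption for illustration.
    def matcher(actual):
        if sorted(expected) != sorted(actual):
            raise AssertionError(
                "{!r} != {!r}".format(actual, expected))
    return matcher

# The expectation from test_multimap_side_input_type_coercion above,
# satisfied regardless of element order:
equal_to([('a', [1, 3]), ('b', [2])])([('b', [2]), ('a', [1, 3])])
```

In the logs the matcher is never actually evaluated: the `ClassCastException` (`LengthPrefixCoder` cast to `KvCoder`) fails the pipeline first, so `wait_until_finish` raises `RuntimeError`.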

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565977125.51_888ac259-60cb-4963-a491-fcb4c248a6b2 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565977126.06_adaf1f4e-d159-4b11-9a65-b73aa88fad7d
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565977127.18_0408f1d7-6015-41cd-b323-83f22a387ece 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565977127.65_813c8240-286e-47c8-9882-acb76b7d8605 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exi

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark #4937

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #678

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 357.74 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565979230.29_31c6b2f4-3baa-458c-a4b3-02c866e5c681
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565979235.22_8d143b38-8abc-471f-8d44-be3ecd7f3711 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565979235.83_98297a72-1c7b-47a2-b923-c416b58d9d55
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565979237.05_f8067199-8b84-4c7a-83b9-70ac76ee9305 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565979237.56_330e01e1-7fc3-44bd-b643-8989672935a8 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4315

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 123.75 KB...]
Ran 17 tests in 1622.310s

OK

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
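The `--attr=ValidatesRunner,!sickbay-streaming` option above is nose's attribute selector: run tests tagged `ValidatesRunner` but skip those also tagged `sickbay-streaming`. The selection rule is roughly the following (a simplified model of nose's attrib plugin, not its actual implementation; `selected` is an illustrative name):

```python
def selected(test_attrs, expr="ValidatesRunner,!sickbay-streaming"):
    # Comma-separated terms are ANDed; a leading "!" negates a term.
    # Simplified sketch of nose's --attr semantics for this expression.
    for term in expr.split(","):
        if term.startswith("!"):
            if term[1:] in test_attrs:
                return False        # excluded attribute present
        elif term not in test_attrs:
            return False            # required attribute missing
    return True

print(selected({"ValidatesRunner"}))                       # True
print(selected({"ValidatesRunner", "sickbay-streaming"}))  # False
```

This is why streaming-sickbayed ValidatesRunner tests stay out of the `validatesRunnerStreamingTests` run even though they carry the `ValidatesRunner` tag.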
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: Some syntactic constructs of Python 3 are not yet 
fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
:84:
 UserWarning: Some syntactic constructs of Python 3 are not yet fully supported 
by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
:59:
 UserWarning: Datastore IO will support Python 3 after replacing 
googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
:47:
 UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: 
BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_35_34-15962242296626550481?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_45_03-18431980308205099387?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_35_35-13020192841856469783?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_45_38-3287442830730687452?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_35_35-4678396630167008726?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_45_40-16298389945225565392?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_35_35-1220555015378607213?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_44_59-5575447261022215033?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_35_35-15509611312137293305?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_10_45_35-545300940799

Jenkins build is back to normal : beam_PreCommit_Portable_Python_Cron #1032

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py27 #379

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 66.76 KB...]
Task ':sdks:python:test-suites:dataflow:py2:integrationTest' is not up-to-date 
because:
  Task has not declared any outputs despite executing actions.
Custom actions are attached to task 
':sdks:python:test-suites:dataflow:py2:integrationTest'.
Starting process 'command 'sh''. Working directory: 

 Command: sh -c . 

 && 

 --test_opts 
"--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it 
--attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing 
--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
--input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.* 
--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
--expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 
--autoscaling_algorithm=NONE --runner=TestDataflowRunner 
--sdk_location=test-suites/dataflow/py2/build/apache-beam.tar.gz" --suite 
integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: 
>>> --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.*
>>>  --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 
>>> --autoscaling_algorithm=NONE --runner=TestDataflowRunner 
>>> --sdk_location=test-suites/dataflow/py2/build/apache-beam.tar.gz
>>>   test options: 
>>> --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
>>>  --attr=IT --nocapture
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

> Task :sdks:python:test-suites:dataflow:py2:integrationTest FAILED
:sdks:python:test-suites:dataflow:py2:integrationTest (Thread[Execution worker 
for ':',5,main]) completed. Took 4.188 secs.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/ywkecssc5r5cw


STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 
2020. Please upgrade your Python as Python 2.7 won't be maintained after that 
date. A future version of pip will drop support for Python 2.7. More details 
about Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not 
used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Gen

Build failed in Jenkins: beam_PreCommit_Java_Cron #1680

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 512.21 KB...]
> Task :runners:extensions-java:metrics:check
> Task :runners:extensions-java:metrics:build
> Task :runners:extensions-java:metrics:buildDependents
> Task :sdks:java:extensions:protobuf:assemble
> Task :sdks:java:extensions:protobuf:analyzeClassesDependencies SKIPPED

> Task :sdks:java:core:javadoc
:283:
 warning - Tag @link: can't find continuously(Duration, TerminationCondition) 
in org.apache.beam.sdk.io.FileIO.Match
:936:
 warning - Tag @link: can't find watchForNewFiles in org.apache.beam.sdk.io.Read
:930:
 warning - Tag @link: can't find withEmptyMatchTreatment in 
org.apache.beam.sdk.io.Read
:948:
 warning - Tag @link: can't find withHintMatchesManyFiles() in 
org.apache.beam.sdk.io.Read
:1118:
 warning - Tag @link: can't find watchForNewFiles in org.apache.beam.sdk.io.Read
:1112:
 warning - Tag @link: can't find withEmptyMatchTreatment in 
org.apache.beam.sdk.io.Read

> Task :sdks:java:extensions:protobuf:extractIncludeTestProto
> Task :sdks:java:extensions:protobuf:generateTestProto

> Task :sdks:java:core:javadoc
:816:
 warning - Tag @link: can't find watchForNewFiles in org.apache.beam.sdk.io.Read
:810:
 warning - Tag @link: can't find withEmptyMatchTreatment in 
org.apache.beam.sdk.io.Read
:1280:
 warning - Tag @link: can't find to(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.TypedWrite
:1280:
 warning - Tag @link: can't find to(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.TypedWrite
:1257:
 warning - Tag @link: can't find to(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.TypedWrite
:1578:
 warning - Tag @link: can't find to(FilenamePolicy) in 
org.apache.beam.sdk.io.AvroIO.TypedWrite
:137:
 warning - Tag @link: can't find constant(FilenamePolicy, SerializableFunction) 
in org.apache.beam.sdk.io.DynamicFileDestinations
:204:
 warning - Tag @link: can't find FileBasedSource(Metadata, long, long, long) in 
org.apache.beam.sdk.io.FileBasedSource
:304:
 warning - Tag @link: can't find withNaming(Write.FileNaming) in 
org.apache.beam.sdk.io.FileIO.Write
:304:
 warning - Tag @link: can't find via(Contextful, Sink) in 
org.apache.beam.sdk.io.FileIO.Write
:324:
 warning - Tag @link: can't find continuously(Duration,
 TerminationCondition) in org.apache.beam.sdk.io.FileIO.MatchAll
:182:
 warning - Tag @link: can't find continuously(Duration, TerminationCondition) 
in org.apache.

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py36 #348

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 66.21 KB...]
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

> Task :sdks:python:test-suites:dataflow:py36:integrationTest FAILED
:sdks:python:test-suites:dataflow:py36:integrationTest (Thread[Execution worker 
for ':',5,main]) completed. Took 5.299 secs.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/6azuk7z7h7rnk


STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 
2020. Please upgrade your Python as Python 2.7 won't be maintained after that 
date. A future version of pip will drop support for Python 2.7. More details 
about Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not 
used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables fro

Build failed in Jenkins: beam_PostCommit_Python36 #244

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 42.05 KB...]
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: ws_comma
RefactoringTool: No changes to 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: No changes to 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: No changes to 

RefactoringTool: Refactored 

RefactoringTool: Refactored 

RefactoringTool: No changes to 

RefactoringTool: Refactored 

RefactoringTool: Files that were modified:
RefactoringTool: 


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py37 #349

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 66.05 KB...]
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

> Task :sdks:python:test-suites:dataflow:py37:integrationTest FAILED
:sdks:python:test-suites:dataflow:py37:integrationTest (Thread[Execution worker 
for ':',5,main]) completed. Took 4.338 secs.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/45addz57npmdi


STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 
2020. Please upgrade your Python as Python 2.7 won't be maintained after that 
date. A future version of pip will drop support for Python 2.7. More details 
about Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not 
used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables fro

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py35 #371

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
[...truncated 66.37 KB...]
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

> Task :sdks:python:test-suites:dataflow:py35:integrationTest FAILED
:sdks:python:test-suites:dataflow:py35:integrationTest (Thread[Execution worker 
for ':',5,main]) completed. Took 5.388 secs.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/weytqzzsyjjq4


STDERR: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 
2020. Please upgrade your Python as Python 2.7 won't be maintained after that 
date. A future version of pip will drop support for Python 2.7. More details 
about Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not 
used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
root: Generating grammar tables from 
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: idioms

Jenkins build is back to normal : beam_PostCommit_Java_PVR_Spark_Batch #587

2019-08-16 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4316

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #679

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[thw] Move Design Documents index to cwiki

--
[...truncated 357.83 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565984534.89_6229ffd6-1ebf-49da-ae3a-3f723db1a454
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder
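[Editor's note] The SparkRunnerTest failures quoted above all assert pipeline output with Beam's `assert_that`/`equal_to` matchers. As background, `equal_to` compares results order-insensitively; the following is a minimal stand-alone sketch of that behavior in plain Python, an illustration only and not Beam's actual `apache_beam.testing.util.equal_to` implementation:

```python
from collections import Counter

def equal_to(expected):
    """Order-insensitive matcher: a simplified, hypothetical stand-in
    for apache_beam.testing.util.equal_to (illustration only)."""
    def _matcher(actual):
        # Compare as multisets; stringify elements so unhashable
        # values such as lists can still be counted.
        assert Counter(map(str, actual)) == Counter(map(str, expected)), \
            "%s != %s" % (sorted(map(str, actual)),
                          sorted(map(str, expected)))
    return _matcher

# Matches regardless of element order, like the assertions in the log.
equal_to([('a', [1, 3]), ('b', [2])])([('b', [2]), ('a', [1, 3])])
```

In the real tests, such a matcher is passed to `assert_that`, which fails the pipeline if the produced PCollection does not match.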

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565984539.42_80da7c3a-bf83-4981-9fb9-eac1e2e1cdaa failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565984540.02_6a3c0e26-f8c1-4202-8f5f-e5829b366256
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565984541.2_27cb6a1d-e44c-4fc9-8821-1898e6d2531a failed 
in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565984541.71_c432ad5f-4966-4c0e-8e15-d18c9b243ba7 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()

Build failed in Jenkins: beam_PostCommit_Python2 #242

2019-08-16 Thread Apache Jenkins Server
See 

--
[...truncated 735.72 KB...]
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (1/2) (3184126c625fe4532ce2c6d8df5d9c5c) 
switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream 
leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group, 
Unkey, Match}) (1/2) (3184126c625fe4532ce2c6d8df5d9c5c) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) 
(3184126c625fe4532ce2c6d8df5d9c5c) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) 
(3184126c625fe4532ce2c6d8df5d9c5c) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (1/2) (3184126c625fe4532ce2c6d8df5d9c5c) 
switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) 
(3184126c625fe4532ce2c6d8df5d9c5c) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (2/2) (4e1415ad32b1fbaa422e8c02641da53a) 
switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(4e1415ad32b1fbaa422e8c02641da53a).
[jobmanager-future-thread-1] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (2/2) (523f38e8ffb927c9388ca06a50a5decd) switched from 
CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (2/2) (4e1415ad32b1fbaa422e8c02641da53a) 
[FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) 
4e1415ad32b1fbaa422e8c02641da53a.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (2/2) (523f38e8ffb927c9388ca06a50a5decd) switched from 
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink 
(DiscardingOutput) (2/2) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink 
(DiscardingOutput) (2/2).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(4e1415ad32b1fbaa422e8c02641da53a) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (2/2) 
(523f38e8ffb927c9388ca06a50a5decd) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task DataSink (DiscardingOutput) (2/2) 
(523f38e8ffb927c9388ca06a50a5decd) [DEPLOYING]
[DataSink (DiscardingOutput) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink 
(DiscardingOutput) (2/2) (523f38e8ffb927c9388ca06a50a5decd) [DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Registering task at network: 
DataSink (DiscardingOutput) (2/2) (523f38e8ffb927c9388ca06a50a5decd) 
[DEPLOYING].
[DataSink (DiscardingOutput) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task -
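[Editor's note] The Flink log above walks each task through CREATED, SCHEDULED, DEPLOYING, RUNNING, and FINISHED. A toy model of those transitions can make the lifecycle easier to follow; this is illustrative only, not Flink's actual ExecutionState implementation:

```python
# Toy model of the task-state transitions visible in the log above
# (CREATED -> SCHEDULED -> DEPLOYING -> RUNNING -> FINISHED).
# Illustrative only; not Flink's actual ExecutionState class.
LEGAL_TRANSITIONS = {
    "CREATED": {"SCHEDULED", "DEPLOYING"},
    "SCHEDULED": {"DEPLOYING"},
    "DEPLOYING": {"RUNNING"},
    "RUNNING": {"FINISHED", "FAILED"},
}

def advance(state, target):
    # Reject transitions the log never shows, e.g. CREATED -> RUNNING.
    if target not in LEGAL_TRANSITIONS.get(state, set()):
        raise ValueError("illegal transition %s -> %s" % (state, target))
    return target

state = "CREATED"
for nxt in ("DEPLOYING", "RUNNING", "FINISHED"):
    state = advance(state, nxt)  # mirrors the lifecycle in the log
```

Each "switched from X to Y" line in the log corresponds to one such transition.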

Jenkins build is back to normal : beam_PostCommit_Python36 #245

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #680

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-7802] Make SQL example slightly simpler

[iemejia] [BEAM-7802] Inline AvroUtils methods to have only one public AvroUtils

[iemejia] [BEAM-7802] Fix minor issues (access modifiers + static) in

[iemejia] [BEAM-7802] Make Schema.toString method multi OS friendly

[iemejia] [BEAM-7802] Add AvroUtils.schemaCoder method to infer a Beam schema 
from

--
[...truncated 357.78 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565988049.6_33720194-62e3-4863-8407-3b38238afcf9
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565988054.28_dfd82067-9aa1-4502-b8c8-8ce50b5e4d69 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565988054.86_7d735f7a-60e5-47df-b35c-00ebbc6c5064
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565988056.1_e5345359-9fad-47fe-ab83-6d9b0cba9aea failed 
in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565988056.59_8373efb5-4ed5-4961-b424-1ef997d41c1c 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
-

Build failed in Jenkins: beam_PostCommit_Python35 #244

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[thw] Move Design Documents index to cwiki

--
[...truncated 90.85 KB...]
 tableId: 'python_write_table'> with schema {'fields': [{'name': 'number', 
'type': 'INTEGER'}, {'name': 'str', 'type': 'STRING'}]}.
root: DEBUG: Created the table with id python_write_table
root: INFO: Created table 
apache-beam-testing.python_write_to_table_15659848185097.python_write_table 
with schema , ]>. Result: , ]>
 selfLink: 
'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_write_to_table_15659848185097/tables/python_write_table'
 tableReference: 
 type: 'TABLE'>.
root: DEBUG: finish 
root: DEBUG: finish 
root: DEBUG: finish 
root: DEBUG: finish 
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to 
apache-beam-testing:python_write_to_table_15659848185097.python_write_table. 
Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_1 to finish.
root: INFO: Attempting to perform query SELECT number, str FROM 
python_write_to_table_15659848185097.python_write_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/2b795a73-ebe2-43f3-ae89-2aba3c898f0d?location=US&timeoutMs=1&maxResults=0
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/jobs/2b795a73-ebe2-43f3-ae89-2aba3c898f0d?location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon78c6e15a4b17c201d491d49f03b677642488820a/data
 HTTP/1.1" 200 None
root: INFO: Result of query is: []
root: INFO: Deleting dataset python_write_to_table_15659848185097 in project 
apache-beam-testing
- >> end captured logging << -
[...]:1142:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
[...]:1142:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
[...]:642:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
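The BeamDeprecationWarning above comes from a property that still works but warns on every access. A minimal, hypothetical sketch of that pattern (illustration only, not Beam's actual implementation; the class and message are modeled on the warning text in the log):

```python
import warnings

class BeamDeprecationWarning(DeprecationWarning):
    """Stand-in for apache_beam's deprecation warning class."""

class Pipeline:
    def __init__(self, options):
        self._options = options

    @property
    def options(self):
        # Accessing .options still returns the options object, but emits a
        # warning matching the "options is deprecated" text in the log above.
        warnings.warn(
            'options is deprecated since First stable release. '
            'References to <pipeline>.options will not be supported',
            BeamDeprecationWarning)
        return self._options

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    opts = Pipeline({'project': 'example'}).options

print(type(caught[0].message).__name__)  # BeamDeprecationWarning
```

This is why the tests still pass while printing the warning: the deprecated accessor delegates to the real value instead of failing.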

--
XML: nosetests-postCommitIT-direct-py35.xml
--
XML: 

--
Ran 15 tests in 24.264s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py35:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=a

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #1596

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

--
[...truncated 611 B...]
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ee8a3332183839789298e8228e95aa1199fd1b0d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ee8a3332183839789298e8228e95aa1199fd1b0d
Commit message: "Merge pull request #9357: [BEAM-7989] Remove side inputs from 
CacheVisitor calculation"
 > git rev-list --no-walk c825d719659b5f6e2ee0cb2c444ee8ff266af882 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shado

Jenkins build is back to normal : beam_PostCommit_Python2 #243

2019-08-16 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #681

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

--
[...truncated 357.57 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565989091.2_0f58fb32-c2e3-41c4-a0ef-c65d5401f495
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565989095.75_f6c7cac8-851c-4463-954d-46e8be3d16e8 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565989096.33_27b98333-80bc-4be3-a054-a50e3a5163a1
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565989097.52_f12120ed-4e0a-432f-b6fa-067ef9b29fa1 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565989098.01_33b954a2-f85a-4979-957d-f76fd95bd0e3 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit_
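Every traceback above enters the runner through `Pipeline.__exit__`, because the tests build their pipeline inside a `with` block: on exit, the context manager runs the pipeline and blocks on `wait_until_finish()`, so a FAILED terminal state surfaces as a `RuntimeError` at the end of the `with` block. A simplified sketch of that context-manager pattern (not Beam's actual code):

```python
class PipelineResult:
    def __init__(self, state):
        self.state = state

    def wait_until_finish(self):
        # Blocks until a terminal state, raising on failure -- mirroring the
        # RuntimeError raised in portable_runner.py in the tracebacks above.
        if self.state == 'FAILED':
            raise RuntimeError('Pipeline failed in state FAILED')
        return self.state

class Pipeline:
    def __init__(self, final_state='DONE'):
        self._final_state = final_state

    def run(self):
        return PipelineResult(self._final_state)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # This is why the tracebacks point at __exit__: leaving the
        # `with` block triggers run-and-wait.
        self.run().wait_until_finish()

try:
    with Pipeline(final_state='FAILED'):
        pass  # transforms and assert_that would be applied here
except RuntimeError as e:
    print(e)  # the failure only surfaces when the with block closes
```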

Build failed in Jenkins: beam_PostCommit_Go_VR_Flink #734

2019-08-16 Thread Apache Jenkins Server
See 
<https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/734/display/redirect?page=changes>

Changes:

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

--
[...truncated 378.10 KB...]
key: "n1"
value: <
  unique_name: "n1"
  coder_id: "c0"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n2"
value: <
  unique_name: "n2"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n3"
value: <
  unique_name: "n3"
  coder_id: "c0"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n4"
value: <
  unique_name: "n4"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n5"
value: <
  unique_name: "n5"
  coder_id: "c0"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n6"
value: <
  unique_name: "n6"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n7"
value: <
  unique_name: "n7"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n8"
value: <
  unique_name: "n8"
  coder_id: "c4"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n9"
value: <
  unique_name: "n9"
  coder_id: "c6"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  windowing_strategies: <
key: "w0"
value: <
  window_fn: <
spec: <
  urn: "beam:windowfn:global_windows:v0.1"
>
  >
  merge_status: NON_MERGING
  window_coder_id: "c1"
  trigger: <
default: <
>
  >
  accumulation_mode: DISCARDING
  output_time: END_OF_WINDOW
  closing_behavior: EMIT_IF_NONEMPTY
  OnTimeBehavior: FIRE_ALWAYS
>
  >
  coders: <
key: "c0"
value: <
  spec: <
urn: "beam:coder:bytes:v1"
  >
>
  >
  coders: <
key: "c1"
value: <
  spec: <
urn: "beam:coder:global_window:v1"
  >
>
  >
  coders: <
key: "c2"
value: <
  spec: <
urn: "beam:go:coder:custom:v1"
payload: 
"Cgd2YXJpbnR6EgIIAhqFAQpxZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL2dvL3Rlc3QvdmVuZG9yL2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy9nby9wa2cvYmVhbS9jb3JlL3J1bnRpbWUvY29kZXJ4LmVuY1ZhckludFoSEAgWIgQIGUAPKgYIFBICCAgikQEKcWdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy9nby90ZXN0L3ZlbmRvci9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvZ28vcGtnL2JlYW0vY29yZS9ydW50aW1lL2NvZGVyeC5kZWNWYXJJbnRaEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
  >
>
  >
  coders: <
key: "c3"
value: <
  spec: <
urn: "beam:coder:length_prefix:v1"
  >
  component_coder_ids: "c2"
>
  >
  coders: <
key: "c4"
value: <
  spec: <
urn: "beam:coder:kv:v1"
  >
  component_coder_ids: "c3"
  component_coder_ids: "c3"
>
  >
  coders: <
key: "c5"
value: <
  spec: <
urn: "beam:coder:iterable:v1"
  >
  component_coder_ids: "c3"
>
  >
  coders: <
key: "c6"
value: <
  spec: <
urn: "beam:coder:kv:v1"
  >
  component_coder_ids: "c3"
  component_coder_ids: "c5"
>
  >
  environments: <
key: "go"
value: <
  urn: "beam:env:docker:v1"
  payload: "\n8us.gcr.io/apache-beam-testing/jenkins/go:20190816-205006"
>
  >
>
root_transform_ids: "e5"
root_transform_ids: "e3"
root_transform_ids: "e4"
root_transform_ids: "e1"
root_transform_ids: "e2"
root_transform_ids: "e6"
root_transform_ids: "e7"
root_transform_ids: "s1"
2019/08/16 21:00:01 Test flatten:flatten failed:connecting to job 
service:
failed to dial server at localhost:53271
caused by:
context dea
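The `beam:go:coder:custom:v1` payload in the pipeline proto above is base64-encoded protobuf; decoding even its first few characters exposes the registered Go coder function names (`varintz`, plus the vendored `coderx` package paths further in). A quick way to peek at it:

```python
import base64

# First 12 characters of the custom-coder payload from the proto dump above.
payload_prefix = "Cgd2YXJpbnR6"
decoded = base64.b64decode(payload_prefix)
print(decoded)  # b'\n\x07varintz' - a length-delimited protobuf string field
```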

Build failed in Jenkins: beam_PostCommit_Python35 #245

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-7802] Make SQL example slightly simpler

[iemejia] [BEAM-7802] Inline AvroUtils methods to have only one public AvroUtils

[iemejia] [BEAM-7802] Fix minor issues (access modifiers + static) in

[iemejia] [BEAM-7802] Make Schema.toString method multi OS friendly

[iemejia] [BEAM-7802] Add AvroUtils.schemaCoder method to infer a Beam schema 
from

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

--
[...truncated 83.93 KB...]
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... FAIL
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: This test doesn't work on DirectRunner.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

==
FAIL: test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
--
Traceback (most recent call last):
  File 
"
 line 163, in test_big_query_standard_sql
big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File 
"
 line 82, in run_bq_pipeline
result = p.run()
  File 
"
 line 107, in run
else test_runner_api))
  File 
"
 line 406, in run
self._options).run(False)
  File 
"
 line 419, in run
return self.runner.run_pipeline(self, self._options)
  File 
"
 line 51, in run_pipeline
hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Test pipeline expected terminated in state: DONE and Expected 
checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72)
 but: Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 Actual 
checksum is da39a3ee5e6b4b0d3255bfef95601890afd80709

 >> begin captured logging << 
root: INFO: Running pipeline with DirectRunner.
root: DEBUG: Query SELECT * FROM (SELECT "apple" as fruit) UNION ALL (SELECT 
"orange" as fruit) does not reference any tables.
root: WARNING: Dataset 
apache-beam-testing:temp_dataset_e5ab4cfa9f5e48da91dceac3dced2279 does not 
exist so we will create it as temporary with location=None
root: DEBUG: Creating or getting table [...] with schema {'fields': [{'type': 'STRING', 'mode': 
'NULLABLE', 'name': 'fruit'}]}.
root: DEBUG: Created the table with id output_table
root: INFO: Created table 
apache-beam-testing.python_query_to_table_15659889833007.output_table with 
schema [...]. Result: [...]
 selfLink: 
'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15659889833007/tables/output_table'
 tableReference: [...]
 type: 'TABLE'.
root: DEBUG: Attempting to flush to all destinations. Total buffered: 2
root: DEBUG: Flushing data to 
apache-beam-testing:python_query_to_table_15659889833007.output_table. Total 2 
rows.
root: DEBUG: Passed: True. Errors are []
root: INFO: Attempting to perform query SELECT fruit from 
`python_query_to_table_15659889833007.output_table`; to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
url
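The checksum mismatch above is diagnostic: `da39a3ee5e6b4b0d3255bfef95601890afd80709` is the SHA-1 of the empty byte string, which suggests the verification query returned no rows at all rather than wrong rows. Easy to confirm:

```python
import hashlib

# SHA-1 of zero bytes -- the "actual checksum" reported by the failing test.
empty_sha1 = hashlib.sha1(b'').hexdigest()
print(empty_sha1)  # da39a3ee5e6b4b0d3255bfef95601890afd80709
```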

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4319

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

--
[...truncated 155.02 KB...]
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: nosetests-validatesRunnerStreamingTests-df.xml
--
XML: 

--
Ran 15 tests in 1085.997s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_53-8147594687880065959?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_02_10-3729818595381284954?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_54-13082317437862400637?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_01_35-1163593990539337?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_53-8160144191444148492?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_01_54-11469399151794558941?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_55-1445407897048167161?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_02_06-10247223508283113135?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_54-5548486948231186793?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_02_05-11087318111578562690?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_53-14103353096716473383?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_53-473673606857626941?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_02_05-15860837516050496465?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_53_53-16396826984276663905?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_02_00-1587319062267639722?project=apache-beam-testing.

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_57_14-4992461372932412257?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_15_05_35-8436832193816003047?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-16_14_

Build failed in Jenkins: beam_PostCommit_Java_PVR_Spark_Batch #590

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-7802] Make SQL example slightly simpler

[iemejia] [BEAM-7802] Inline AvroUtils methods to have only one public AvroUtils

[iemejia] [BEAM-7802] Fix minor issues (access modifiers + static) in

[iemejia] [BEAM-7802] Make Schema.toString method multi OS friendly

[iemejia] [BEAM-7802] Add AvroUtils.schemaCoder method to infer a Beam schema 
from

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

--
Started by GitHub push by iemejia
Started by GitHub push by iemejia
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 >  
 > # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ee8a3332183839789298e8228e95aa1199fd1b0d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ee8a3332183839789298e8228e95aa1199fd1b0d
Commit message: "Merge pull request #9357: [BEAM-7989] Remove side inputs from 
CacheVisitor calculation"
 > git rev-list --no-walk 442fbcd67aa03a41cf61a60c4be9efb560f1917d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
:runners:spark:job-server:validatesPortableRunnerBatch
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:spark:processResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:reference:java:processResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :runners:reference:java:processTestResources NO-SOURCE
> Task :runners:spark:processTestResources
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> T

Jenkins build is back to normal : beam_PostCommit_Java11_ValidatesRunner_Direct #1597

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #682

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for Partition page

--
[...truncated 357.54 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565999121.05_85e4144b-ffdb-48ce-b956-5a52b457ef57
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565999125.39_a1626cb3-faee-4bd2-b203-a152d1757bf1 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
    assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1565999125.95_5f2e98df-2b57-4381-bbf4-c49a81b7f5ca failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
    ('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1565999127.07_fb7dd274-f4bf-4907-a46f-dcee981841bb failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
    assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1565999127.54_ff5a4069-443f-4f8d-ad46-3676c41717e3 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in test_pardo_state_only
    equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().w

Jenkins build is back to normal : beam_PostCommit_Go_VR_Flink #735

2019-08-16 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #683

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 357.70 KB...]
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in test_multimap_side_input_type_coercion
    equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_side_input_type_coercion_1566000644.77_4b87c3c6-205b-4d28-a809-47370868e878 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in test_pardo
    assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_1566000649.52_3a0ce27d-3b23-4721-8110-a9e42ceeb17b failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
    assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1566000650.13_d7f6bf81-cde6-4514-82be-85096d5e4377 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
    ('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1566000651.38_997dfed9-589e-42d0-85c6-bb49b6a2859c failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
    assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1566000651.87_f1cf9002-0679-41df-86c9-194ab0497cd0 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in test_pardo_state_only
    equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line

Build failed in Jenkins: beam_PreCommit_Portable_Python_Cron #1033

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-7802] Make SQL example slightly simpler

[iemejia] [BEAM-7802] Inline AvroUtils methods to have only one public AvroUtils

[iemejia] [BEAM-7802] Fix minor issues (access modifiers + static) in

[iemejia] [BEAM-7802] Make Schema.toString method multi OS friendly

[iemejia] [BEAM-7802] Add AvroUtils.schemaCoder method to infer a Beam schema 
from

[dcavazos] [BEAM-7389] Add code examples for Partition page

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

[thw] Move Design Documents index to cwiki

--
[...truncated 898.33 KB...]
[ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (1/2) 
(710d336f59edc6ceb80009b050e3a3ea) [CANCELED]
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2) (0c2496dfe0ed9ca3789a6c40e5676adc).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - group -> [2]{count, 
format} (1/2) (5ab5d405db892ff8c506ef5545efa075) switched from CANCELING to 
CANCELED.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(7854c0c6229715fb28358cfd29927f77).
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(7854c0c6229715fb28358cfd29927f77) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(0c2496dfe0ed9ca3789a6c40e5676adc) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2) (0c2496dfe0ed9ca3789a6c40e5676adc).
[flink-akka.actor.default-dispatcher-14] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
write/Write/WriteImpl/GroupByKey -> [1]write/Write/WriteImpl/Extract -> (Map -> 
ToKeyedWorkItem, Map -> ToKeyedWorkItem) (1/2) 
(5070cb84192a6e6a916585aa37f7a217) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(0c2496dfe0ed9ca3789a6c40e5676adc) [CANCELED]
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (7854c0c6229715fb28358cfd29927f77).
[flink-akka.actor.default-dispatcher-14] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - 
[1]write/Write/WriteImpl/FinalizeWrite (1/2) (d9d27938bb13e6149888932cace6916b) 
switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(13707c396ca396797de347cdb57fdde3).
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(13707c396ca396797de347cdb57fdde3) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(7854c0c6229715fb28358cfd29927f77) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (7854c0c6229715fb28358cfd29927f77).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(7854c0c6229715fb28358cfd29927f77) [CANCELED]
[flink-a

Build failed in Jenkins: beam_PreCommit_Java_Cron #1681

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-7802] Make SQL example slightly simpler

[iemejia] [BEAM-7802] Inline AvroUtils methods to have only one public AvroUtils

[iemejia] [BEAM-7802] Fix minor issues (access modifiers + static) in

[iemejia] [BEAM-7802] Make Schema.toString method multi OS friendly

[iemejia] [BEAM-7802] Add AvroUtils.schemaCoder method to infer a Beam schema 
from

[dcavazos] [BEAM-7389] Add code examples for Partition page

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

[thw] Move Design Documents index to cwiki

--
[...truncated 438.18 KB...]
> Task :sdks:java:io:hadoop-common:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:io:elasticsearch:spotbugsMain
The following classes needed for analysis were missing:
  com.google.auto.value.AutoValue$Builder
  com.google.auto.value.AutoValue

> Task :sdks:java:io:elasticsearch:test NO-SOURCE
> Task :sdks:java:io:elasticsearch:check
> Task :sdks:java:io:elasticsearch:build
> Task :sdks:java:extensions:sql:hcatalog:javadoc
> Task :sdks:java:core:javadoc
> Task :sdks:java:extensions:sql:validateShadedJarDoesntLeakNonProjectClasses
> Task :sdks:java:extensions:sql:check
> Task :sdks:java:extensions:sql:build
> Task :sdks:java:fn-execution:test
> Task :sdks:java:fn-execution:check
> Task :sdks:java:fn-execution:build
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native
> Task :sdks:java:extensions:google-cloud-platform-core:test
> Task :sdks:java:extensions:google-cloud-platform-core:check
> Task :sdks:java:extensions:google-cloud-platform-core:build

> Task :sdks:java:core:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:io:hadoop-common:test
> Task :sdks:java:io:hadoop-common:check
> Task :sdks:java:io:hadoop-common:build
> Task :sdks:java:extensions:sql:jdbc:shadowJarTest
> Task :runners:flink:1.7:check
> Task :runners:flink:1.7:build
> Task :runners:flink:1.7:buildDependents
> Task :sdks:java:extensions:sql:jdbc:preCommit
> Task :runners:flink:1.8:test

> Task :sdks:java:core:spotbugsMain
The following classes needed for analysis were missing:
  com.google.auto.value.AutoValue$Builder
  com.google.auto.value.AutoValue

> Task :sdks:java:core:test
> Task :runners:flink:1.8:check
> Task :runners:flink:1.8:build
> Task :runners:flink:1.8:buildDependents
> Task :sdks:java:core:validateShadedJarDoesntLeakNonProjectClasses
> Task :sdks:java:core:check
> Task :sdks:java:core:build
> Task :sdks:java:core:buildNeeded
> Task :runners:direct-java:jar
> Task :runners:direct-java:packageTests
> Task :runners:direct-java:assemble
> Task :runners:direct-java:analyzeClassesDependencies SKIPPED
> Task :runners:direct-java:analyzeTestClassesDependencies SKIPPED
> Task :runners:direct-java:analyzeDependencies SKIPPED
> Task :runners:direct-java:checkstyleMain
> Task :runners:direct-java:checkstyleTest
> Task :sdks:java:io:amazon-web-services2:test
> Task :sdks:java:io:amazon-web-services2:check
> Task :sdks:java:io:amazon-web-services2:build
> Task :sdks:java:io:amazon-web-services2:buildDependents
> Task :sdks:java:extensions:kryo:test
> Task :sdks:java:extensions:kryo:check
> Task :sdks:java:extensions:kryo:build
> Task :runners:direct-java:javadoc
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :runners:direct-java:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:extensions:jackson:test
> Task :sdks:java:extensions:jackson:check
> Task :sdks:java:extensions:jackson:build
> Task :sdks:java:extensions:jackson:buildDependents
> Task :sdks:java:io:hadoop-format:test
> Task :sdks:java:extensions:euphoria:test
> Task :sdks:java:extensions:join-library:test
> Task :sdks:java:extensions:sketching:test
> Task :sdks:java:extensions:sorter:test
> Task :sdks:java:io:amazon-web-services:test
> Task :sdks:java:io:amqp:test
> Task :sdks:java:extensions:sketching:check
> Task :sdks:java:extensions:sketching:build
> Task :sdks:java:extensions:sketching:buildDependents
> Task :sdks:java:io:cassandra:test
> Task :sdks:java:extensions:join-library:check
> Task :sdks:java:extensions:join-library:build
> Task :sdks:java:io:clickhouse:test
> Task :sdks:java:io:amqp:check
> Task

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4320

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py27 #380

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Python35 #246

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PerformanceTests_Analysis #470

2019-08-16 Thread Apache Jenkins Server
See 


Changes:

[amaliujia] [BEAM-7965] add retracting mode to model proto.

[jkai] release vendor calcite

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

[iemejia] [BEAM-7802] Make SQL example slightly simpler

[iemejia] [BEAM-7802] Inline AvroUtils methods to have only one public AvroUtils

[iemejia] [BEAM-7802] Fix minor issues (access modifiers + static) in

[iemejia] [BEAM-7802] Make Schema.toString method multi OS friendly

[iemejia] [BEAM-7802] Add AvroUtils.schemaCoder method to infer a Beam schema 
from

[dcavazos] [BEAM-7389] Add code examples for Partition page

[github] Downgrade log message level

[kyle.winkelman] [BEAM-7989] Remove side inputs from CacheVisitor calculation.

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

[thw] Move Design Documents index to cwiki

[aaltay] [BEAM-6694] Added Approximate Quantile Transfrom on Python SDK (#9153)

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 >  # 
 > timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 96abacba9b8c7475c753eb3c0b58cca27c46feb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 96abacba9b8c7475c753eb3c0b58cca27c46feb1
Commit message: "Merge pull request #9261 from davidcavazos/partition-page"
 > git rev-list --no-walk fd67fd3d3fa7015749b81fb84c9296b7ba347883 # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins318890089652940023.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins5641405277497402362.sh
+ rm -rf .env
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins1414454542228066931.sh
+ virtualenv .env --python=python2.7 --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins1242880098022214983.sh
+ .env/bin/pip install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (41.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(19.2.2)
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins8502367469829131719.sh
+ .env/bin/pip install requests google.cloud.bigquery mock 
google.cloud.bigtable google.cloud
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already satisfied: requests in /usr/lib/python2.7/dist-packages 
(2.9.1)
Collecting google.cloud.bigquery
  Using cached 
https://files.pythonhosted.org/packages/35/ef/a926bcbd1aaff3ea15b0a116ae56af524a969388a46e3343d7d5fd528cc9/google_cloud_bigquery-1.18.0-py2.py3-none-a

Jenkins build is back to normal : beam_PostCommit_Java_PVR_Spark_Batch #591

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py36 #349

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PerformanceTests_MongoDBIO_IT #2034

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py35 #372

2019-08-16 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py37 #350

2019-08-16 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #1599

2019-08-16 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 96abacba9b8c7475c753eb3c0b58cca27c46feb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 96abacba9b8c7475c753eb3c0b58cca27c46feb1
Commit message: "Merge pull request #9261 from davidcavazos/partition-page"
 > git rev-list --no-walk 96abacba9b8c7475c753eb3c0b58cca27c46feb1 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pip

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #684

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 357.53 KB...]
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in test_multimap_side_input_type_coercion
    equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_side_input_type_coercion_1566022037.24_c3601a94-7be4-47c4-8b49-6d8ff0899a85 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in test_pardo
    assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_1566022041.96_103adfa3-e127-4fe6-81fa-1295d89a88ca failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
    assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1566022042.55_ddbf6351-ef33-42f7-b243-2883317a4338 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
    ('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1566022043.74_2df43fb4-557f-4729-94d5-8c76f1f2402e failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
    assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1566022044.23_6d43dd6e-f6a9-4643-b790-70f51ee9e559 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in test_pardo_state_only
    equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line

Build failed in Jenkins: beam_PreCommit_Portable_Python_Cron #1034

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 898.41 KB...]
[ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (2/2) 
(4133955c01c95a91070b6f0a52841f99) [CANCELED]
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(6e13a4847f894c03c4dc2f5378ea08e7).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(6e13a4847f894c03c4dc2f5378ea08e7) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2) (6e13a4847f894c03c4dc2f5378ea08e7).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(4c31548f509e30f971223c622c951be0).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(6e13a4847f894c03c4dc2f5378ea08e7) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(4c31548f509e30f971223c622c951be0) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2) (6e13a4847f894c03c4dc2f5378ea08e7).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) 
(6e13a4847f894c03c4dc2f5378ea08e7) [CANCELED]
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (4c31548f509e30f971223c622c951be0).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(de396ce7ed1efb44b92285ede8397096).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(4c31548f509e30f971223c622c951be0) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(de396ce7ed1efb44b92285ede8397096) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (4c31548f509e30f971223c622c951be0).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(4c31548f509e30f971223c622c951be0) [CANCELED]
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2) (de396ce7ed1efb44b92285ede8397096).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) 
(07752039dc63afa79bc96e95409c951d).
[flink-akka.actor.default-dispatcher-16] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) 
(07752039dc63afa79bc96e95409c951d) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2)] INFO org.apache.flink.runtime.task

Build failed in Jenkins: beam_sonarqube_report #680

2019-08-16 Thread Apache Jenkins Server
See 

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 96abacba9b8c7475c753eb3c0b58cca27c46feb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 96abacba9b8c7475c753eb3c0b58cca27c46feb1
Commit message: "Merge pull request #9261 from davidcavazos/partition-page"
 > git rev-list --no-walk 96abacba9b8c7475c753eb3c0b58cca27c46feb1 # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
Injecting SonarQube environment variables using the configuration: ASF Sonar 
Analysis
[Gradle] - Launching build.
[src] $  --continue -PdisableSpotlessCheck=true --no-parallel test jacocoTestReport sonarqube
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Configure project :beam-test-tools
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:java:container
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go:container
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go:examples
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go:test
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:python:container
Found go 1.12 in /usr/bin/go, use it.

> Task :beam-test-infra-metrics:compileJava NO-SOURCE
> Task :beam-test-infra-metrics:compileGroovy NO-SOURCE
> Task :beam-test-infra-metrics:processResources NO-SOURCE
> Task :beam-test-infra-metrics:classes UP-TO-DATE
> Task :beam-test-infra-metrics:compileTestJava NO-SOURCE
> Task :beam-test-infra-metrics:compileTestGroovy FROM-CACHE
> Task :beam-test-infra-metrics:processTestResources NO-SOURCE
> Task :beam-test-infra-metrics:testClasses UP-TO-DATE
> Task :beam-test-infra-metrics:test FROM-CACHE
> Task :release:compileJava NO-SOURCE
> Task :release:compileGroovy FROM-CACHE
> Task :release:processResources NO-SOURCE
> Task :release:classes UP-TO-DATE
> Task :release:compileTestJava NO-SOURCE
> Task :release:compileTestGroovy NO-SOURCE
> Task :release:processTestResources NO-SOURCE
> Task :release:testClasses UP-TO-DATE
> Task :release:test NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:extractProto
> Task :model:job-management:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:processResources
> Task :model:job-management:classes
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=510656a0-62e5-49b8-8d8d-db74c89079f2, 
currentDir=

Build failed in Jenkins: beam_PostCommit_XVR_Flink #111

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 3.67 MB...]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (7/16) 
(7e3df41dc53de92c32610dae2542f2d3) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (7/16) 
(d651849f770c86ff6be2717cb470f5b3) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task DataSink (DiscardingOutput) (7/16) 
(d651849f770c86ff6be2717cb470f5b3) [DEPLOYING]
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink 
(DiscardingOutput) (7/16) (d651849f770c86ff6be2717cb470f5b3) [DEPLOYING].
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Registering task at network: 
DataSink (DiscardingOutput) (7/16) (d651849f770c86ff6be2717cb470f5b3) 
[DEPLOYING].
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (7/16) 
(d651849f770c86ff6be2717cb470f5b3) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (7/16) (d651849f770c86ff6be2717cb470f5b3) switched from 
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (7/16) 
(d651849f770c86ff6be2717cb470f5b3) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink 
(DiscardingOutput) (7/16) (d651849f770c86ff6be2717cb470f5b3).
[DataSink (DiscardingOutput) (7/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task DataSink (DiscardingOutput) (7/16) 
(d651849f770c86ff6be2717cb470f5b3) [FINISHED]
[flink-akka.actor.default-dispatcher-9] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task DataSink 
(DiscardingOutput) d651849f770c86ff6be2717cb470f5b3.
[flink-akka.actor.default-dispatcher-9] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (7/16) (d651849f770c86ff6be2717cb470f5b3) switched from 
RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (14/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (14/16) 
(5ac6e7888554d36590ec6e706c0429a9) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (14/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (14/16) 
(5ac6e7888554d36590ec6e706c0429a9).
[jobmanager-future-thread-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (14/16) (53f0aba254f9363dac9270c718766cb9) switched from 
CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (14/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (14/16) 
(5ac6e7888554d36590ec6e706c0429a9) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) 
5ac6e7888554d36590ec6e706c0429a9.
[flink-akka.actor.default-dispatcher-9] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (14/16) (53f0aba254f9363dac9270c718766cb9) switched from 
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-9] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink 
(DiscardingOutput) (14/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink 
(DiscardingOutput) (14/16).
[flink-akka.actor.default-dispatcher-9] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (14/16) 
(5ac6e7888554d36590ec6e706c0429a9) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (14/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (14/16) 
(53f0aba254f9363dac9270c718766cb9) switched from CREATED
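The task log above walks through Flink's per-task execution-state machine: CREATED → SCHEDULED → DEPLOYING → RUNNING → FINISHED on the happy path, and RUNNING → CANCELING → CANCELED on the cancellation path seen in the earlier beam_PreCommit_Portable_Python_Cron log. A minimal sketch of the transitions observed in these logs (state names are taken from the log text, not from Flink's API):

```python
# State transitions as observed in the Flink task logs above.
TRANSITIONS = {
    'CREATED': {'SCHEDULED', 'DEPLOYING'},
    'SCHEDULED': {'DEPLOYING'},
    'DEPLOYING': {'RUNNING'},
    'RUNNING': {'FINISHED', 'CANCELING'},
    'CANCELING': {'CANCELED'},
}

def is_valid_path(states):
    """True if every consecutive pair of states is a transition
    that appears in the logs."""
    return all(nxt in TRANSITIONS.get(cur, set())
               for cur, nxt in zip(states, states[1:]))

print(is_valid_path(['CREATED', 'DEPLOYING', 'RUNNING', 'FINISHED']))  # True
print(is_valid_path(['RUNNING', 'CANCELING', 'CANCELED']))             # True
print(is_valid_path(['RUNNING', 'FINISHED', 'CANCELING']))             # False
```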

Jenkins build is back to normal : beam_PreCommit_Java_Cron #1682

2019-08-16 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py36 #350

2019-08-16 Thread Apache Jenkins Server
See 


--
[...truncated 57.38 KB...]
Requirement already satisfied: six>=1.5.2 in 

 (from grpcio<2,>=1.8->apache-beam==2.16.0.dev0) (1.12.0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/f9/d8/bd657bfa0e89eb71ad5e977ed99a9bb2b44e5db68d9190970637c26501bb/pbr-5.4.2-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from 
oauth2client<4,>=2.0.1->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/6a/6e/209351ec34b7d7807342e2bb6ff8a96eef1fd5dcac13bdbadf065c2bb55c/pyasn1-0.4.6-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 

 (from protobuf<4,>=3.5.0.post1->apache-beam==2.16.0.dev0) (41.1.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from 
google-apitools<0.5.29,>=0.5.28->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from 
google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from 
google-cloud-pubsub<0.40.0,>=0.39.0->apache-beam==2.16.0.dev0)
Collecting google-resumable-media>=0.3.1 (from 
google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from 
requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from 
fasteners>=0.14->google-apitools<0.5.29,>=0.5.28->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.6.0 (from 
google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.16.0.dev0)
Collecting google-auth<2.0dev,>=0.4.0 (from 
google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, chardet, 
certifi, urllib3, idna, requests, docopt,
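The resolver output above repeatedly checks candidate versions against specifier strings such as `urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1`. A simplified sketch of that check (real pip implements the much richer PEP 440 rules; this handles only plain dotted versions like `1.25.3`):

```python
import operator

# Specifier operators, longest symbols first so '>=' wins over '>'.
OPS = {'>=': operator.ge, '<=': operator.le, '!=': operator.ne,
       '==': operator.eq, '<': operator.lt, '>': operator.gt}

def parse_version(text):
    """Parse a plain dotted version like '1.25.3' into a comparable tuple."""
    return tuple(int(part) for part in text.split('.'))

def satisfies(version, spec):
    """Check a version against a comma-separated specifier string,
    e.g. '!=1.25.0,!=1.25.1,<1.26,>=1.21.1'."""
    v = parse_version(version)
    for clause in spec.split(','):
        for symbol in ('>=', '<=', '!=', '==', '<', '>'):
            if clause.startswith(symbol):
                if not OPS[symbol](v, parse_version(clause[len(symbol):])):
                    return False
                break
    return True

print(satisfies('1.25.3', '!=1.25.0,!=1.25.1,<1.26,>=1.21.1'))  # True
print(satisfies('1.25.0', '!=1.25.0,!=1.25.1,<1.26,>=1.21.1'))  # False
```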