Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py37 #343

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #662

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 361.87 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565870835.25_e7a91f87-e55f-4c78-b486-5fb459ffba8a
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder
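
The failing Spark VR tests above all use Beam's `assert_that(..., equal_to(...))` pattern, which compares pipeline output to the expected elements as an unordered multiset (distributed runners give no ordering guarantee). As a minimal pure-Python sketch of those matching semantics (illustrative only, not Beam's actual implementation; the `equal_to_matcher` name is hypothetical):

```python
from collections import Counter

def equal_to_matcher(expected):
    """Return a checker that passes iff `actual` equals `expected`
    as an unordered multiset, mirroring equal_to's semantics."""
    def _match(actual):
        # Compare via repr so unhashable elements (e.g. lists) work.
        if Counter(map(repr, actual)) != Counter(map(repr, expected)):
            raise AssertionError(
                'Failed assert: %r == %r'
                % (sorted(expected, key=repr), sorted(actual, key=repr)))
    return _match

# Output order is nondeterministic, so matching must ignore order:
check = equal_to_matcher([('a', [1, 3]), ('b', [2])])
check([('b', [2]), ('a', [1, 3])])  # passes
```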

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565870839.26_ea1cceb5-401f-4542-a90a-0925cecad3da failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565870839.76_0720a11a-4b9d-4f66-a06b-626369833698
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565870840.83_452051dd-f749-4c0d-92e4-9b1162ce1848 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565870841.26_69de2d43-0428-4f64-8bd4-32766cbb71b5 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line

Jenkins build is back to normal : beam_PreCommit_Portable_Python_Cron #1027

2019-08-15 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4300

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python36 #229

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 159.07 KB...]
  }
},
{
  "kind": "ParallelWrite",
  "name": "s6",
  "properties": {
"create_disposition": "CREATE_IF_NEEDED",
"dataset": "BigQueryTornadoesIT",
"display_data": [],
"encoding": {
  "@type": "kind:windowed_value",
  "component_encodings": [
{
  "@type": 
"RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
  "component_encodings": []
},
{
  "@type": "kind:global_window"
}
  ],
  "is_wrapper": true
},
"format": "bigquery",
"parallel_input": {
  "@type": "OutputReference",
  "output_name": "out",
  "step_name": "s5"
},
"schema": "{\"fields\": [{\"name\": \"month\", \"type\": \"INTEGER\", 
\"mode\": \"NULLABLE\"}, {\"name\": \"tornado_count\", \"type\": \"INTEGER\", 
\"mode\": \"NULLABLE\"}]}",
"table": "monthly_tornadoes_1565870598327",
"user_name": "Write/WriteToBigQuery/NativeWrite",
"write_disposition": "WRITE_TRUNCATE"
  }
}
  ],
  "type": "JOB_TYPE_BATCH"
}
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
 after exception HttpError accessing 
:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Thu, 15 Aug 2019 12:03:27 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
"code": 429,
"message": "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
  {
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
  {
"description": "Google developer console API key",
"url": 
"https://console.developers.google.com/project/844138762903/apiui/credential";
  }
]
  }
]
  }
}
>
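
The HTTP 429 `RESOURCE_EXHAUSTED` responses above are handled by the client retrying the job-creation request, as the `DEBUG: Retrying request` lines show. A minimal sketch of the standard retry strategy for such quota errors, exponential backoff with jitter (illustrative only; `retry_on_429` and `send` are hypothetical names, not the actual Dataflow client API):

```python
import random
import time

def retry_on_429(send, max_attempts=5, base_delay=1.0):
    """Call send() until it stops returning HTTP 429, sleeping with
    exponential backoff plus random jitter between attempts."""
    for attempt in range(max_attempts):
        status, body = send()
        if status != 429:
            return status, body
        if attempt < max_attempts - 1:
            # Sleep base, 2*base, 4*base, ... plus jitter so many
            # clients hitting the same quota don't retry in lockstep.
            time.sleep(base_delay * (2 ** attempt)
                       + random.uniform(0, base_delay))
    return status, body

# Simulated client: rate-limited twice, then accepted.
responses = iter([(429, 'RESOURCE_EXHAUSTED'),
                  (429, 'RESOURCE_EXHAUSTED'),
                  (200, 'OK')])
status, body = retry_on_429(lambda: next(responses), base_delay=0.01)
```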
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
 after exception HttpError accessing 
:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Thu, 15 Aug 2019 12:03:29 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
"code": 429,
"message": "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
  {
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
  {
"description": "Google developer console API key",
"url": 
"https://console.developers.google.com/project/844138762903/apiui/credential";
  }
]
  }
]
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
 after exception HttpError accessing 
:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Thu, 15 Aug 2019 12:03:34 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
"code": 429,
"message": "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consum

Build failed in Jenkins: beam_PostCommit_Python2 #226

2019-08-15 Thread Apache Jenkins Server
See 

--
[...truncated 538.82 KB...]
[jobmanager-future-thread-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(f8315c22068ad1893b40722dcebabdf3) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(f8315c22068ad1893b40722dcebabdf3) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) (attempt #0) to 
localhost
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (2/2) (f8315c22068ad1893b40722dcebabdf3) 
switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream 
leak safety net for task MapPartition (MapPartition at [3]assert_that/{Group, 
Unkey, Match}) (2/2) (f8315c22068ad1893b40722dcebabdf3) [DEPLOYING]
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(f8315c22068ad1893b40722dcebabdf3) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(f8315c22068ad1893b40722dcebabdf3) [DEPLOYING].
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (2/2) (f8315c22068ad1893b40722dcebabdf3) 
switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (2/2) 
(f8315c22068ad1893b40722dcebabdf3) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at 
assert_that/Group/GroupByKey) (2/2) (53b2d290ef6362b6f67fbd2618dd585f) switched 
from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) 
(53b2d290ef6362b6f67fbd2618dd585f).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (2/2) 
(53b2d290ef6362b6f67fbd2618dd585f) [FINISHED]
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at 
assert_that/Group/GroupByKey) (1/2) (02f77a26219088a43365050c5a6badae) switched 
from RUNNING to FINISHED.
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) 
(02f77a26219088a43365050c5a6badae).
[GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task GroupReduce (GroupReduce at assert_that/Group/GroupByKey) (1/2) 
(02f77a26219088a43365050c5a6badae) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupReduce 
(GroupReduce at assert_that/Group/GroupByKey) 53b2d290ef6362b6f67fbd2618dd585f.
[jobmanager-future-thread-16] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2) 
(679c42a4db01283597f41790d0c5b8e1) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupReduce 
(GroupReduce at assert_tha
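
The Flink log above traces each task through the normal execution-state lifecycle (CREATED, SCHEDULED, DEPLOYING, RUNNING, FINISHED), with every `switched from X to Y` line moving a task forward. A minimal sketch of that ordering (illustrative only; Flink's real `ExecutionState` also includes failure states such as FAILED and CANCELING that are not shown here):

```python
# Normal (non-failure) task lifecycle, in order, as traced in the log.
LIFECYCLE = ("CREATED", "SCHEDULED", "DEPLOYING", "RUNNING", "FINISHED")

def is_forward_transition(src, dst):
    """True if `dst` comes after `src` in the normal lifecycle.
    Some log lines skip a state (e.g. CREATED -> DEPLOYING), so any
    strictly later state counts as forward progress."""
    return LIFECYCLE.index(dst) > LIFECYCLE.index(src)

# Each "switched from X to Y" line in the log is a forward step:
assert is_forward_transition("CREATED", "SCHEDULED")
assert is_forward_transition("DEPLOYING", "RUNNING")
assert is_forward_transition("RUNNING", "FINISHED")
```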

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #64

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] Refactor release guide.

--
[...truncated 131.33 KB...]
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
testGroupByKey 
(apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ok

--
XML: 

--
Ran 1 test in 80.342s

OK

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 28s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/bhe3gfxlv7nty

[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins1742831554946601490.sh
+ echo Changing number of workers to 6
Changing number of workers to 6
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins1285485799351159864.sh
+ gcloud dataproc clusters update beam-loadtests-python-gbk-flink-batch-64 
--num-workers=6 --quiet
Waiting on operation 
[projects/apache-beam-testing/regions/global/operations/0a16ddd2-f7f2-3630-9329-766fd7c98b77].
Waiting for cluster update operation...
.WARNING: Cluster has active YARN containers. If any container is actively 
writing to HDFS then the downsize operation may block until all writers are 
stopped.
...done.
Updated 
[https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-gbk-flink-batch-64].
[beam_LoadTests_Python_GBK_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins4914515595327733062.sh
+ echo src Load test: 2GB of 10B records src
src Load test: 2GB of 10B records src
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
-PloadTest.mainClass=apache_beam.testing.load_tests.group_by_key_test:GroupByKeyTest.testGroupByKey
 -Prunner=PortableRunner 
'-PloadTest.args=--job_name=load_tests_Python_Flink_Batch_GBK_1_0815100241 
--publish_to_big_query=false --project=apache-beam-testing 
--metrics_dataset=load_test --metrics_table=python_flink_batch_GBK_1 
--input_options='{"num_records": 2,"key_size": 1,"value_size":9}' 
--iterations=1 --fanout=1 --parallelism=5 --job_endpoint=localhost:8099 
--environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest 
--environment_type=DOCKER --runner=PortableRunner' 
:sdks:python:apache_beam:testing:load_tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv UP-TO-DATE

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining 
file://
Requirement already satisfied: crcmod<2.0,>=1.7 in 


Jenkins build is back to normal : beam_LoadTests_Python_GBK_Dataflow_Batch #76

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_LoadTests_Python_Combine_Flink_Batch #14

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] Refactor release guide.

--
[...truncated 130.61 KB...]
  Found existing installation: apache-beam 2.16.0.dev0
Not uninstalling apache-beam at 

 outside environment 

Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam

> Task :sdks:python:apache_beam:testing:load_tests:run
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
testCombineGlobally (apache_beam.testing.load_tests.combine_test.CombineTest) 
... ok

--
XML: 

--
Ran 1 test in 31.388s

OK

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 39s
3 actionable tasks: 2 executed, 1 up-to-date

Publishing build scan...
https://gradle.com/s/n522mvbx5zgpo

[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins4890330377632403975.sh
+ echo Changing number of workers to 6
Changing number of workers to 6
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins479515342477292662.sh
+ gcloud dataproc clusters update beam-loadtests-python-combine-flink-batch-14 
--num-workers=6 --quiet
Waiting on operation 
[projects/apache-beam-testing/regions/global/operations/d0f2f993-2607-39b5-bad9-7b917f01e4b6].
Waiting for cluster update operation...
.WARNING: Cluster has active YARN containers. If any container is actively 
writing to HDFS then the downsize operation may block until all writers are 
stopped.
..done.
Updated 
[https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-python-combine-flink-batch-14].
[beam_LoadTests_Python_Combine_Flink_Batch] $ /bin/bash -xe 
/tmp/jenkins5431042310269987580.sh
+ echo src Combine Python Load test: 2GB 10 byte records src
src Combine Python Load test: 2GB 10 byte records src
[Gradle] - Launching build.
[src] $ 

 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g 
-PloadTest.mainClass=apache_beam.testing.load_tests.combine_test:CombineTest.testCombineGlobally
 -Prunner=PortableRunner 
'-PloadTest.args=--job_name=load-tests-python-flink-batch-combine-1-0815150144 
--project=apache-beam-testing --publish_to_big_query=true 
--metrics_dataset=load_test --metrics_table=python_flink_batch_combine_1 
--input_options='{"num_records": 2,"key_size": 1,"value_size": 9}' 
--parallelism=5 --job_endpoint=localhost:8099 
--environment_config=gcr.io/apache-beam-testing/beam_portability/python:latest 
--environment_type=DOCKER --top_count=20 --runner=PortableRunner' 
:sdks:python:apache_beam:testing:load_tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotl

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #663

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[markliu] Fix command format in Release Guide

--
[...truncated 361.92 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565889256.79_efd8c1f6-ac1c-4725-aa44-b8746d2450b9
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565889261.18_9f2744a0-c032-46d1-84a4-7bca61efa6b6 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565889261.76_de80ac49-a987-479d-891e-14e9fa53d415
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565889262.9_11f43f96-cf53-43d6-a2dc-dfc4941c2652 failed 
in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565889263.36_103053fc-eace-4cc6-b968-6237bde82be0 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finis

Build failed in Jenkins: beam_PostCommit_Python2 #227

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[markliu] Fix command format in Release Guide

--
[...truncated 106.30 KB...]
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs==2.1.0->-r 
/tmp/base_image_requirements.txt (line 34))
  Downloading 
https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
 (58kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from 
requests>=2.7.0->hdfs==2.1.0->-r /tmp/base_image_requirements.txt (line 34))
  Downloading 
https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
 (150kB)
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs==2.1.0->-r 
/tmp/base_image_requirements.txt (line 34))
  Downloading 
https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
 (157kB)
Collecting google-auth<2.0dev,>=0.4.0 (from 
google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->-r 
/tmp/base_image_requirements.txt (line 53))
  Downloading 
https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
 (73kB)
Collecting h5py (from keras-applications>=1.0.5->tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 70))
  Downloading 
https://files.pythonhosted.org/packages/53/08/27e4e9a369321862ffdce80ff1770553e9daec65d98befb2e14e7478b698/h5py-2.9.0-cp27-cp27mu-manylinux1_x86_64.whl
 (2.8MB)
Collecting markdown>=2.6.8 (from 
tensorboard<1.12.0,>=1.11.0->tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 70))
  Downloading 
https://files.pythonhosted.org/packages/c0/4e/fd492e91abdc2d2fcb70ef453064d980688762079397f779758e055f6575/Markdown-3.1.1-py2.py3-none-any.whl
 (87kB)
Collecting werkzeug>=0.11.10 (from 
tensorboard<1.12.0,>=1.11.0->tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 70))
  Downloading 
https://files.pythonhosted.org/packages/d1/ab/d3bed6b92042622d24decc7aadc8877badf18aeca1571045840ad4956d3f/Werkzeug-0.15.5-py2.py3-none-any.whl
 (328kB)
Collecting cachetools>=2.0.0 (from 
google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->-r
 /tmp/base_image_requirements.txt (line 53))
  Downloading 
https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Building wheels for collected packages: avro, fastavro, crcmod, dill, future, 
hdfs, httplib2, oauth2client, pydot, pyvcf, pyyaml, typing, googledatastore, 
proto-google-cloud-datastore-v1, python-snappy, python-gflags, docopt, 
grpc-google-iam-v1, googleapis-common-protos, gast, keras-applications, 
absl-py, termcolor
  Building wheel for avro (setup.py): started
  Building wheel for avro (setup.py): finished with status 'done'
  Created wheel for avro: filename=avro-1.8.2-cp27-none-any.whl size=36662 
sha256=7b28593055368f65da5901f236350f3c748f0c61be1b80604c5769c89388d53d
  Stored in directory: 
/root/.cache/pip/wheels/bf/0b/75/1b3517b7d36ddc8ba5d22c0df5eb01e83979f34420066d643e
  Building wheel for fastavro (setup.py): started
  Building wheel for fastavro (setup.py): finished with status 'done'
  Created wheel for fastavro: 
filename=fastavro-0.21.4-cp27-cp27mu-linux_x86_64.whl size=994354 
sha256=de5f27150196e0ff4caa676db7a33b77f2dfe0b3df4ebca87273a58886b380f1
  Stored in directory: 
/root/.cache/pip/wheels/19/eb/58/a3b86f4ae93b28c5dd23ece0cef5f5af242f4fc9a22afbe55b
  Building wheel for crcmod (setup.py): started
  Building wheel for crcmod (setup.py): finished with status 'done'
  Created wheel for crcmod: filename=crcmod-1.7-cp27-cp27mu-linux_x86_64.whl 
size=32323 
sha256=820393bda17c443115bb1ae2e87586779401267b741438a00333b8cf7e53c4fc
  Stored in directory: 
/root/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac
  Building wheel for dill (setup.py): started
  Building wheel for dill (setup.py): finished with status 'done'
  Created wheel for dill: filename=dill-0.2.9-cp27-none-any.whl size=77407 
sha256=a9d2fe1700884c2d2e4066e4522d8c9ce1280aa6a06a5f147bab28e1d7fe86af
  Stored in directory: 
/root/.cache/pip/wheels/5b/d7/0f/e58eae695403de585269f4e4a94e0cd6ca60ec0c202936fa4a
  Building wheel for future (setup.py): started
  Building wheel for future (setup.py): finished with status 'done'
  Created wheel for future: filename=future-0.16.0-cp27-none-any.whl 
size=499258 
sha256=0713d70a95a334b69023b2e8758fbbf1ae62d06699b63a4d0d9b0171c6b58a68
  Stored in directory: 
/root/.cache/pip/wheels/bf/c9/a3/c538d90ef17cf7823fa51fc701a7a7a910a80f6a405bf15b1a
  Building wheel for hdfs (setup.py): started
  Building wheel for hdfs (setup.py): finished with status 'done'
  Created wheel for hdfs

Jenkins build is back to normal : beam_PostCommit_Python36 #230

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #664

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 361.83 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in test_multimap_side_input_type_coercion
    equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_side_input_type_coercion_1565892596.26_b12a0662-476b-4dec-bd32-145accf56383 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in test_pardo
    assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_1565892601.17_ba0fd8e6-f9ee-461e-8ffb-809107ed0e5e failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
    assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1565892601.75_6564e6bb-3375-4ca6-836e-77e21c3e96dd failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
    ('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1565892602.95_8afe17df-6834-4163-8645-2a473408c6e6 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
    assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1565892603.47_fe315592-a19b-413d-aa69-7d645ae885bb failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in test_pardo_state_only
    equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line
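Every failure in this run is the same Java-side error: the portable Spark runner receives a coder wrapped in a LengthPrefixCoder where it expects a bare KvCoder, and the unconditional cast throws. A minimal Python sketch of that failure mode, using hypothetical stand-in classes rather than Beam's real coder implementations:

```python
class Coder:
    """Stand-in base for Beam's Coder hierarchy (illustrative only)."""

class KvCoder(Coder):
    """Stand-in for a coder that encodes (key, value) pairs."""
    def __init__(self, key_coder, value_coder):
        self.key_coder = key_coder
        self.value_coder = value_coder

class LengthPrefixCoder(Coder):
    """Stand-in for a coder that wraps another coder and length-prefixes its bytes."""
    def __init__(self, inner):
        self.inner = inner

def as_kv_coder(coder):
    # Mirrors the failing Java cast: the runner assumes a KvCoder but may be
    # handed one wrapped in a LengthPrefixCoder, so a blind cast fails.
    if not isinstance(coder, KvCoder):
        raise TypeError(f"{type(coder).__name__} cannot be cast to KvCoder")
    return coder

wrapped = LengthPrefixCoder(KvCoder(Coder(), Coder()))
try:
    as_kv_coder(wrapped)
except TypeError as e:
    print(e)  # LengthPrefixCoder cannot be cast to KvCoder

# Unwrapping first (one plausible remedy) succeeds:
print(type(as_kv_coder(wrapped.inner)).__name__)  # KvCoder
```

The `as_kv_coder` helper is purely illustrative; the actual fix would live in the runner's coder-translation code, which must unwrap (or avoid adding) the length-prefix wrapper before treating the coder as a KvCoder.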

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark #4924

2019-08-15 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ab15beec3eca0dfdb5ee5be85c79c9bde2f30aeb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ab15beec3eca0dfdb5ee5be85c79c9bde2f30aeb
Commit message: "Merge pull request #9349: Fix command format in Release Guide"
 > git rev-list --no-walk ab15beec3eca0dfdb5ee5be85c79c9bde2f30aeb # timeout=10
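The refspec `+refs/heads/*:refs/remotes/origin/*` in the fetch commands above maps every upstream branch into the `origin/` remote-tracking namespace, and the leading `+` permits non-fast-forward updates. A small sketch reproducing the config step offline in a throwaway repository (assumes a `git` binary on PATH; no network access needed):

```python
import subprocess
import tempfile

with tempfile.TemporaryDirectory() as d:
    def git(*args):
        # Run git inside the throwaway repo and return trimmed stdout.
        return subprocess.run(["git", *args], cwd=d, check=True,
                              capture_output=True, text=True).stdout.strip()

    git("init")
    git("config", "remote.origin.url", "https://github.com/apache/beam.git")
    # Same refspec Jenkins adds: every upstream head becomes origin/<branch>.
    git("config", "--add", "remote.origin.fetch",
        "+refs/heads/*:refs/remotes/origin/*")
    refspec = git("config", "--get", "remote.origin.fetch")
    print(refspec)
```

This only demonstrates what the Jenkins-configured refspec is; the actual fetch in the log additionally pulls `refs/pull/${ghprbPullId}/*` so pull-request heads are available for PR-triggered builds.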
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $  --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :runners:spark:validatesRunner
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :examples:java:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :sdks:java:io:hadoop-format:processResources NO-SOURCE
> Task :sdks:java:io:common:compileJava NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :examples:java:processTestResources
> Task :runners:spark:processResources NO-SOURCE
> Task :sdks:java:io:jdbc:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:io:hadoop-common:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:common:processResources NO-SOURCE
> Task :sdks:java:io:common:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:testing:test-utils:processTestResources NO-SOURCE
> Task :sdks:java:io:hadoop-format:processTestResources
> Task :runners:spark:processTestResources
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sd

Build failed in Jenkins: beam_PreCommit_Portable_Python_Cron #1028

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[markliu] Fix command format in Release Guide

--
[...truncated 906.51 KB...]
[ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (2/2) (e8c59543f2f0f4f3ff52e106e585).
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{read, split, pair_with_one} -> ToKeyedWorkItem (1/2) (d18f46b0c636f9f2430237ae2ca87d6e) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (2/2) (e8c59543f2f0f4f3ff52e106e585) [CANCELED]
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (6c02c614b28e46ee4380123f4d1e8a56).
[flink-akka.actor.default-dispatcher-17] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]write/Write/WriteImpl/PreFinalize -> Map -> ToKeyedWorkItem (2/2) (6e0618484e15a3b4f3145910619a5da6) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (868789467419bec4a589299d91d2).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (6c02c614b28e46ee4380123f4d1e8a56) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (868789467419bec4a589299d91d2) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (6c02c614b28e46ee4380123f4d1e8a56).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (1/2) (6c02c614b28e46ee4380123f4d1e8a56) [CANCELED]
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (868789467419bec4a589299d91d2).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (868789467419bec4a589299d91d2) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (868789467419bec4a589299d91d2).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) (868789467419bec4a589299d91d2) [CANCELED]
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) (0d9867e4e91112b58cec9ade85bf7885).
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) (0d9867e4e91112b58cec9ade85bf7885) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) (0d9867e4e91112b58cec9ade85bf7885).
[flink-akka.actor.default-dispatcher-11] INFO org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) (43cdd654b9039cdaca7bdb36fcb4298e).
[flink-akka.act

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4302

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 124.79 KB...]
Ran 17 tests in 1616.722s

OK

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_07-15690265444152890790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_25_38-15137015676291381733?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_34_30-10771026261525344743?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_10-12584838270629814376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_27_06-786813123607137620?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_10-496645404709575?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_26_22-14522378949464981394?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_09-9299669188505831330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_25_30-1405117885125610251?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_10-184346675001997857?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_26_40-6114085641708006261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_10-12390758444194289429?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_26_22-11707413776865751909?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_10-16873563278416407111?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_26_42-3074344556467370723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_17_09-15043033808344383322?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_11_26_51-15714938834066331848?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
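The tests above, like the failing Spark ones earlier in this digest, are driven by the `assert_that(..., equal_to([...]))` pattern from `apache_beam.testing.util`, which compares a PCollection's final contents to an expected list without regard to order. For readers unfamiliar with it, here is a hypothetical, Beam-free sketch of a matcher in the same spirit (Beam's real implementation is more general, handling unhashable and unorderable elements):

```python
def equal_to(expected):
    """Build a matcher that checks contents irrespective of element order."""
    def matcher(actual):
        # Sorting both sides gives an order-insensitive comparison for
        # orderable elements such as the (key, value) tuples in these tests.
        assert sorted(actual) == sorted(expected), (actual, expected)
    return matcher

def assert_that(actual, matcher, label='assert_that'):
    # In real Beam this applies the matcher inside the pipeline; here we
    # simply invoke it on an in-memory list.
    matcher(actual)

assert_that([('b', 2), ('a', 1)], equal_to([('a', 1), ('b', 2)]))
print('matched')
```

In a real pipeline the assertion runs as a transform when the pipeline executes, which is why a mismatch (or a runner bug, as in the Spark failures above) surfaces from `wait_until_finish()` rather than at the `assert_that` call site.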

--
XML: nosetests-validatesRunnerBatchTests-df-py37.xml
--
XML: 

--
Ran 17 tests in 1619.80

Build failed in Jenkins: beam_PreCommit_Python_Cron #1678

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[markliu] Fix command format in Release Guide

--
[...truncated 5.07 MB...]
test_basic_type_assertion (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_composite_type_assertion (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_invalid_only_positional_arguments (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_must_be_primitive_type_or_constraint (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_mix_positional_and_keyword_arguments (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_only_positional_arguments (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_valid_simple_type_arguments (apache_beam.typehints.typehints_test.TakesDecoratorTestCase) ... ok
test_coercion_fail (apache_beam.typehints.typehints_test.TestCoerceToKvType) ... ok
test_coercion_success (apache_beam.typehints.typehints_test.TestCoerceToKvType) ... ok
test_functions_as_regular_generator (apache_beam.typehints.typehints_test.TestGeneratorWrapper) ... ok
test_compatibility (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_compatibility_arbitrary_length (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_invalid_ellipsis_type_param (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_getitem_params_must_be_type_or_constraint (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_raw_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_repr (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_composite_type_arbitrary_length (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_type_arbitrary_length (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_invalid_simple_types (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_must_be_tuple (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_must_have_same_arity (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_valid_composite_type_arbitrary_length (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_valid_composite_types (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_valid_simple_type_arbitrary_length (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_type_check_valid_simple_types (apache_beam.typehints.typehints_test.TupleHintTestCase) ... ok
test_dict_union (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_empty_union (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_getitem_duplicates_ignored (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_getitem_must_be_valid_type_param (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_getitem_must_be_valid_type_param_cant_be_object_instance (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_getitem_nested_unions_flattened (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_nested_compatibility (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_union_hint_compatibility (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_union_hint_enforcement_composite_type_in_union (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_union_hint_enforcement_not_part_of_union (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_union_hint_enforcement_part_of_union (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
test_union_hint_repr (apache_beam.typehints.typehints_test.UnionHintTestCase) ... ok
Tests if custom message prints an empty string ... ok
test_deprecated_with_since_current (apache_beam.utils.annotations_test.AnnotationTests) ... ok
test_deprecated_with_since_current_message (apache_beam.utils.annotations_test.AnnotationTests) ... ok
test_deprecated_with_since_current_message_class (apache_beam.utils.annotations_test.AnnotationTests) ... ok
test_deprecated_with_since_current_message_custom (apache_beam.utils.annotations_test.AnnotationTests) ... ok
test_deprecated_without_current (apache_beam.utils.annotations_test.AnnotationTests) ... ok
test_deprecated_without_since_custom_should_fail (apache_beam.utils.annotations_test.AnnotationTests) ... ok
test_deprecated_without_since_should_fail (apache_beam.utils.annotations_test.AnnotationTests) ... ok
Tests since replacement token inclusion on the ... ok
test_experimental_with_current (apache_beam.utils.annotations_test.AnnotationTests

Jenkins build is back to normal : beam_PostCommit_Python2 #228

2019-08-15 Thread Apache Jenkins Server
See 





Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #665

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Map page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

--
[...truncated 362.27 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in test_multimap_side_input_type_coercion
    equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_side_input_type_coercion_1565897778.99_b2df3615-3b98-48cd-b412-e1458de9c23e failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in test_pardo
    assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_1565897783.74_631a927d-dc04-4b25-873c-391afae73cdf failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
    assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1565897784.31_df8d06db-0f55-47cd-9a95-c5321b933439 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
    ('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1565897785.46_ca188b72-3f74-4504-a62e-f0565db2482f failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
    assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1565897785.93_42d3c736-f9fc-4100-a8de-e4dca289fb4d failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/ru

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark #4925

2019-08-15 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4303

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #666

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[iemejia] Update build plugins

--
[...truncated 362.30 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565901625.15_1a6c9f74-99c4-4e3f-8484-5c9e02301786
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565901629.72_3dc8b75f-6994-4fd3-8bb2-7abbaeda6ad3 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565901630.32_74338277-07b3-4687-ba36-ae19346891b2
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565901631.5_5a4f361e-0467-43f0-9c4c-468d8da11ffd failed 
in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565901631.99_6c1f04f9-210c-42e6-80d6-a1254ef4563a 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apa
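
[Editorial note] Each traceback above ends in an assert_that(..., equal_to([...])) call. What that check verifies is multiset equality of the produced PCollection against the expected list, ignoring element order. A minimal sketch of such a matcher (illustrative only; Beam's real matcher lives in apache_beam.testing.util and handles more cases):

```python
from collections import Counter

def equal_to(expected):
    """Return a matcher that checks unordered multiset equality.

    Elements are compared via repr() so unhashable values (e.g. lists,
    as in the multimap tests above) can still be counted.
    """
    def matcher(actual):
        if Counter(map(repr, actual)) != Counter(map(repr, expected)):
            raise AssertionError(
                'Failed assert: %s != %s' % (sorted(map(repr, actual)),
                                             sorted(map(repr, expected))))
    return matcher

# Order does not matter, multiplicity does:
equal_to([('a', [1, 3]), ('b', [2])])([('b', [2]), ('a', [1, 3])])
```

In the failing runs above the pipelines never produce output at all (they fail at coder resolution), so the matcher is never reached; the RuntimeError comes from wait_until_finish().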

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Direct #968

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Update stager.py

--
Started by GitHub push by angoenka
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 22cee9184da6e6d65b9172d9a9b8d4c854a2ef3a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 22cee9184da6e6d65b9172d9a9b8d4c854a2ef3a
Commit message: "Merge pull request #9350 from the1plummie/patch-1"
 > git rev-list --no-walk 64507feef7982ba455290fa17ead840aa1d29dcd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 :runners:direct-java:validatesRunner
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for 
details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :model:fn-execution:extractProto
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-e

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #667

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Update stager.py

--
[...truncated 362.06 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565903267.87_ea6fd9c7-363f-4216-97b2-9db0230e41aa
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565903272.5_3f33890c-6c46-4fe2-a8f3-7ce80e7f6a89 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565903273.12_9fe7d9cb-89c6-4b99-8504-246a6bcaa0b9
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565903274.33_00a7a343-a2ec-4aa6-817f-9318388fc57d 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565903274.82_55cbe16e-5c9e-4c4c-a1d6-d5af0587a4dc 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_b

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #1582

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Add custom table name resolution

--
Started by GitHub push by akedin
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 > 
 >  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a44f7d82787198df0397578f40a893a39027f6a6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a44f7d82787198df0397578f40a893a39027f6a6
Commit message: "Merge pull request #9343 from 
akedin/custom-table-name-resolution"
 > git rev-list --no-walk 22cee9184da6e6d65b9172d9a9b8d4c854a2ef3a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 

 -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 
:runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy and 1 stopped Daemons could not be reused, use 
--status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources 
> NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-managemen

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Direct #969

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #668

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Add custom table name resolution

--
[...truncated 362.09 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565906568.15_39c0c7fc-3240-4f66-896d-5f18747729df
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565906573.19_d24a7e41-142f-4a6d-b2f7-29aea202a94e failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565906573.81_7fb1f562-c022-4bf7-a338-4ceb12d8a0d9
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565906575.05_b6ba0928-9933-48c2-95f1-f21cb159c569 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565906575.6_08eb8af4-1694-4712-8018-3d3d8fd8a7f8 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_fini

Build failed in Jenkins: beam_PostCommit_Go_VR_Spark #655

2019-08-15 Thread Apache Jenkins Server
See 
<https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/655/display/redirect?page=changes>

Changes:

[kedin] [SQL] Add custom table name resolution

--
[...truncated 155.06 KB...]
value: <
  unique_name: "passert.Sum(flat)"
  subtransforms: "e8"
  subtransforms: "e9"
  subtransforms: "e10"
  inputs: <
key: "n7"
value: "n7"
  >
>
  >
  pcollections: <
key: "n1"
value: <
  unique_name: "n1"
  coder_id: "c0"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n2"
value: <
  unique_name: "n2"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n3"
value: <
  unique_name: "n3"
  coder_id: "c0"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n4"
value: <
  unique_name: "n4"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n5"
value: <
  unique_name: "n5"
  coder_id: "c0"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n6"
value: <
  unique_name: "n6"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n7"
value: <
  unique_name: "n7"
  coder_id: "c3"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n8"
value: <
  unique_name: "n8"
  coder_id: "c4"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  pcollections: <
key: "n9"
value: <
  unique_name: "n9"
  coder_id: "c6"
  is_bounded: BOUNDED
  windowing_strategy_id: "w0"
>
  >
  windowing_strategies: <
key: "w0"
value: <
  window_fn: <
spec: <
  urn: "beam:windowfn:global_windows:v0.1"
>
  >
  merge_status: NON_MERGING
  window_coder_id: "c1"
  trigger: <
default: <
>
  >
  accumulation_mode: DISCARDING
  output_time: END_OF_WINDOW
  closing_behavior: EMIT_IF_NONEMPTY
  OnTimeBehavior: FIRE_ALWAYS
>
  >
  coders: <
key: "c0"
value: <
  spec: <
urn: "beam:coder:bytes:v1"
  >
>
  >
  coders: <
key: "c1"
value: <
  spec: <
urn: "beam:coder:global_window:v1"
  >
>
  >
  coders: <
key: "c2"
value: <
  spec: <
urn: "beam:go:coder:custom:v1"
payload: 
"Cgd2YXJpbnR6EgIIAhqFAQpxZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL2dvL3Rlc3QvdmVuZG9yL2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy9nby9wa2cvYmVhbS9jb3JlL3J1bnRpbWUvY29kZXJ4LmVuY1ZhckludFoSEAgWIgQIGUAPKgYIFBICCAgikQEKcWdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy9nby90ZXN0L3ZlbmRvci9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvZ28vcGtnL2JlYW0vY29yZS9ydW50aW1lL2NvZGVyeC5kZWNWYXJJbnRaEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
  >
>
  >
  coders: <
key: "c3"
value: <
  spec: <
urn: "beam:coder:length_prefix:v1"
  >
  component_coder_ids: "c2"
>
  >
  coders: <
key: "c4"
value: <
  spec: <
urn: "beam:coder:kv:v1"
  >
  component_coder_ids: "c3"
  component_coder_ids: "c3"
>
  >
  coders: <
key: "c5"
value: <
  spec: <
urn: "beam:coder:iterable:v1"
  >
  component_coder_ids: "c3"
>
  >
  coders: <
key: "c6"
value: <
  spec: <
urn: "beam:coder:kv:v1"
  >
  component_coder_ids: "c3"
  component_coder_ids: "c5"
>
  >
  environments: <
key: "go"
value: <
  urn: "beam:env:docker:v1"
  payload: "\n8us.gcr.io/apache-beam-testing/jenkins/go:20190815-215502"
>
  >
>
root_transform_ids: "e5"
root_transform_ids: "e6"
root_transform_ids: "e3"
root_transform_ids: "e4"
root_transform_ids: "e1
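
[Editorial note] The coder table in the proto dump above builds KV and iterable coders out of length-prefixed custom Go coders. A plain-dict sketch of that composition (ids c0..c6 match the log; this only restates the structure, it is not runner code), which is the kind of nesting involved in the LengthPrefixCoder/KvCoder cast failures seen in the Spark runs:

```python
# (urn, component_coder_ids) per coder id, transcribed from the log above.
coders = {
    'c0': ('beam:coder:bytes:v1', []),
    'c2': ('beam:go:coder:custom:v1', []),
    'c3': ('beam:coder:length_prefix:v1', ['c2']),
    'c4': ('beam:coder:kv:v1', ['c3', 'c3']),
    'c5': ('beam:coder:iterable:v1', ['c3']),
    'c6': ('beam:coder:kv:v1', ['c3', 'c5']),
}

def expand(cid):
    """Recursively flatten a coder id into a (urn, components) tree."""
    urn, parts = coders[cid]
    return (urn, [expand(p) for p in parts])

# c6 is KV<length_prefix<custom>, iterable<length_prefix<custom>>>.
```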

Build failed in Jenkins: beam_PostCommit_XVR_Flink #96

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Add custom table name resolution

--
[...truncated 4.02 MB...]
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task DataSink (DiscardingOutput) (11/16) 
(25541aca8003b01bc7b3d70d4069f393) [DEPLOYING]
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink 
(DiscardingOutput) (11/16) (25541aca8003b01bc7b3d70d4069f393) [DEPLOYING].
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Registering task at network: 
DataSink (DiscardingOutput) (11/16) (25541aca8003b01bc7b3d70d4069f393) 
[DEPLOYING].
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (11/16) 
(25541aca8003b01bc7b3d70d4069f393) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (11/16) (25541aca8003b01bc7b3d70d4069f393) switched from 
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (11/16) 
(25541aca8003b01bc7b3d70d4069f393) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink 
(DiscardingOutput) (11/16) (25541aca8003b01bc7b3d70d4069f393).
[DataSink (DiscardingOutput) (11/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task DataSink (DiscardingOutput) (11/16) 
(25541aca8003b01bc7b3d70d4069f393) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task DataSink 
(DiscardingOutput) 25541aca8003b01bc7b3d70d4069f393.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (11/16) (25541aca8003b01bc7b3d70d4069f393) switched from 
RUNNING to FINISHED.
[grpc-default-executor-1] WARN bundle_processor.create_operation - No unique 
name set for transform fn/read/ref_PCollection_PCollection_27:0 
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (8/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (8/16) (d2c679f5addf1b39a6b50cd841d50ef2) 
switched from RUNNING to FINISHED.
[jobmanager-future-thread-12] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (8/16) (42f66ce650713edb0444784dd7079388) switched from 
CREATED to SCHEDULED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (8/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (8/16) 
(d2c679f5addf1b39a6b50cd841d50ef2).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (8/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (8/16) (d2c679f5addf1b39a6b50cd841d50ef2) 
[FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (8/16) (42f66ce650713edb0444784dd7079388) switched from 
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) 
d2c679f5addf1b39a6b50cd841d50ef2.
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink 
(DiscardingOutput) (8/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink 
(DiscardingOutput) (8/16).
[flink-akka.actor.default-dispatcher-6] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (8/16) 
(d2c679f5addf1b39a6b50cd841d50ef2) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (8/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (8/16) 
(42f66ce650713edb0444784dd7079388) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (8/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task DataSink (DiscardingOutput) (8/16) 
(42f66ce650713edb0444784dd7079388) [
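
[Editorial note] The Flink log above shows each task stepping through a fixed lifecycle: CREATED, SCHEDULED, DEPLOYING, RUNNING, FINISHED. A minimal sketch of that state machine as it appears in this log (simplified; real Flink has additional states such as failure and cancellation paths):

```python
# Allowed forward transitions, as observed in the log above (simplified).
VALID_TRANSITIONS = {
    'CREATED': {'SCHEDULED'},
    'SCHEDULED': {'DEPLOYING'},
    'DEPLOYING': {'RUNNING'},
    'RUNNING': {'FINISHED'},
}

def check_log_transitions(events):
    """events: (old_state, new_state) pairs parsed from 'switched from X to Y'
    log lines. Returns True if every transition is a valid forward step."""
    return all(new in VALID_TRANSITIONS.get(old, set())
               for old, new in events)
```

Parsing the "switched from X to Y" lines and feeding them through such a checker is one way to confirm a run's tasks all reached FINISHED.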

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4306

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Add custom table name resolution

--
[...truncated 157.48 KB...]
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: nosetests-validatesRunnerStreamingTests-df.xml
--
XML: 

--
Ran 15 tests in 1060.892s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-2169149881320378830?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_45_28-10255803588409847988?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-7806482495745033941?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_45_08-4426730301688311108?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_31-16408531332015926408?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_44_32-5730538641512155276?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-2542761261858729475?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_45_53-5837716225516242728?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-8204431946038616775?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_45_54-747802367853963773?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-2133522814704708884?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-16237511037685167527?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_45_33-17623460664355987372?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_37_32-13036780334823686431?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_45_58-6516656156530505116?project=apache-beam-testing.

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_41_12-7834757586784943000?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_51_05-9190678265547674906?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_41_14-12716106807332385819?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_50_11-6942700433828944530?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_41_14-12454312070643104643?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_15_50_42-11814734421735400261?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/da

Build failed in Jenkins: beam_PostCommit_Java11_ValidatesRunner_Direct #1583

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Support complex identifiers in DataCatalog

--
Started by GitHub push by akedin
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
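The two fetch refspecs above map remote branch refs and pull-request refs into the local origin namespace by wildcard substitution. A minimal sketch of that substitution in plain Python (illustrative helper only, not part of git; it handles only the single-trailing-`*` form shown in this log):

```python
from typing import Optional

def map_ref(ref: str, refspec: str) -> Optional[str]:
    """Apply a git-style wildcard refspec such as
    '+refs/heads/*:refs/remotes/origin/*' to a remote ref name.
    Returns the local tracking ref, or None if the spec does not match."""
    spec = refspec.lstrip("+")      # the leading '+' only permits forced (non-fast-forward) updates
    src, dst = spec.split(":")
    prefix = src[:-1]               # drop the trailing '*' wildcard
    if not ref.startswith(prefix):
        return None
    return dst[:-1] + ref[len(prefix):]

print(map_ref("refs/heads/master", "+refs/heads/*:refs/remotes/origin/*"))
# refs/remotes/origin/master
print(map_ref("refs/pull/9353/head", "+refs/pull/9353/*:refs/remotes/origin/pr/9353/*"))
# refs/remotes/origin/pr/9353/head
```

This is why the checkout below can resolve `origin/master^{commit}` even in a freshly wiped workspace: every remote head has a deterministic local name.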
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fd67fd3d3fa7015749b81fb84c9296b7ba347883 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fd67fd3d3fa7015749b81fb84c9296b7ba347883
Commit message: "Merge pull request #9353 from akedin/datacatalog-custom-name-resolution"
 > git rev-list --no-walk a44f7d82787198df0397578f40a893a39027f6a6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $  -Dorg.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64 :runners:direct-java:shadowJar :runners:direct-java:shadowTestJar
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :runners:direct-java:processTestResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-managemen

Jenkins build is back to normal : beam_PostCommit_Go_VR_Spark #656

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #669

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Support complex identifiers in DataCatalog

--
[...truncated 361.88 KB...]
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_side_input_type_coercion_1565910445.92_39be745f-bf0f-4193-a0ca-19152e8d8fa7 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_1565910450.42_01d3cd01-2bb5-4d35-969e-0b7baa1981b1 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1565910451.0_b32ce344-83ba-4e77-bb19-627bd92a05d5 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1565910452.18_fda8d0a2-a78c-48ca-a93c-c9936dc53e61 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1565910452.67_f731fb64-0486-4589-817c-4b74b9312033 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_
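All of the SparkRunnerTest failures above are pipeline-level assertions of the form assert_that(pcoll, equal_to(...)) from apache_beam.testing.util. For reference, the multimap side-input expectation [('a', [1, 3]), ('b', [2])] is plain key-grouping; a minimal pure-Python sketch of what the test checks (no Beam dependency; the helper name is made up for illustration):

```python
from collections import defaultdict

def group_as_multimap(pairs):
    """Group (key, value) pairs into sorted (key, [values]) tuples,
    mirroring the expected output of test_multimap_side_input_type_coercion."""
    grouped = defaultdict(list)
    for k, v in pairs:
        grouped[k].append(v)
    return sorted((k, vs) for k, vs in grouped.items())

print(group_as_multimap([('a', 1), ('b', 2), ('a', 3)]))
# [('a', [1, 3]), ('b', [2])]
```

The tests never get this far, though: each pipeline dies earlier in the runner with the LengthPrefixCoder/KvCoder cast failure, so the assertion itself is not what is being exercised.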

Build failed in Jenkins: beam_PostCommit_Java_PVR_Spark_Batch #577

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

--
Started by GitHub push by angoenka
Started by GitHub push by angoenka
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a44f7d82787198df0397578f40a893a39027f6a6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a44f7d82787198df0397578f40a893a39027f6a6
Commit message: "Merge pull request #9343 from akedin/custom-table-name-resolution"
 > git rev-list --no-walk 64507feef7982ba455290fa17ead840aa1d29dcd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $  --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :runners:spark:job-server:validatesPortableRunnerBatch
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:spark:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:core-construction-java:processTestResources NO-SOURCE
> Task :runners:core-java:processTestResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:reference:java:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :runners:reference:java:processTestResources NO-SOURCE
> Task :model:job-management:extractProto
> Task :runners:spark:processTestResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :model:job-management:processResources
> Task :sdks:java:core:processResources
> Task :sdks:java:core:generateTestAvroProtocol NO-SOURCE
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:core:generateTestAvroJava
> Task :sdks:java:core:generateTestGrammarSource NO-SOURCE
> Task :sdks:java:core:processTestResources
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:

Jenkins build is back to normal : beam_PostCommit_XVR_Flink #97

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python2 #232

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Add custom table name resolution

--
[...truncated 766.17 KB...]
"serialized_fn": "ref_AppliedPTransform_count_9", 
"user_name": "count"
  }
}, 
{
  "kind": "ParallelDo", 
  "name": "s8", 
  "properties": {
"display_data": [
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": "format_result"
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {}, 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
  "component_encodings": [
{
  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
  "component_encodings": [], 
  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
}, 
{
  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
  "component_encodings": [], 
  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
}
  ], 
  "is_pair_like": true, 
  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
}, 
{
  "@type": "kind:interval_window"
}
  ], 
  "is_wrapper": true
}, 
"output_name": "out", 
"user_name": "format.out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s7"
}, 
"serialized_fn": "ref_AppliedPTransform_format_10", 
"user_name": "format"
  }
}, 
{
  "kind": "ParallelDo", 
  "name": "s9", 
  "properties": {
"display_data": [
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": ""
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {}, 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": "kind:bytes"
}, 
{
  "@type": "kind:interval_window"
}
  ], 
  "is_wrapper": true
}, 
"output_name": "out", 
"user_name": "encode.out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s8"
}, 
"serialized_fn": "ref_AppliedPTransform_encode_11", 
"user_name": "encode"
  }
}, 
{
  "kind": "ParallelWrite", 
  "name": "s10", 
  "properties": {
"display_data": [], 
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": "kind:bytes"
}, 
{
  "@type": "kind:global_window"
}
  ], 
  "is_wrapper": true
}, 
"format": "pubsub", 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s9"
}, 
"pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_outputddfc4ffd-59f2-4b11-b2fb-0a06f919328e",
"user_name": "WriteToPubSub/Write/NativeWrite"
  }
}
  ], 
  "type"
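For readers unfamiliar with the blob above: it is the Dataflow job description, a JSON list of steps in which each step carries a "kind" (e.g. ParallelDo, ParallelWrite), a "name" (s8, s9, ...), and a "properties" map holding display data, coders, and a "user_name". A small sketch of walking such a structure (the steps list below is a toy stand-in mirroring only the fields quoted above, not the full job):

```python
# Toy stand-in for the "steps" list of a Dataflow job description.
steps = [
    {"kind": "ParallelDo", "name": "s8",
     "properties": {"user_name": "format"}},
    {"kind": "ParallelDo", "name": "s9",
     "properties": {"user_name": "encode"}},
    {"kind": "ParallelWrite", "name": "s10",
     "properties": {"user_name": "WriteToPubSub/Write/NativeWrite"}},
]

def summarize(steps):
    """Reduce each step dict to (name, kind, user_name) for quick inspection."""
    return [(s["name"], s["kind"], s["properties"]["user_name"]) for s in steps]

for name, kind, user_name in summarize(steps):
    print(f"{name}: {kind} ({user_name})")
```

The same traversal applied to the real job JSON is a quick way to see the pipeline shape (count → format → encode → WriteToPubSub) without reading the coder payloads.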

Jenkins build is back to normal : beam_PostCommit_Java11_ValidatesRunner_Direct #1584

2019-08-15 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4307

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Py_ValCont #4077

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 15.72 KB...]
Included projects: [root project 'beam', project ':beam-test-infra-metrics', 
project ':beam-test-tools', project ':examples', project ':model', project 
':release', project ':runners', project ':sdks', project ':vendor', project 
':website', project ':examples:java', project ':examples:kotlin', project 
':model:fn-execution', project ':model:job-management', project 
':model:pipeline', project ':runners:apex', project 
':runners:core-construction-java', project ':runners:core-java', project 
':runners:direct-java', project ':runners:extensions-java', project 
':runners:flink', project ':runners:gearpump', project 
':runners:google-cloud-dataflow-java', project ':runners:java-fn-execution', 
project ':runners:jet-experimental', project ':runners:local-java', project 
':runners:reference', project ':runners:samza', project ':runners:spark', 
project ':sdks:go', project ':sdks:java', project ':sdks:python', project 
':vendor:bytebuddy-1_9_3', project ':vendor:grpc-1_21_0', project 
':vendor:guava-26_0-jre', project ':vendor:sdks-java-extensions-protobuf', 
project ':runners:extensions-java:metrics', project ':runners:flink:1.5', 
project ':runners:flink:1.6', project ':runners:flink:1.7', project 
':runners:flink:1.8', project ':runners:google-cloud-dataflow-java:examples', 
project ':runners:google-cloud-dataflow-java:examples-streaming', project 
':runners:google-cloud-dataflow-java:worker', project 
':runners:reference:java', project ':runners:samza:job-server', project 
':runners:spark:job-server', project ':sdks:go:container', project 
':sdks:go:examples', project ':sdks:go:test', project ':sdks:java:bom', project 
':sdks:java:build-tools', project ':sdks:java:container', project 
':sdks:java:core', project ':sdks:java:extensions', project 
':sdks:java:fn-execution', project ':sdks:java:harness', project 
':sdks:java:io', project ':sdks:java:javadoc', project 
':sdks:java:maven-archetypes', project ':sdks:java:testing', project 
':sdks:python:apache_beam', project ':sdks:python:container', project 
':sdks:python:test-suites', project ':runners:flink:1.5:job-server', project 
':runners:flink:1.5:job-server-container', project 
':runners:flink:1.6:job-server', project 
':runners:flink:1.6:job-server-container', project 
':runners:flink:1.7:job-server', project 
':runners:flink:1.7:job-server-container', project 
':runners:flink:1.8:job-server', project 
':runners:flink:1.8:job-server-container', project 
':runners:google-cloud-dataflow-java:worker:legacy-worker', project 
':runners:google-cloud-dataflow-java:worker:windmill', project 
':sdks:java:extensions:euphoria', project 
':sdks:java:extensions:google-cloud-platform-core', project 
':sdks:java:extensions:jackson', project ':sdks:java:extensions:join-library', 
project ':sdks:java:extensions:kryo', project ':sdks:java:extensions:protobuf', 
project ':sdks:java:extensions:sketching', project 
':sdks:java:extensions:sorter', project ':sdks:java:extensions:sql', project 
':sdks:java:io:amazon-web-services', project 
':sdks:java:io:amazon-web-services2', project ':sdks:java:io:amqp', project 
':sdks:java:io:bigquery-io-perf-tests', project ':sdks:java:io:cassandra', 
project ':sdks:java:io:clickhouse', project ':sdks:java:io:common', project 
':sdks:java:io:elasticsearch', project ':sdks:java:io:elasticsearch-tests', 
project ':sdks:java:io:file-based-io-tests', project 
':sdks:java:io:google-cloud-platform', project ':sdks:java:io:hadoop-common', 
project ':sdks:java:io:hadoop-file-system', project 
':sdks:java:io:hadoop-format', project ':sdks:java:io:hbase', project 
':sdks:java:io:hcatalog', project ':sdks:java:io:jdbc', project 
':sdks:java:io:jms', project ':sdks:java:io:kafka', project 
':sdks:java:io:kinesis', project ':sdks:java:io:kudu', project 
':sdks:java:io:mongodb', project ':sdks:java:io:mqtt', project 
':sdks:java:io:parquet', project ':sdks:java:io:rabbitmq', project 
':sdks:java:io:redis', project ':sdks:java:io:solr', project 
':sdks:java:io:synthetic', project ':sdks:java:io:tika', project 
':sdks:java:io:xml', project ':sdks:java:maven-archetypes:examples', project 
':sdks:java:maven-archetypes:starter', project 
':sdks:java:testing:expansion-service', project 
':sdks:java:testing:load-tests', project ':sdks:java:testing:nexmark', project 
':sdks:java:testing:test-utils', project ':sdks:python:apache_beam:testing', 
project ':sdks:python:container:py3', project 
':sdks:python:test-suites:dataflow', project ':sdks:python:test-suites:direct', 
project ':sdks:python:test-suites:portable', project 
':sdks:python:test-suites:tox', project 
':sdks:java:extensions:sql:datacatalog', project 
':sdks:java:extensions:sql:hcatalog', project ':sdks:java:extensions:sql:jdbc', 
project ':sdks:java:extensions:sql:shell', project 
':sdks:java:io:elasticsearch-tests:elasticsearch-tests-2', project

Build failed in Jenkins: beam_PostCommit_Python35 #234

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[kedin] [SQL] Support complex identifiers in DataCatalog

--
[...truncated 83.50 KB...]
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... FAIL
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

======================================================================
FAIL: test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File " line 163, in test_big_query_standard_sql
big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File " line 82, in run_bq_pipeline
result = p.run()
  File " line 107, in run
else test_runner_api))
  File " line 406, in run
self._options).run(False)
  File " line 419, in run
return self.runner.run_pipeline(self, self._options)
  File " line 51, in run_pipeline
hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Test pipeline expected terminated in state: DONE and Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72)
 but: Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 Actual checksum is da39a3ee5e6b4b0d3255bfef95601890afd80709

 >> begin captured logging << 
root: INFO: Running pipeline with DirectRunner.
root: DEBUG: Query SELECT * FROM (SELECT "apple" as fruit) UNION ALL (SELECT "orange" as fruit) does not reference any tables.
root: WARNING: Dataset apache-beam-testing:temp_dataset_e4cdfea28a2b48868ec84348f49e3d0e does not exist so we will create it as temporary with location=None
root: DEBUG: Creating or getting table  with schema {'fields': [{'mode': 'NULLABLE', 'type': 'STRING', 'name': 'fruit'}]}.
root: DEBUG: Created the table with id output_table
root: INFO: Created table apache-beam-testing.python_query_to_table_156591064884.output_table with schema ]>. Result: ]>
 selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_156591064884/tables/output_table'
 tableReference:  type: 'TABLE'>.
root: DEBUG: Attempting to flush to all destinations. Total buffered: 2
root: DEBUG: Flushing data to apache-beam-testing:python_query_to_table_156591064884.output_table. Total 2 rows.
root: DEBUG: Passed: True. Errors are []
root: INFO: Attempting to perform query SELECT fruit from `python_query_to_table_156591064884.output_table`; to BQ
google.auth.transport._http_clie
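One diagnostic detail worth noting about the checksum mismatch above: the "actual" value da39a3ee5e6b4b0d3255bfef95601890afd80709 is the SHA-1 digest of the empty byte string, which suggests the verifying query returned no rows at all (likely the rows had not yet landed when the matcher ran). That can be confirmed with the standard library:

```python
import hashlib

# SHA-1 of zero bytes -- the well-known "empty" digest.
empty_digest = hashlib.sha1(b"").hexdigest()
print(empty_digest)
# da39a3ee5e6b4b0d3255bfef95601890afd80709
```

Seeing this digest as an "actual checksum" in a verifier is almost always an empty-result symptom rather than wrong data.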

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #670

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 361.85 KB...]
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_side_input_type_coercion_1565914204.59_0fe7a359-09fb-43a9-918c-a54c1a0b3ef4 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_1565914209.21_aabb3596-6133-44af-a58d-cbc4ea94daa0 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_and_main_outputs_1565914209.8_1e3a00ba-1a83-4bb1-913e-5531b1d60da6 failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_inputs_1565914211.05_08b01b3c-e885-45be-956e-bffd54c4c41e failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_side_outputs_1565914211.57_d12ca17c-1830-42c0-8402-044f460b132e failed in state FAILED: java.lang.ClassCastException: org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to org.apache.beam.sdk.coders.KvCoder

======================================================================
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 
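The ClassCastException repeated through these runs means the Spark runner materialized a side input whose wire coder was a LengthPrefixCoder (opaque length-delimited bytes) where it expected a KvCoder with separate key and value components. A rough pure-Python sketch of the two framings, to show why one cannot simply be reinterpreted as the other (illustrative only; this is not Beam's actual coder implementation):

```python
import struct

def length_prefix_encode(payload: bytes) -> bytes:
    """LengthPrefixCoder-style framing: a 4-byte big-endian length, then opaque bytes."""
    return struct.pack(">I", len(payload)) + payload

def kv_encode(key: bytes, value: bytes) -> bytes:
    """KvCoder-style framing: each component framed independently,
    so a consumer can split the key from the value."""
    return length_prefix_encode(key) + length_prefix_encode(value)

def kv_decode(buf: bytes):
    """Recover (key, value) from a KV-framed record."""
    klen = struct.unpack(">I", buf[:4])[0]
    key = buf[4:4 + klen]
    rest = buf[4 + klen:]
    vlen = struct.unpack(">I", rest[:4])[0]
    return key, rest[4:4 + vlen]

# A KV-framed record round-trips cleanly:
assert kv_decode(kv_encode(b"a", b"1")) == (b"a", b"1")
# A single length-prefixed blob, by contrast, carries no key/value boundary --
# that structural mismatch is what surfaces in Java as the failed cast.
```

In the real runner the mismatch is caught at the type level (the coder object itself cannot be cast), before any bytes are decoded.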

Jenkins build is back to normal : beam_PreCommit_Portable_Python_Cron #1029

2019-08-15 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Java_PVR_Spark_Batch #578

2019-08-15 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PerformanceTests_Analysis #469

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Map page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

[iemejia] Update build plugins

[markliu] Fix command format in Release Guide

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

[kedin] [SQL] Support complex identifiers in DataCatalog

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init  # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fd67fd3d3fa7015749b81fb84c9296b7ba347883 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fd67fd3d3fa7015749b81fb84c9296b7ba347883
Commit message: "Merge pull request #9353 from 
akedin/datacatalog-custom-name-resolution"
 > git rev-list --no-walk b7bca84e8250bb7fc7d3245af55c8b671fab4736 # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins9135454280315907912.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins3104492853521759525.sh
+ rm -rf .env
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins3574844084866136646.sh
+ virtualenv .env --python=python2.7 --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins5803890086577389750.sh
+ .env/bin/pip install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (41.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(19.2.2)
[beam_PerformanceTests_Analysis] $ /bin/bash -xe 
/tmp/jenkins431784327879480900.sh
+ .env/bin/pip install requests google.cloud.bigquery mock 
google.cloud.bigtable google.cloud
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already satisfied: requests in /usr/lib/python2.7/dist-packages 
(2.9.1)
Collecting google.cloud.bigquery
  Using cached 
https://files.pythonhosted.org/packages/35/ef/a926bcbd1aaff3ea15b0a116ae56af524a969388a46e3343d7d5fd528cc9/google_cloud_bigquery-1.18.0-py2.py3-none-any.whl
Collecting mock
  Using cached 
https://files.pythonhosted.org/packages/05/d2/f94e68be6b17f46d2c353564da56e6fb89ef09faeeff3313a046cb810ca9/mock-3.0.5-py2.py3-none-any.whl
Collecting google.cloud.bigtable
  Using cached 
https://files.pythonhosted.org/packages/24/a7/8025eefc5b9b43c656bb1fd9df9ee8da2c849107515e68fae425e8c01036/google_cloud_bigtable-0.34.0-py2.py3-none-any.whl
Collecting google.cloud
  Using cached 
https://files.pythonhosted.org/

Jenkins build is back to normal : beam_PostCommit_Python2 #233

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PreCommit_Python_Cron #1679

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py36 #345

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

[iemejia] Update build plugins

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

[kedin] [SQL] Support complex identifiers in DataCatalog

--
[...truncated 54.59 KB...]
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/6d/27/f30b90f40054948b32df04a8e6355946874d084ac73755986b28d3003578/pymongo-3.9.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 

 (from apache-beam==2.16.0.dev0) (3.9.1)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.16.0.dev0)
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.16.0.dev0)
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/6f/df/33d8d6b682750d16baa9f45db825f33e0e0feb479fac1da9758f7ac8fd4b/pyarrow-0.14.1-cp36-cp36m-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.16.0.dev0)
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<0.40.0,>=0.39.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/c0/9a/4455b1c1450e9b912855b58ca6eee7a27ff1e9b52e4d98c243d93256f469/google_cloud_pubsub-0.39.1-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/d7/72/e88edd9a0b3c16a7b2c4107b1a9d3ff182b84a29f051ae15293e1375d7fe/google_cloud_bigquery-1.17.0-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<0.33.0,>=0.31.1 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/08/77/b468e209dbb0a6f614e6781f06a4894299a4c6167c2c525cc086caa7c075/google_cloud_bigtable-0.32.2-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.16.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/19/b9/bda9781f0a74b90ebd2e046fde1196182900bd4a8e1ea503d3ffebc50e7c/numpy-1.17.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/e1/d8/feeb346d41f181e83fba45224ab14a8d8af019b48af742e047f3845d8cff/pandas-0.23.4-cp36-cp36m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.16.0.dev0)
  Using cached 
https://files.pythonhosted.org/packages/6a/93/dfcf5b1b46ab29196274b78

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4308

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 156.40 KB...]
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_17_35_42-16339682644242377892?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_17_26_08-13452746100838025882?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-15_17_35_47-4441302641318939012?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: nosetests-validatesRunnerBatchTests-df-py35.xml
--
XML: 

--
Ran 17 tests in 1697.796s

OK

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: Some syntactic constructs of Python 3 are not yet 
fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'


Jenkins build is back to normal : beam_PostCommit_Python35 #235

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #671

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Downgrade log message level

--
[...truncated 357.52 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565921772.23_b6e6143f-51d5-4969-bd90-96e25bbf7892
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565921777.0_b4b9d646-1716-4163-8c74-2f35c912278e failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565921777.61_f373b2ea-23fb-45c7-9798-03b91f7d9703
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565921778.81_d9277f8c-b8e1-4097-91ff-be453aea91e9 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565921779.31_700415ab-d53d-4530-ae5c-a438e141119b 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  Fil

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4309

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PostCommit_Py_ValCont #4078

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python2 #235

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Downgrade log message level

--
[...truncated 165.46 KB...]
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-235 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-235_test_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-235_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-235_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-235_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-235_test_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-235_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-235_test_net

real	0m0.949s
user	0m0.669s
sys 0m0.135s

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
Using default tag: latest
latest: Pulling from library/mongo
Digest: sha256:ec1fbbb3f75fdee7c3433ce2d5b8200e7c9916c57902023777bec5754da5e525
Status: Image is up to date for mongo:latest
c02f8dd5ea50f4a315328931039ed3652c6e93a826948e32fadcedf08b7f3de7
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7. More details about 
Python 2 support in pip, can be found at 
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining 
file://
Requirement already satisfied: crcmod<2.0,>=1.7 in 

 (from apache-beam==2.16.0.dev0) (1.7)
Requirement already satisfied: dill<0.2.10,>=0.2.9 in 

 (from apache-beam==2.16.0.dev0) (0.2.9)
Requirement already satisfied: fastavro<0.22,>=0.21.4 in 

 (from apache-beam==2.16.0.dev0) (0.21.24)
Requirement already satisfied: future<1.0.0,>=0.16.0 in 

 (from apache-beam==2.16.0.dev0) (0.17.1)
Requirement already satisfied: grpcio<2,>=1.8 in 

 (from apache-beam==2.16.0.dev0) (1.23.0)
Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in 

 (from apache-beam==2.16.0.dev0) (2.5.8)
Requirement already satisfied: httplib2<=0.12.0,>=0.8 in 

 (from apache-beam==2.16.0.dev0) (0.12.0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in 

 (from apache-beam==2.16.0.dev0) (2.0.0)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in 

 (from apache-beam==2.16.0.dev0) (3.9.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in 

 (from apache-beam==2.16.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 

 (from apache-beam==2.16.0.dev0) (3.9.1)
Requirement already satisfied: pydot<2,>=1.2.0 in 

 (from apache-beam==2.16.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in 

 (from apache-beam==2.16.0.dev0) (2.8.0)
Requirement already satisfied: pytz>=2018.3 in 

 (from apache-beam==2.16.0.dev0) (2019.2)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in 

 (from apache-beam==2.16.0.dev0) (3.13)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in 

 (from apache-beam==2.16.0.dev0) (1.9.0)
Require

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #672

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 358.00 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565934079.53_338ea4ac-1695-4e0e-9d09-0ded8401cb52
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565934083.9_e265a147-3acc-42b5-9695-5fb288892470 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565934084.46_366ba72f-1c68-4c08-9612-ab0cbbb12f19
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565934085.62_660b3494-af36-488b-96bf-b6504638a75d 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565934086.09_cd99e0d8-d156-44b2-b3a1-7ab7fa61025a 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit_

Build failed in Jenkins: beam_PostCommit_Python2 #236

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 703.37 KB...]
[GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for 
task GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2) (b2a901f34f921dc49f28c05cd0cfcaf1) [DEPLOYING].
[GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at 
network: GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2) (b2a901f34f921dc49f28c05cd0cfcaf1) [DEPLOYING].
[GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce 
(GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2) (b2a901f34f921dc49f28c05cd0cfcaf1) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce 
(GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2) (b2a901f34f921dc49f28c05cd0cfcaf1) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
(1/2)] WARN org.apache.flink.metrics.MetricGroup - The operator name 
GroupReduce (GroupReduce at 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) 
exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (1/2) (059bf42a8076768566a5791a523e0e4b) switched from 
RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (1/2) (059bf42a8076768566a5791a523e0e4b).
[CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (2/2) (31ea0482e6d482ae35f1143bafce4e45) switched from 
RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 
[3]{ExternalTransform(beam:transforms:xlang:filter_less_than_eq), 
ExternalTransform(beam:transforms:xlang:count)}) -> FlatMap (FlatMap at 
ExtractOutput[0]) -> Map (Key Extractor) -> GroupCombine (GroupCombine at 
GroupCombine: 
ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Group) -> 
Map (Key Extractor) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are cl

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #673

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 357.99 KB...]
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 268, in 
test_multimap_side_input_type_coercion
equal_to([('a', [1, 3]), ('b', [2])]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_side_input_type_coercion_1565935563.3_5d492408-b1d4-4cd4-b639-6c8df248b7a0
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 105, in 
test_pardo
assert_that(res, equal_to(['aax', 'bcbcx']))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_1565935568.02_b4669ba5-61f2-40f1-913a-98d7116aef43 failed in state 
FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in 
test_pardo_side_and_main_outputs
assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_and_main_outputs_1565935568.64_9838c3ec-71c0-4fdd-98a7-7a65e800a652
 failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_inputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 188, in 
test_pardo_side_inputs
('a', 'y'), ('b', 'y'), ('c', 'y')]))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_inputs_1565935569.87_f66a8cb6-b421-43d8-9fcd-6ab76d77b452 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_side_outputs (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 159, in 
test_pardo_side_outputs
assert_that(xy.y, equal_to(['y', 'xy']), label='y')
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 455, in 
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_pardo_side_outputs_1565935570.37_7c83a65e-b6d9-4103-b086-c84b1a78d993 
failed in state FAILED: java.lang.ClassCastException: 
org.apache.beam.sdk.coders.LengthPrefixCoder cannot be cast to 
org.apache.beam.sdk.coders.KvCoder

==
ERROR: test_pardo_state_only (__main__.SparkRunnerTest)
--
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 309, in 
test_pardo_state_only
equal_to(expected))
  File "apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #1994

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace 

No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init 
 >  # 
 > timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # 
 > timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5d9bb4595c763025a369a959e18c6dd288e72314 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5d9bb4595c763025a369a959e18c6dd288e72314
Commit message: "[BEAM-7987] Drop empty Windmill workitem in 
WindowingWindmillReader (#9336)"
 > git rev-list --no-walk fd67fd3d3fa7015749b81fb84c9296b7ba347883 # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8243530475976508643.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format: "default"
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4523033077339690787.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3605934555789885432.sh
+ kubectl 
--kubeconfig=
 create namespace beam-performancetests-avroioit-hdfs-1994
namespace/beam-performancetests-avroioit-hdfs-1994 created
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins773138678440314006.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=beam-performancetests-avroioit-hdfs-1994
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins1589422155598317282.sh
+ rm -rf 

[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3166630879081584209.sh
+ rm -rf 

[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4954304437566987836.sh
+ virtualenv 

 --python=python2.7
New python executable in 

Also creating executable in 

Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5560973016239565829.sh
+ 

 install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after tha

Build failed in Jenkins: beam_PostCommit_XVR_Flink #101

2019-08-15 Thread Apache Jenkins Server
See 


--
[...truncated 3.87 MB...]
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (5/16) 
(ce95a9289d6ef2d8020c2aa5d25674ca) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (5/16) 
(94cb3c046b776ff2bc1afd2b1a0d439d) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak 
safety net for task DataSink (DiscardingOutput) (5/16) 
(94cb3c046b776ff2bc1afd2b1a0d439d) [DEPLOYING]
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink 
(DiscardingOutput) (5/16) (94cb3c046b776ff2bc1afd2b1a0d439d) [DEPLOYING].
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Registering task at network: 
DataSink (DiscardingOutput) (5/16) (94cb3c046b776ff2bc1afd2b1a0d439d) 
[DEPLOYING].
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (5/16) 
(94cb3c046b776ff2bc1afd2b1a0d439d) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-9] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (5/16) (94cb3c046b776ff2bc1afd2b1a0d439d) switched from 
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (5/16) 
(94cb3c046b776ff2bc1afd2b1a0d439d) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink 
(DiscardingOutput) (5/16) (94cb3c046b776ff2bc1afd2b1a0d439d).
[DataSink (DiscardingOutput) (5/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are 
closed for task DataSink (DiscardingOutput) (5/16) 
(94cb3c046b776ff2bc1afd2b1a0d439d) [FINISHED]
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task DataSink 
(DiscardingOutput) 94cb3c046b776ff2bc1afd2b1a0d439d.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (5/16) (94cb3c046b776ff2bc1afd2b1a0d439d) switched from 
RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (16/16) 
(13b37a1fc8d1b90614dde87b88b61f97) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16) 
(13b37a1fc8d1b90614dde87b88b61f97).
[MapPartition (MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task MapPartition (MapPartition at 
[3]assert_that/{Group, Unkey, Match}) (16/16) 
(13b37a1fc8d1b90614dde87b88b61f97) [FINISHED]
[jobmanager-future-thread-11] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (16/16) (25c1b99f728fb93602311d26910ded52) switched from 
CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) 
13b37a1fc8d1b90614dde87b88b61f97.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(DiscardingOutput) (16/16) (25c1b99f728fb93602311d26910ded52) switched from 
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink 
(DiscardingOutput) (16/16) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-8] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (16/16) 
(13b37a1fc8d1b90614dde87b88b61f97) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink 
(DiscardingOutput) (16/16).
[DataSink (DiscardingOutput) (16/16)] INFO 
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (16/16) 
(25c1b99f728fb93602311d26910ded52) switched from CREATE

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4310

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 220.88 KB...]

OK

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: Some syntactic constructs of Python 3 are not yet 
fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
:474:
 UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
:84:
 UserWarning: Some syntactic constructs of Python 3 are not yet fully supported 
by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
:59:
 UserWarning: Datastore IO will support Python 3 after replacing 
googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
:47:
 UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: 
BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: S

Build failed in Jenkins: beam_PreCommit_Portable_Python_Cron #1030

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 899.88 KB...]
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (8af23f413825cfc590926b1bec99c9cf).
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(1328c666ead80e6c15bbc5e6be78b058).
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(1328c666ead80e6c15bbc5e6be78b058) switched from RUNNING to CANCELING.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(8af23f413825cfc590926b1bec99c9cf) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2) (1328c666ead80e6c15bbc5e6be78b058).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2) (8af23f413825cfc590926b1bec99c9cf).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side2 -> Map (2/2) 
(8af23f413825cfc590926b1bec99c9cf) [CANCELED]
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) 
(43fe55f70e9c105723eba4bc7d8bb142).
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) 
(43fe55f70e9c105723eba4bc7d8bb142) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task 
code ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(2/2) (43fe55f70e9c105723eba4bc7d8bb142).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) 
(43fe55f70e9c105723eba4bc7d8bb142) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(2/2) (43fe55f70e9c105723eba4bc7d8bb142).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (2/2) 
(43fe55f70e9c105723eba4bc7d8bb142) [CANCELED]
[ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (1/2) 
(066a581e49e725e6b869077d64c7b738) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (1/2)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42-side0 -> Map (1/2) 
(066a581e49e725e6b869077d64c7b738).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - 
ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map (1/2) 
(1328c666ead80e6c15bbc5e6be78b058) switched from CANCELING to CANCELED.
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2) (1328c666ead80e6c15bbc5e6be78b058).
[ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43-side0 -> Map 
(1/2)] INFO org.apache.flink.runtime.taskmanager.

Build failed in Jenkins: beam_PreCommit_Java_Cron #1678

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 427.42 KB...]
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:extensions:sql:datacatalog:javadoc

> Task :sdks:java:io:elasticsearch:spotbugsMain
The following classes needed for analysis were missing:
  com.google.auto.value.AutoValue$Builder
  com.google.auto.value.AutoValue

> Task :sdks:java:io:elasticsearch:test NO-SOURCE
> Task :sdks:java:io:elasticsearch:check
> Task :sdks:java:io:elasticsearch:build
> Task :sdks:java:fn-execution:test
> Task :sdks:java:fn-execution:check
> Task :sdks:java:fn-execution:build
> Task :sdks:java:core:javadoc
> Task :sdks:java:extensions:google-cloud-platform-core:test
> Task :sdks:java:extensions:google-cloud-platform-core:check
> Task :sdks:java:extensions:google-cloud-platform-core:build
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :sdks:java:core:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:extensions:sql:validateShadedJarDoesntLeakNonProjectClasses
> Task :sdks:java:extensions:sql:check
> Task :sdks:java:extensions:sql:build
> Task :sdks:java:extensions:sql:jdbc:shadowJarTest
> Task :sdks:java:io:hadoop-common:test
> Task :sdks:java:io:hadoop-common:check
> Task :sdks:java:io:hadoop-common:build
> Task :runners:flink:1.7:check
> Task :runners:flink:1.7:build
> Task :runners:flink:1.7:buildDependents
> Task :sdks:java:extensions:sql:jdbc:preCommit
> Task :runners:flink:1.8:test

> Task :sdks:java:core:spotbugsMain
The following classes needed for analysis were missing:
  com.google.auto.value.AutoValue$Builder
  com.google.auto.value.AutoValue

> Task :sdks:java:core:test
> Task :runners:flink:1.8:check
> Task :runners:flink:1.8:build
> Task :sdks:java:core:validateShadedJarDoesntLeakNonProjectClasses
> Task :runners:flink:1.8:buildDependents
> Task :sdks:java:core:check
> Task :sdks:java:core:build
> Task :sdks:java:core:buildNeeded
> Task :runners:direct-java:jar
> Task :runners:direct-java:packageTests
> Task :runners:direct-java:assemble
> Task :runners:direct-java:analyzeClassesDependencies SKIPPED
> Task :runners:direct-java:analyzeTestClassesDependencies SKIPPED
> Task :runners:direct-java:analyzeDependencies SKIPPED
> Task :runners:direct-java:checkstyleMain
> Task :runners:direct-java:checkstyleTest
> Task :sdks:java:io:amazon-web-services2:test
> Task :sdks:java:io:amazon-web-services2:check
> Task :sdks:java:io:amazon-web-services2:build
> Task :sdks:java:io:amazon-web-services2:buildDependents
> Task :sdks:java:extensions:kryo:test
> Task :sdks:java:extensions:kryo:check
> Task :sdks:java:extensions:kryo:build
> Task :runners:direct-java:javadoc
[main] INFO org.gradle.internal.nativeintegration.services.NativeServices - 
Initialized native services in: /home/jenkins/.gradle/native

> Task :runners:direct-java:spotbugsMain
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.

> Task :sdks:java:io:hadoop-format:test
> Task :sdks:java:extensions:euphoria:test
> Task :sdks:java:extensions:jackson:test
> Task :sdks:java:extensions:join-library:test
> Task :sdks:java:extensions:sketching:test
> Task :sdks:java:extensions:sorter:test
> Task :sdks:java:io:amazon-web-services:test
> Task :sdks:java:io:amqp:test
> Task :sdks:java:extensions:jackson:check
> Task :sdks:java:extensions:jackson:build
> Task :sdks:java:extensions:jackson:buildDependents
> Task :sdks:java:io:cassandra:test
> Task :sdks:java:io:clickhouse:test
> Task :sdks:java:extensions:join-library:check
> Task :sdks:java:extensions:join-library:build
> Task :sdks:java:extensions:sketching:check
> Task :sdks:java:extensions:sketching:build
> Task :sdks:java:extensions:sketching:buildDependents
> Task :sdks:java:io:amqp:check
> Task :sdks:java:io:amqp:build
> Task :sdks:java:io:amqp:buildDependents

> Task :sdks:java:io:cassandra:test

org.apache.beam.sdk.io.cassandra.CassandraIOTest > classMethod FAILED
com.datastax.driver.core.exceptions.NoHostAvailableException at 
CassandraIOTest.java:131

org.apache.beam.sdk.io.cassandra.CassandraIOTest > classMethod FAILED
java.lang.NullPointerException at CassandraIOTest.java:141

6 tests completed, 2 failed

> Task :sdks:java:io:cassandra:test FAILED
> Task :sdks:java:io:hadoop-file-system:test
> Task :sdks:java:

Build failed in Jenkins: beam_PostCommit_Python36 #238

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 130.81 KB...]
  ],
  "is_wrapper": true
},
"output_name": "out",
"user_name": "encode.out"
  }
],
"parallel_input": {
  "@type": "OutputReference",
  "output_name": "out",
  "step_name": "s8"
},
"serialized_fn": "ref_AppliedPTransform_encode_11",
"user_name": "encode"
  }
},
{
  "kind": "ParallelWrite",
  "name": "s10",
  "properties": {
"display_data": [],
"encoding": {
  "@type": "kind:windowed_value",
  "component_encodings": [
{
  "@type": "kind:bytes"
},
{
  "@type": "kind:global_window"
}
  ],
  "is_wrapper": true
},
"format": "pubsub",
"parallel_input": {
  "@type": "OutputReference",
  "output_name": "out",
  "step_name": "s9"
},
"pubsub_topic": 
"projects/apache-beam-testing/topics/wc_topic_outputc2ead908-ed68-4c59-a70f-f6a8f69e5eca",
"user_name": "WriteToPubSub/Write/NativeWrite"
  }
}
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
 after exception HttpError accessing 
:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Fri, 16 Aug 2019 05:38:28 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
"code": 429,
"message": "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
  {
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
  {
"description": "Google developer console API key",
"url": 
"https://console.developers.google.com/project/844138762903/apiui/credential"
  }
]
  }
]
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
 after exception HttpError accessing 
:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Fri, 16 Aug 2019 05:38:30 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
"code": 429,
"message": "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.",
"status": "RESOURCE_EXHAUSTED",
"details": [
  {
"@type": "type.googleapis.com/google.rpc.Help",
"links": [
  {
"description": "Google developer console API key",
"url": 
"https://console.developers.google.com/project/844138762903/apiui/credential"
  }
]
  }
]
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json
 after exception HttpError accessing 
:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Fri, 16 Aug 2019 05:38:33 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '429', 'content-length': '598', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
"code": 429,
"message": "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'C

Build failed in Jenkins: beam_PostCommit_Java_PortabilityApi #2618

2019-08-15 Thread Apache Jenkins Server
See 


Changes:

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

--
[...truncated 27.24 KB...]
Resolving github.com/googleapis/gax-go: 
commit='317e0006254c44a0ac427cc52a0e083ff0b9622f', 
urls=[https://github.com/googleapis/gax-go.git, 
g...@github.com:googleapis/gax-go.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/hashicorp/hcl: 
commit='23c074d0eceb2b8a5bfdbb271ab780cde70f05a8', 
urls=[https://github.com/hashicorp/hcl.git, g...@github.com:hashicorp/hcl.git]
Resolving github.com/ianlancetaylor/demangle: 
commit='4883227f66371e02c4948937d3e2be1664d9be38', 
urls=[https://github.com/ianlancetaylor/demangle.git, 
g...@github.com:ianlancetaylor/demangle.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', 
urls=[https://github.com/kr/fs.git, g...@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: 
commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', 
urls=[https://github.com/magiconair/properties.git, 
g...@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, g...@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: 
commit='b8bc1bf767474819792c23f32d8286a45736f1c6', 
urls=[https://github.com/mitchellh/go-homedir.git, 
g...@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: 
commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', 
urls=[https://github.com/mitchellh/mapstructure.git, 
git@github.com:mitchellh/mapstructure.git]
Resolving cached github.com/coreos/etcd: 
commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', 
urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :examples:java:compileJava FROM-CACHE
> Task :examples:java:classes UP-TO-DATE
> Task :sdks:java:core:compileTestJava FROM-CACHE
> Task :sdks:java:core:testClasses
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-

Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py36 #346

2019-08-15 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org