[jira] [Assigned] (BEAM-3675) FlinkRunner: Logging server

2018-03-07 Thread Aljoscha Krettek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aljoscha Krettek reassigned BEAM-3675:
--

Assignee: (was: Aljoscha Krettek)

> FlinkRunner: Logging server
> ---
>
> Key: BEAM-3675
> URL: https://issues.apache.org/jira/browse/BEAM-3675
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-flink
>Reporter: Ben Sidhom
>Priority: Major
>
> An implementation of BeamFnLogging that uses the default Flink logging 
> mechanism.
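The ticket is terse, so as a rough, hypothetical sketch (names here are illustrative, not Beam's actual API): the service's job is essentially to receive SDK-harness log entries and re-emit them through the host logger, mapping Fn API severities onto the host's levels.

```python
import logging

# Hypothetical severity names modeled on the Beam Fn API LogEntry severity
# enum; the real FlinkRunner service would receive entries over gRPC and
# forward them into Flink's own (SLF4J-based) logging -- this is a sketch only.
BEAM_TO_HOST_LEVEL = {
    "TRACE": logging.DEBUG,
    "DEBUG": logging.DEBUG,
    "INFO": logging.INFO,
    "NOTICE": logging.INFO,
    "WARN": logging.WARNING,
    "ERROR": logging.ERROR,
    "CRITICAL": logging.CRITICAL,
}

def forward_log_entry(entry, logger=None):
    """Forward one SDK-harness log entry to the runner's default logger."""
    logger = logger or logging.getLogger("sdk-harness")
    level = BEAM_TO_HOST_LEVEL.get(entry.get("severity"), logging.INFO)
    logger.log(level, "[%s] %s",
               entry.get("instruction_id", "?"), entry.get("message", ""))
```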



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3673) FlinkRunner: Harness manager for connecting operators to SDK Harnesses

2018-03-07 Thread Aljoscha Krettek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3673?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aljoscha Krettek reassigned BEAM-3673:
--

Assignee: (was: Aljoscha Krettek)

> FlinkRunner: Harness manager for connecting operators to SDK Harnesses
> --
>
> Key: BEAM-3673
> URL: https://issues.apache.org/jira/browse/BEAM-3673
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-flink
>Reporter: Ben Sidhom
>Priority: Major
>
> SDK harnesses require a common set of gRPC services to operate. The role of 
> the harness manager is to provide a uniform interface that multiplexes data 
> streams and auxiliary data between SDK environments and operators within a 
> given job.
> Note that multiple operators may communicate with a single SDK environment to 
> amortize container initialization cost. Environments are _not_ shared between 
> different jobs.
> The initial implementation will shell out to local docker, but the harness 
> manager should eventually support working with externally-managed 
> environments (e.g., created by Kubernetes).
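The sharing behavior described above can be sketched as follows (names are hypothetical, and the real harness manager would also wire up the gRPC data, state, and logging services): environments are cached per (job, environment spec), so operators within one job reuse a single SDK environment, while different jobs never share.

```python
class HarnessManager:
    """Illustrative sketch only: one SDK environment per (job, spec)."""

    def __init__(self, create_environment):
        # create_environment(spec) would e.g. shell out to `docker run`,
        # or later attach to an externally-managed (Kubernetes) environment.
        self._create = create_environment
        self._envs = {}

    def get_environment(self, job_id, env_spec):
        # Multiple operators in the same job share one environment,
        # amortizing container startup cost; jobs never share.
        key = (job_id, env_spec)
        if key not in self._envs:
            self._envs[key] = self._create(env_spec)
        return self._envs[key]
```

With a fake factory, two operators in the same job receive the same environment object, while a second job triggers a fresh creation.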



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Spark #1442

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 90.76 KB...]
'apache-beam-testing:bqjob_r1e4948c9b1cf8914_016204434936_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-08 06:19:03,502 8f69c21f MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.
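One plausible reading of the repeated failure above (an assumption from the error text, not confirmed by the log): `bq load --autodetect` infers the schema from the JSON payload, and a numeric `timestamp` field autodetects as FLOAT, which clashes with the destination table's existing TIMESTAMP column. Retrying cannot help, because the same input produces the same conflicting schema.

```python
import json

# A benchmark result record carrying the timestamp as a bare JSON number
# (this sample record is made up for illustration):
record = json.loads(
    '{"metric": "run_time", "value": 12.3, "timestamp": 1520489943.5}')

# A JSON number with a fractional part parses as a float, and BigQuery's
# --autodetect correspondingly infers FLOAT for the column -- hence
# "Field timestamp has changed type from TIMESTAMP to FLOAT" on every retry.
print(type(record["timestamp"]).__name__)  # -> float
```

Loading with an explicit schema, or coercing the field before upload, rather than relying on `--autodetect`, would avoid the type flip.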

2018-03-08 06:19:18,618 8f69c21f MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-08 06:19:20,899 8f69c21f MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.27s,  CPU:0.36s,  MaxMemory:25380kb 
STDOUT: Upload complete.
Waiting on bqjob_r7b9e0f204a60a7d2_016204438e78_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r7b9e0f204a60a7d2_016204438e78_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r7b9e0f204a60a7d2_016204438e78_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-08 06:19:20,899 8f69c21f MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-08 06:19:50,766 8f69c21f MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-08 06:19:55,270 8f69c21f MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:04.47s,  CPU:0.39s,  MaxMemory:25356kb 
STDOUT: Upload complete.
Waiting on bqjob_r192c6860db618eb1_0162044411e9_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r192c6860db618eb1_0162044411e9_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r192c6860db618eb1_0162044411e9_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-08 06:19:55,271 8f69c21f MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-08 06:20:21,110 8f69c21f MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-08 06:20:23,323 8f69c21f MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.20s,  CPU:0.32s,  MaxMemory:25584kb 
STDOUT: Upload complete.
Waiting on bqjob_r475904454856139e_016204448253_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r475904454856139e_016204448253_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job

Build failed in Jenkins: beam_PerformanceTests_TextIOIT #243

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 16.94 KB...]
Requirement already satisfied: google-auth-httplib2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: google-gax<0.16dev,>=0.15.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: grpc-google-iam-v1<0.12dev,>=0.11.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: cachetools>=2.0.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-auth<2.0.0dev,>=0.4.0->google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: future<0.17dev,>=0.16.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Requirement already satisfied: ply==3.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0)
Installing collected packages: hdfs, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam hdfs-2.1.0
[beam_PerformanceTests_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins9049456857862630340.sh
+ .env/bin/python PerfKitBenchmarker/pkb.py --project=apache-beam-testing 
--dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn 
--bigquery_table=beam_performance.textioit_pkb_results 
--temp_dir= 
--official=true --benchmarks=beam_integration_benchmark --beam_it_timeout=1200 
--beam_it_profile=io-it --beam_prebuilt=true --beam_sdk=java 
--beam_it_module=sdks/java/io/file-based-io-tests 
--beam_it_class=org.apache.beam.sdk.io.text.TextIOIT 
'--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=100,--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TextIOIT/243/]'
 '--beam_extra_mvn_properties=[filesystem=gcs]'
2018-03-08 06:00:58,211 3006fc44 MainThread INFO Verbose logging to: 

2018-03-08 06:00:58,212 3006fc44 MainThread INFO PerfKitBenchmarker 
version: v1.12.0-388-g4da37ab
2018-03-08 06:00:58,213 3006fc44 MainThread INFO Flag values:
--beam_extra_mvn_properties=[filesystem=gcs]
--beam_it_class=org.apache.beam.sdk.io.text.TextIOIT
--beam_it_timeout=1200
--beam_it_module=sdks/java/io/file-based-io-tests
--beam_sdk=java
--temp_dir=
--maven_binary=/home/jenkins/tools/maven/latest/bin/mvn
--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=100,--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TextIOIT/243/]
--beam_prebuilt
--project=apache-beam-testing
--bigquery_table=beam_performance.textioit_pkb_results
--official
--dpb_log_level=INFO
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-03-08 06:00:58,558 3006fc44 MainThread WARNING  The key "flags" was not in 
the default config, but was in user overrides. This may indicate a typo.
2018-03-08 06:00:58,558 3006fc44 MainThread INFO Initializing the edw 
service decoder
2018-03-08 06:00:58,660 3006fc44 MainThread beam_integration_benchmark(1/1) 
INFO Provisioning resources for benchmark beam_integration_benchmark
2018-03-08 06:00:58,662 3006fc44 MainThread beam_integration_benchmark(1/1) 
INFO Preparing benchmark beam_integration_benchmark
2018-03-08 06:00:58,662 3006fc44 MainThread beam_integration_benchmark(1/1) 
INFO Running: git clone https://github.com/apache/beam.git
2018-03-08 06:01:07,072 3006fc44 MainThread beam_integration_benchmark(1/1) 
INFO Running benchmark beam_integration_benchmark
2018-03-08 06:01:07,078 3006fc44 MainThread beam_integration_benchmark(1/1) 
INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.text.TextIOIT -DskipITs=false -pl 
sdks/java/io/file-based-io-tests -Pio-it -Pdataflow-runner -Dfilesystem=gcs 
-DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=100","--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TextIOIT/243/","--runner=TestDataflowRunner"]
2018-03-08 06:21:07,083 3006fc44 Thread-2 ERROR IssueCommand 

Build failed in Jenkins: beam_PerformanceTests_Python #998

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 1.11 KB...]
 > git rev-list --no-walk e8db6a81e9b03b5a0bbffc9117f3128ea23ff184 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5072190037636402985.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6983126648386199617.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7684426167733845531.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2877478039690729102.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5720684349800075573.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins726999685574168.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from absl-py->-r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already 

svn commit: r25584 - /dev/beam/2.4.0/

2018-03-07 Thread robertwb
Author: robertwb
Date: Thu Mar  8 04:55:14 2018
New Revision: 25584

Log:
apache-beam-2.4.0.rc2

Modified:
dev/beam/2.4.0/apache-beam-2.4.0-python.zip
dev/beam/2.4.0/apache-beam-2.4.0-python.zip.asc
dev/beam/2.4.0/apache-beam-2.4.0-python.zip.md5
dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha1
dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha256
dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip
dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.asc
dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.md5
dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.sha1
dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.sha256

Modified: dev/beam/2.4.0/apache-beam-2.4.0-python.zip
==
Binary files - no diff available.

Modified: dev/beam/2.4.0/apache-beam-2.4.0-python.zip.asc
==
--- dev/beam/2.4.0/apache-beam-2.4.0-python.zip.asc (original)
+++ dev/beam/2.4.0/apache-beam-2.4.0-python.zip.asc Thu Mar  8 04:55:14 2018
@@ -1,16 +1,16 @@
 -BEGIN PGP SIGNATURE-
 
-iQIzBAABCgAdFiEEvcmJsBvSpGNgEKHKjxVeCWENafsFAlqfmx0ACgkQjxVeCWEN
-aft3Tg/+IHuC3+1AIzwgBEVi9Mggjt5v9OEmEDCz81G2YBrRg7LzCVBADYYa+KCb
-u8E4uzyYsIPbaHbMZxJMPJKNqjMdjOAh874r1TMwbmuqTc222zOjTB+2rMa4IrAO
-xjrFBG5HIOqcCHONfpH8JFZJniu+7c3oJpdb9QeGzZCc3pLRrFV0mcgERHl2JG65
-AvXmm0/j7KC4u7wr7BxBEsL7U9fF0Qw8M3m1HaWepsd4H/+bWjvUBgfDG2gd13rz
-xDDn7CEGJa7njLdQSbQDgzoIIpSAVmWHiV3Rmecbwv1HlsP0s0omwKkfN0GJRVCx
-+2trTb3mphubhnlQTTFrhyeRqJ8Xu2rgj0BI33z/q8JuDDy/3pnzzq9OwclyMCnx
-rVJODvfBAfkR1I+T0KRYkPYPaw3U33dTr84tMdjbBFdY19Zz15oM1R1/sRT8zUw9
-IhJMBf3dsq8o1JQY6RWvst3st+y9QHTwLnKJBL7QMLytxI0lkbppLKPWYjf1+Xbq
-ezBS1lOpCHHreVHysrASXB0aP8cI5iJ3fMXeQ6e8c4mJMHW7RlJD6JWpaOf7q32L
-iNomET5AQ8kO006yAD+s535zKKRjSroM2TXNX+cLH+H3TBVl/lBA7IPG0Jw4UVeG
-LK2yqULva/QAgiYT7/KvIVS8t3/TfktEk5KC6sSPam/ebRQtBbM=
-=/MJs
+iQIzBAABCgAdFiEEvcmJsBvSpGNgEKHKjxVeCWENafsFAlqgwdUACgkQjxVeCWEN
+afuQ/Q/9E8tCU3QixpwFD4UCwGZ9CdfRCg+bScQiDqp0APZcuY9/GF1kIXH+CXwS
+0reczo2tpnOX/kErtgArOgSs771+zz/jB7Q6JdifuChUemFowodvrOGWATyMKqSY
+3r+mMOzhN6eZx+XRkeyXWOm5KDG7z2qWnUgMvW71RFag1FG1gzxRLvsXQVSu2yyX
+8OwPaUFbvKqnqONDPDgr6/9kh+KjJM6G9qHc3CkW+wGryWmBLtHzwzHivfxYCm5e
+LjNCZ7SK5VapgrVQYmwof9r7Snw/G01hfofzPu3/N/gpWS2t8/dJwTuiNrdlu2jg
+lwRHcb3Zws9bW9FadatXROpHtrlkWrW+mUzLC9GdRqwFz+wPwVIsCfReMYYJOUt8
+Dsw6E7r857RSSmrtU+BXYNQnHAOx9Tvzyy7TZlIaIM22U+fFv5ggveNL6uJttC0d
+II61FjHWm7aqoHXKwbb2qnYGydroK7kFXcrpLQbQoZrM9PxBS/UnZzXHxRIlgl3Z
+yWc5CKZKDSaQJGZ7/iIpcWYYHHD4jthRFpcHa0xV5bT0qWh5YK+0q5+upVt53Dqf
+nlWpHDoHG0UlqmBY7U1aszhcHlZ4HAQUz/hmlNARwQa4DE4Xgy+FZ2jmRYaz67kD
+Rt9lf7aww1dOhOyMJWqlYPuJ8VYD7pFWUS4ZMq/U4V6TwRlPoG4=
+=uWGT
 -END PGP SIGNATURE-

Modified: dev/beam/2.4.0/apache-beam-2.4.0-python.zip.md5
==
--- dev/beam/2.4.0/apache-beam-2.4.0-python.zip.md5 (original)
+++ dev/beam/2.4.0/apache-beam-2.4.0-python.zip.md5 Thu Mar  8 04:55:14 2018
@@ -1 +1 @@
-2200ddbc676cd9cdeaa80365c03f513c  apache-beam-2.4.0-python.zip
+45f67334a8ed075c2c0214eff660aab3  apache-beam-2.4.0-python.zip

Modified: dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha1
==
--- dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha1 (original)
+++ dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha1 Thu Mar  8 04:55:14 2018
@@ -1 +1 @@
-a02908e4b1ec83f41c62eebbfc80a13608780917  apache-beam-2.4.0-python.zip
+212863e203e7f486818118de453b5c5067fb2af3  apache-beam-2.4.0-python.zip

Modified: dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha256
==
--- dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha256 (original)
+++ dev/beam/2.4.0/apache-beam-2.4.0-python.zip.sha256 Thu Mar  8 04:55:14 2018
@@ -1 +1 @@
-72d5f7b7b6132f21a43bed1df86d2f8c520e021cb8e744bc30aac51bbe82ba3e  
apache-beam-2.4.0-python.zip
+ff12dac876a877349306221eca6e21eeed933857318aae2bbae32f04c645b475  
apache-beam-2.4.0-python.zip

Modified: dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip
==
Binary files - no diff available.

Modified: dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.asc
==
--- dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.asc (original)
+++ dev/beam/2.4.0/apache-beam-2.4.0-source-release.zip.asc Thu Mar  8 04:55:14 
2018
@@ -1,16 +1,16 @@
 -BEGIN PGP SIGNATURE-
 
-iQIzBAABCgAdFiEEvcmJsBvSpGNgEKHKjxVeCWENafsFAlqfm0cACgkQjxVeCWEN
-afvJRhAAotKQVxeCG2Uf11O0DF5NCFZeNYRFgCo0GF7azMD/vblnT9yp8+J5JG3E
-oJ75RR60Ob7umyohTd0zDO/+aTEmCT4vRwemL+SJZa6oedS4TwMbKmXk65xVRdQQ
-wMquVMYaXRWK3gFAUiFPZ0JdzCeJJPprrZlw5HAXJLlYtzF/QP0GW0lSRKnycpVC
-FyugC95dYBVCLXJ0APc4oZid5VneGvJdknWBJy6ncAG2KsYbSAJP5NYHx4PSMUu7

Build failed in Jenkins: beam_PostCommit_Python_Verify #4386

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 1.02 MB...]
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying 

[jira] [Comment Edited] (BEAM-3417) Fix Calcite assertions

2018-03-07 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390732#comment-16390732
 ] 

Anton Kedin edited comment on BEAM-3417 at 3/8/18 4:30 AM:
---

*What fails?*

[The assertion in question is in VolcanoPlanner|https://github.com/apache/calcite/blob/9ab47c732ec99c3162954e1eb74eaa30cddf/core/src/main/java/org/apache/calcite/plan/volcano/VolcanoPlanner.java#L546].
 It checks that [all traits are 
simple|https://github.com/apache/calcite/blob/0938c7b6d767e3242874d87a30d9112512d9243a/core/src/main/java/org/apache/calcite/plan/RelTraitSet.java#L517],
 i.e. that none of them is an instance of RelCompositeTrait.

*Why does it fail?*

In our case, when the assertion fails, the trait set contains two traits: one is 
BeamLogicalConvention (not a composite trait), and the other is a 
collation-related composite trait, which is what trips the assertion.

*Where does the composite trait come from?*

We specify the collation trait def in 
[BeamQueryPlanner|https://github.com/apache/beam/blob/14b17ad574342a875c8f99278e18c605aa5b4bc3/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamQueryPlanner.java#L89]
 before parsing. It then [gets replaced in 
LogicalTableScan|https://github.com/apache/calcite/blob/914b5cfbf978e796afeaff7b780e268ed39d8ec5/core/src/main/java/org/apache/calcite/rel/logical/LogicalTableScan.java#L102]
 with the [composite 
trait|https://github.com/apache/calcite/blob/0938c7b6d767e3242874d87a30d9112512d9243a/core/src/main/java/org/apache/calcite/plan/RelTraitSet.java#L239]
 which causes the failure.

*Why does LogicalTableScan need to do the collation magic?*

Dunno; it seems to add statistics information to the collation trait so that 
the engine can handle sorting correctly. It does so only when we ask it to, by 
adding the collation trait def.

*Why doesn't VolcanoPlanner like RelCompositeTrait in that part?*

Dunno.

*Do we need the collation trait def?*

Dunno.

*What do we do?*

If we can, it probably makes sense to replace LogicalTableScanRel with some 
kind of BeamIllogicalPCollectionScan that doesn't do all the collation magic, 
or makes it configurable.


was (Author: kedin):
*What fails?

[Assert in question is in in VolcanoPlanner 
|https://github.com/apache/calcite/blob/9ab47c732ec99c3162954e1eb74eaa30cddf/core/src/main/java/org/apache/calcite/plan/volcano/VolcanoPlanner.java#L546].
 It checks whether [all traits are 
simple|https://github.com/apache/calcite/blob/0938c7b6d767e3242874d87a30d9112512d9243a/core/src/main/java/org/apache/calcite/plan/RelTraitSet.java#L517]
 by checking whether they're not instances of RelCompositeTrait.

*Why it fails?

In our case, when it fails, traitSet.allSimple() has 2 traits. One is 
BeamLogicalConvention (it's not a composite trait), and another is a 
collation-related composite trait which causes the assertion to fail. 

*Where does the composite trait come from?

We specify the collation trait def in 
[BeamQueryPlanner|https://github.com/apache/beam/blob/14b17ad574342a875c8f99278e18c605aa5b4bc3/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamQueryPlanner.java#L89]
 before parsing. It then [gets replaced in 
LogicalTableScan|https://github.com/apache/calcite/blob/914b5cfbf978e796afeaff7b780e268ed39d8ec5/core/src/main/java/org/apache/calcite/rel/logical/LogicalTableScan.java#L102]
 with the [composite 
trait|https://github.com/apache/calcite/blob/0938c7b6d767e3242874d87a30d9112512d9243a/core/src/main/java/org/apache/calcite/plan/RelTraitSet.java#L239]
 which causes the failure.

*Why LogicalTableScan needs to do the collation magic?

Dunno, it seems that it adds the statistics information to the collation trait 
so that the engine can handle sorting correctly. It does so only when we ask it 
to by adding the collation trait def.

*Why VolcanoPlanner doesn't like CompositeTraitSet in that part?

Dunno.

*Do we need the collation trait def?

Dunno.

*What do we do?

If we can, it probably makes sense to replace LogicalTableScanRel with some 
kind of BeamIllogicalPCollectionScan which doesn't do all the collation magic 
or makes it configurable

> Fix Calcite assertions
> --
>
> Key: BEAM-3417
> URL: https://issues.apache.org/jira/browse/BEAM-3417
> Project: Beam
>  Issue Type: Task
>  Components: dsl-sql
>Reporter: Anton Kedin
>Priority: Major
>
> Currently we disable assertions in test for every project which depends on 
> Beam SQL / Calcite. Otherwise it fails assertions when Calcite validates 
> relational representation of the query. E.g. in projects which depend on Beam 
> SQL / Calcite we have to specify 
> {code:java|title=build.gradle}
> test {
>  jvmArgs "-da" 
> }
> {code}
> We need to either update our relational conversion logic 

[jira] [Commented] (BEAM-3417) Fix Calcite assertions

2018-03-07 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390732#comment-16390732
 ] 

Anton Kedin commented on BEAM-3417:
---

*What fails?*

[The assertion in question is in VolcanoPlanner|https://github.com/apache/calcite/blob/9ab47c732ec99c3162954e1eb74eaa30cddf/core/src/main/java/org/apache/calcite/plan/volcano/VolcanoPlanner.java#L546].
 It checks that [all traits are 
simple|https://github.com/apache/calcite/blob/0938c7b6d767e3242874d87a30d9112512d9243a/core/src/main/java/org/apache/calcite/plan/RelTraitSet.java#L517],
 i.e. that none of them is an instance of RelCompositeTrait.

*Why does it fail?*

In our case, when the assertion fails, the trait set contains two traits: one is 
BeamLogicalConvention (not a composite trait), and the other is a 
collation-related composite trait, which is what trips the assertion.

*Where does the composite trait come from?*

We specify the collation trait def in 
[BeamQueryPlanner|https://github.com/apache/beam/blob/14b17ad574342a875c8f99278e18c605aa5b4bc3/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/planner/BeamQueryPlanner.java#L89]
 before parsing. It then [gets replaced in 
LogicalTableScan|https://github.com/apache/calcite/blob/914b5cfbf978e796afeaff7b780e268ed39d8ec5/core/src/main/java/org/apache/calcite/rel/logical/LogicalTableScan.java#L102]
 with the [composite 
trait|https://github.com/apache/calcite/blob/0938c7b6d767e3242874d87a30d9112512d9243a/core/src/main/java/org/apache/calcite/plan/RelTraitSet.java#L239]
 which causes the failure.

*Why does LogicalTableScan need to do the collation magic?*

Dunno; it seems to add statistics information to the collation trait so that 
the engine can handle sorting correctly. It does so only when we ask it to, by 
adding the collation trait def.

*Why doesn't VolcanoPlanner like RelCompositeTrait in that part?*

Dunno.

*Do we need the collation trait def?*

Dunno.

*What do we do?*

If we can, it probably makes sense to replace LogicalTableScanRel with some 
kind of BeamIllogicalPCollectionScan that doesn't do all the collation magic, 
or makes it configurable.

> Fix Calcite assertions
> --
>
> Key: BEAM-3417
> URL: https://issues.apache.org/jira/browse/BEAM-3417
> Project: Beam
>  Issue Type: Task
>  Components: dsl-sql
>Reporter: Anton Kedin
>Priority: Major
>
> Currently we disable assertions in tests for every project which depends on 
> Beam SQL / Calcite; otherwise assertions fail when Calcite validates the 
> relational representation of the query. E.g. in projects which depend on Beam 
> SQL / Calcite we have to specify 
> {code:java|title=build.gradle}
> test {
>  jvmArgs "-da" 
> }
> {code}
> We need to either update our relational conversion logic or come up with some 
> other solution so that we don't have to disable assertions globally. If it's 
> an incorrect assertion in Calcite then we need to fix it there.
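If the Calcite assertions must stay off, the JVM's `-da:<package>...` form can 
scope the disabling to Calcite's packages only. A sketch, assuming the failing 
assertions all live under org.apache.calcite:

```groovy
test {
  // Disable assertions only for Calcite's packages instead of everywhere;
  // Beam's own assertions keep running.
  jvmArgs "-da:org.apache.calcite..."
}
```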



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3796) Implement TypedWrite extending WriteFilesResult for XmlIO

2018-03-07 Thread Eugene Kirpichov (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3796?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390712#comment-16390712
 ] 

Eugene Kirpichov commented on BEAM-3796:


I don't think this is worth doing. FileIO.write is quite concise to use, so I 
don't see a point in adding its entire API surface to XmlIO, and I don't think 
it would set a good example for other file-based IOs. If FileIO.write had 
existed earlier, XmlIO would never even have become an IO, and would just have 
stayed a Sink, as nothing distinguishes it from other file-based IOs except for 
the format of the files it writes.

> Implement TypedWrite extending WriteFilesResult for XmlIO
> -
>
> Key: BEAM-3796
> URL: https://issues.apache.org/jira/browse/BEAM-3796
> Project: Beam
>  Issue Type: Improvement
>  Components: io-ideas
>Reporter: Łukasz Gajowy
>Assignee: Eugene Kirpichov
>Priority: Major
>
> I mean an implementation similar to the ones in TextIO or AvroIO. Users could 
> leverage things like getPerDestinationFilenames() directly from the IO 
> without the need to use FileIO explicitly. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (BEAM-3476) Implement Covariance built-in aggregation functions

2018-03-07 Thread Anton Kedin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anton Kedin closed BEAM-3476.
-
   Resolution: Fixed
Fix Version/s: Not applicable

> Implement Covariance built-in aggregation functions
> ---
>
> Key: BEAM-3476
> URL: https://issues.apache.org/jira/browse/BEAM-3476
> Project: Beam
>  Issue Type: Sub-task
>  Components: dsl-sql
>Reporter: Kai Jiang
>Assignee: Kai Jiang
>Priority: Major
> Fix For: Not applicable
>
>
> implement covar_pop(x,y) and covar_samp(x,y)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (BEAM-3581) [SQL] Support for Non-ASCII chars is flaky

2018-03-07 Thread Anton Kedin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3581?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anton Kedin closed BEAM-3581.
-
   Resolution: Fixed
Fix Version/s: Not applicable

> [SQL] Support for Non-ASCII chars is flaky
> --
>
> Key: BEAM-3581
> URL: https://issues.apache.org/jira/browse/BEAM-3581
> Project: Beam
>  Issue Type: Bug
>  Components: dsl-sql
>Reporter: Anton Kedin
>Assignee: Anton Kedin
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> Beam SQL overrides the default charset that Calcite uses and sets it to 
> UTF-16. This is done via system properties.
> The problem is that we do this only when the property hasn't been set yet. So 
> if the system property has been set to ISO-8859-1 (Calcite's default), then 
> test runs will fail when trying to encode characters not supported by that 
> encoding.
> Solution:
>  - because it's a system property, we don't want to force override it;
>  - for the same reason we cannot set it for a specific query execution;
>  - we can expose a static method on BeamSql to override these properties if 
> explicitly requested;
>  - affected tests will explicitly override it;
>  - otherwise behavior will stay unchanged and we will respect defaults and 
> user settings;
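The solution bullets above can be sketched as follows. This is a hedged 
illustration, not Beam's actual code: `saffron.default.charset` is the key 
Calcite reads for its default charset, but the method names here are invented.

```java
public class Main {
    // Calcite reads its default charset from this system property.
    static final String CHARSET_PROP = "saffron.default.charset";

    // Called once at startup: only set the property when the user hasn't,
    // so explicit user settings are respected.
    static void setDefaultCharsetIfUnset(String charset) {
        if (System.getProperty(CHARSET_PROP) == null) {
            System.setProperty(CHARSET_PROP, charset);
        }
    }

    // Static override for tests that explicitly need a specific charset.
    static void forceCharset(String charset) {
        System.setProperty(CHARSET_PROP, charset);
    }

    public static void main(String[] args) {
        // Simulate a user having set the property before Beam SQL runs.
        System.setProperty(CHARSET_PROP, "ISO-8859-1");
        setDefaultCharsetIfUnset("UTF-16LE");
        System.out.println(System.getProperty(CHARSET_PROP)); // ISO-8859-1

        // A test that needs UTF-16 overrides explicitly.
        forceCharset("UTF-16LE");
        System.out.println(System.getProperty(CHARSET_PROP)); // UTF-16LE
    }
}
```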



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #114

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 1.85 MB...]
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 22.0 (TID 104) in 141 ms on localhost 
(executor driver) (3/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 22.0 (TID 107). 13259 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 22.0 (TID 107) in 144 ms on localhost 
(executor driver) (4/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 22.0, whose tasks have all completed, from pool 
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 22 (repartition at GroupCombineFunctions.java:242) 
finished in 0.114 s
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 23)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 23 (MapPartitionsRDD[186] at map at 
TranslationUtils.java:129), which has no missing parents
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_10 stored as values in memory (estimated size 82.6 KB, 
free 1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_10_piece0 stored as bytes in memory (estimated size 21.9 
KB, free 1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_10_piece0 in memory on 127.0.0.1:38620 (size: 21.9 KB, 
free: 1818.0 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 10 from broadcast at DAGScheduler.scala:1006
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 23 (MapPartitionsRDD[186] at 
map at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 
1, 2, 3))
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 23.0 with 4 tasks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 23.0 (TID 108, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 23.0 (TID 109, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 23.0 (TID 110, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 23.0 (TID 111, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 23.0 (TID 109)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 23.0 (TID 110)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 23.0 (TID 111)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 23.0 (TID 108)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_183_2 stored as bytes in memory (estimated size 4.0 B, free 
1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_183_2 in memory on 127.0.0.1:38620 (size: 4.0 B, free: 1818.0 
MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 23.0 (TID 110). 13025 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 23.0 (TID 110) in 19 ms on localhost (executor 
driver) (1/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1070

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 120.98 KB...]
pvalueish_result = self.runner.apply(transform, pvalueish)
  File 
"
 line 174, in apply
return m(transform, input)
  File 
"
 line 180, in apply_PTransform
return transform.expand(input)
  File 
"
 line 165, in expand
| Map(_merge_tagged_vals_under_key, result_ctor, result_ctor_arg))
  File 
"
 line 107, in __or__
return self.pipeline.apply(ptransform, self)
  File 
"
 line 433, in apply
label or transform.label)
  File 
"
 line 443, in apply
return self.apply(transform, pvalueish)
  File 
"
 line 475, in apply
type_options = self._options.view_as(TypeOptions)
  File 
"
 line 227, in view_as
view = cls(self._flags)
  File 
"
 line 150, in __init__
parser = _BeamArgumentParser()
  File "/usr/lib/python2.7/argparse.py", line 1602, in __init__
help=_('show this help message and exit'))
  File "/usr/lib/python2.7/gettext.py", line 581, in gettext
return dgettext(_current_domain, message)
  File "/usr/lib/python2.7/gettext.py", line 545, in dgettext
codeset=_localecodesets.get(domain))
  File "/usr/lib/python2.7/gettext.py", line 480, in translation
mofiles = find(domain, localedir, languages, all=1)
  File "/usr/lib/python2.7/gettext.py", line 456, in find
if os.path.exists(mofile):
  File 
"
 line 18, in exists
os.stat(path)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

==
ERROR: test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File 
"
 line 812, in run
test(orig)
  File 
"
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"
 line 133, in run
self.runTest(result)
  File 
"
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 177, in test_iterable_side_input
assert_that(result, equal_to([3, 4, 6, 8]))
  File 
"
 line 152, in assert_that
actual | AssertThat()  # pylint: disable=expression-not-assigned
  File 
"
 line 107, in __or__
return 

Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #112

2018-03-07 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-3807) SerializableCoder#structuralValue should be more effective for types which don't define an equals method

2018-03-07 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3807:
-

 Summary: SerializableCoder#structuralValue should be more 
effective for types which don't define an equals method
 Key: BEAM-3807
 URL: https://issues.apache.org/jira/browse/BEAM-3807
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-core
Reporter: Thomas Groh


Specifically, types which don't implement an equals method (or, more loosely, 
types which use the Object equals method) should either implement 
{{#structuralValue}} using encoded bytes, or require the use of an 
{{Equivalence}} (like the Guava definition).

 

It should always be that the following snippet (or its approximation) returns 
{{true}}:

{code:java}
<T> void expected(Coder<T> myCoder, T myElement) {
  byte[] elementBytes = CoderUtils.serializeToByteArray(myCoder, myElement);
  T decodedFirst = CoderUtils.deserializeFromByteArray(myCoder, elementBytes);
  T decodedSecond = CoderUtils.deserializeFromByteArray(myCoder, elementBytes);
  myCoder.structuralValue(decodedFirst).equals(myCoder.structuralValue(decodedSecond));
}
{code}
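The invariant can be demonstrated with a runnable model. Everything here is a 
simplified stand-in (a toy coder and a byte-wrapping structural value), not 
the real Coder/CoderUtils API:

```java
import java.util.Arrays;

public class Main {
    // A type that does not override equals(): two decodes of the same bytes
    // are unequal under Object.equals.
    static class Opaque {
        final int value;
        Opaque(int value) { this.value = value; }
    }

    // Toy "coder": encode/decode the single int field.
    static byte[] encode(Opaque o) { return new byte[] {(byte) o.value}; }
    static Opaque decode(byte[] bytes) { return new Opaque(bytes[0]); }

    // Structural-value wrapper with byte-wise equality, as proposed for
    // types lacking a meaningful equals().
    static final class StructuralByteValue {
        final byte[] bytes;
        StructuralByteValue(byte[] bytes) { this.bytes = bytes; }
        @Override public boolean equals(Object o) {
            return o instanceof StructuralByteValue
                && Arrays.equals(bytes, ((StructuralByteValue) o).bytes);
        }
        @Override public int hashCode() { return Arrays.hashCode(bytes); }
    }

    static StructuralByteValue structuralValue(Opaque o) {
        return new StructuralByteValue(encode(o));
    }

    public static void main(String[] args) {
        byte[] elementBytes = encode(new Opaque(42));
        Opaque decodedFirst = decode(elementBytes);
        Opaque decodedSecond = decode(elementBytes);
        // Object identity comparison fails for the undecorated type...
        System.out.println(decodedFirst.equals(decodedSecond)); // false
        // ...but the byte-based structural values compare equal.
        System.out.println(
            structuralValue(decodedFirst).equals(structuralValue(decodedSecond))); // true
    }
}
```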



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3799) Nexmark Query 10 breaks with direct runner

2018-03-07 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3799?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh resolved BEAM-3799.
---
Resolution: Fixed

This was a mismatch between the assumptions made by {{MutationDetector}} about 
the behavior of the returned {{structuralValue}} and the implementation in 
{{SerializableCoder}}.

That mismatch likely requires additional work in {{SerializableCoder}} to more 
aggressively validate that elements that encode to the same bytes have a 
meaningfully defined equivalence.
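A sketch of what the detector side relies on, assuming a snapshot-and-compare 
approach over encoded bytes; this is a simplification, not the actual 
MutationDetector implementation:

```java
import java.util.Arrays;

public class Main {
    // Simplified mutation detector: snapshot the element's encoded form when
    // it is output, then re-encode later and compare byte-for-byte.
    static final class Detector {
        private final byte[] snapshot;
        Detector(byte[] encoded) { this.snapshot = encoded.clone(); }
        boolean mutated(byte[] reencoded) {
            return !Arrays.equals(snapshot, reencoded);
        }
    }

    // Toy encoding of an int[] element.
    static byte[] encode(int[] value) {
        byte[] out = new byte[value.length];
        for (int i = 0; i < value.length; i++) out[i] = (byte) value[i];
        return out;
    }

    public static void main(String[] args) {
        int[] element = {1, 2, 3};
        Detector d = new Detector(encode(element));
        System.out.println(d.mutated(encode(element))); // false: unchanged
        element[0] = 9; // illegal in-place mutation after output
        System.out.println(d.mutated(encode(element))); // true: flagged
    }
}
```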

> Nexmark Query 10 breaks with direct runner
> --
>
> Key: BEAM-3799
> URL: https://issues.apache.org/jira/browse/BEAM-3799
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0, 2.5.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> While running query 10 with the direct runner like this:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pdirect-runner -Dexec.args="--runner=DirectRunner --query=10 
> --streaming=false --manageResources=false --monitorJobs=true 
> --enforceEncodability=true --enforceImmutability=true" -pl 'sdks/java/nexmark'
> {quote}
> I found that it breaks with the direct runner with  following exception (it 
> works ok with the other runners):
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: org.apache.beam.sdk.util.IllegalMutationException: PTransform 
> Query10/Query10.UploadEvents/ParMultiDo(Anonymous) mutated value KV{null, 
> 2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
> } after it was output (new value was KV{null, 2015-07-15T00:00:09.999Z 
> shard-3-of-00025 0 ON_TIME null
> }). Values must not be mutated in any way after being output.
>     at 
> org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.commit
>  (ImmutabilityCheckingBundleFactory.java:134)
>     at org.apache.beam.runners.direct.EvaluationContext.commitBundles 
> (EvaluationContext.java:212)
>     at org.apache.beam.runners.direct.EvaluationContext.handleResult 
> (EvaluationContext.java:152)
>     at 
> org.apache.beam.runners.direct.QuiescenceDriver$TimerIterableCompletionCallback.handleResult
>  (QuiescenceDriver.java:258)
>     at org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle 
> (DirectTransformExecutor.java:190)
>     at org.apache.beam.runners.direct.DirectTransformExecutor.run 
> (DirectTransformExecutor.java:127)
>     at java.util.concurrent.Executors$RunnableAdapter.call 
> (Executors.java:511)
>     at java.util.concurrent.FutureTask.run (FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker 
> (ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run 
> (ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: org.apache.beam.sdk.util.IllegalMutationException: Value KV{null, 
> 2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
> } mutated illegally, new value was KV{null, 2015-07-15T00:00:09.999Z 
> shard-3-of-00025 0 ON_TIME null
> }. Encoding was 
> rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ,
>  now 
> 

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Flink #5188

2018-03-07 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Spark #1441

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[alan.myrvold] [BEAM-3621] Add Spring repo to build_rules to allow downloading 
pentaho

[tgroh] Fallback to byte equality in MutaitonDetection

--
[...truncated 89.29 KB...]
'apache-beam-testing:bqjob_r1dcf36a88944d663_0162033b6a1a_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r1dcf36a88944d663_0162033b6a1a_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r1dcf36a88944d663_0162033b6a1a_1 ... (0s) Current status: DONE   
2018-03-08 01:30:50,382 703a7c06 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-08 01:31:14,862 703a7c06 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-08 01:31:17,960 703a7c06 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:03.08s,  CPU:0.47s,  MaxMemory:28992kb 
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r78fdef76369df8ff_0162033bd422_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r78fdef76369df8ff_0162033bd422_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r78fdef76369df8ff_0162033bd422_1 ... (0s) Current status: DONE   
2018-03-08 01:31:17,960 703a7c06 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-08 01:31:36,570 703a7c06 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-08 01:31:39,046 703a7c06 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.46s,  CPU:0.32s,  MaxMemory:28956kb 
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r45133f3ba6035080_0162033c2842_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)
Upload complete.Waiting on bqjob_r45133f3ba6035080_0162033c2842_1 ... (0s) 
Current status: RUNNING 
 Waiting on 
bqjob_r45133f3ba6035080_0162033c2842_1 ... (0s) Current status: DONE   
2018-03-08 01:31:39,046 703a7c06 MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-08 01:32:07,906 703a7c06 MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-08 01:32:10,886 703a7c06 MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  

Build failed in Jenkins: beam_PerformanceTests_Python #997

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[alan.myrvold] [BEAM-3621] Add Spring repo to build_rules to allow downloading 
pentaho

[tgroh] Fallback to byte equality in MutaitonDetection

--
[...truncated 1.42 KB...]
 > git checkout -f e8db6a81e9b03b5a0bbffc9117f3128ea23ff184
Commit message: "Merge pull request #4817: Fallback to byte equality in 
MutaitonDetection"
 > git rev-list --no-walk 3ca6e8c517fc0bd2b0fc7202b680a9f85bd7f597 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4111609001911307831.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6924658788248130744.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3821954960605789363.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7713865775155341366.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5104613192028369105.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7808486029668910801.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from absl-py->-r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == "windows" in 
/usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: xmltodict in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests-ntlm>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests>=2.9.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ntlm-auth>=1.0.2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
requests-ntlm>=0.3.0->pywinrm->-r 

Jenkins build is back to normal : beam_PerformanceTests_JDBC #303

2018-03-07 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_TextIOIT #242

2018-03-07 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_XmlIOIT #3

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 717.46 KB...]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-maven-version) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] artifact io.netty:netty-codec-http2: checking for updates from central
[INFO] Adding ignore: module-info
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) 
@ beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:testResources (default-testResources) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 4 source files to 

[INFO] 
:
 Some input files use or override a deprecated API.
[INFO] 
:
 Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ 
beam-sdks-java-io-file-based-io-tests ---
Downloading from Nexus: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/maven-metadata.xml
Progress (1): 1.4 kBDownloaded from Nexus: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/maven-metadata.xml
 (1.4 kB at 8.0 kB/s)
Downloading from Nexus: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/beam-sdks-java-build-tools-2.5.0-20180307.070335-6.pom
Progress (1): 2.3 kBDownloaded from Nexus: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/beam-sdks-java-build-tools-2.5.0-20180307.070335-6.pom
 (2.3 kB at 15 kB/s)
Downloading from Nexus: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/beam-sdks-java-build-tools-2.5.0-20180307.070335-6.jar
Progress (1): 3.8/25 kBProgress (1): 7.9/25 kBProgress (1): 12/25 kB Progress 
(1): 12/25 kBProgress (1): 17/25 kBProgress (1): 21/25 kBProgress (1): 25 kB
  Downloaded from Nexus: 
http://repository.apache.org/snapshots/org/apache/beam/beam-sdks-java-build-tools/2.5.0-SNAPSHOT/beam-sdks-java-build-tools-2.5.0-20180307.070335-6.jar
 (25 kB at 131 kB/s)
[INFO] Starting audit...
Audit done.
[INFO] 
[INFO] --- maven-surefire-plugin:2.20.1:test (default-test) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:regex-properties 
(render-artifact-id) @ beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:jar (default-jar) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-site-plugin:3.7:attach-descriptor (attach-descriptor) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Skipping because packaging 'jar' is not pom.
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:test-jar (default-test-jar) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Building jar: 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #111

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 580.85 KB...]
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-6
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-5 msg: [container-5] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-1 msg: [container-1] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-2 msg: [container-2] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-3 msg: [container-3] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-4 msg: [container-4] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-6 msg: [container-6] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-7
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-7 msg: [container-7] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-8
Mar 08, 2018 1:00:52 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Started container container-9
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-8 msg: [container-8] Entering heartbeat loop..
Mar 08, 2018 1:00:52 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-9 msg: [container-9] Entering heartbeat loop..
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer 
heartbeatLoop
INFO: Waiting for pending request.
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer 
heartbeatLoop
INFO: Waiting for pending request.
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer 
heartbeatLoop
INFO: Waiting for pending request.
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=1,name=PubsubIO.Read/PubsubUnboundedSource/Read(PubsubSource),type=INPUT,checkpoint={,
 0, 
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream9,bufferServer=localhost
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=7,name=CalculateUserScores/ExtractUserScore/Combine.perKey(SumInteger)/GroupByKey,type=GENERIC,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream6,sourceNodeId=6,sourcePortName=output,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream29,bufferServer=localhost
Mar 08, 2018 1:00:53 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Deploy request: 
[OperatorDeployInfo[id=19,name=CalculateTeamScores/LeaderboardTeamFixedWindows/Window.Assign,type=OIO,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=inputPort,streamId=stream18,sourceNodeId=4,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=stream15,bufferServer=]]],
 
OperatorDeployInfo[id=6,name=CalculateUserScores/ExtractUserScore/MapElements/Map/ParMultiDo(Anonymous),type=OIO,checkpoint={,
 0, 
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=stream23,sourceNodeId=5,sourcePortName=outputPort,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=stream6,bufferServer=localhost]]],
 

[jira] [Created] (BEAM-3806) DirectRunner hangs if multiple timers set in the same bundle

2018-03-07 Thread Ben Chambers (JIRA)
Ben Chambers created BEAM-3806:
--

 Summary: DirectRunner hangs if multiple timers set in the same 
bundle
 Key: BEAM-3806
 URL: https://issues.apache.org/jira/browse/BEAM-3806
 Project: Beam
  Issue Type: Bug
  Components: runner-direct
Reporter: Ben Chambers
Assignee: Thomas Groh


See the repro below:
{code:java}
package com.simbly.beam.cassandra;

import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.state.TimeDomain;
import org.apache.beam.sdk.state.Timer;
import org.apache.beam.sdk.state.TimerSpec;
import org.apache.beam.sdk.state.TimerSpecs;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.testing.TestStream;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;
import org.junit.Rule;
import org.junit.Test;

public class DirectRunnerTest {

  @Rule
  public TestPipeline pipeline = TestPipeline.create();

  @Test
  public void badTimerBehavior() {
    TestStream<KV<String, String>> stream = TestStream
        .create(KvCoder.of(StringUtf8Coder.of(), StringUtf8Coder.of()))
        .addElements(KV.of("key1", "v1"))
        .advanceWatermarkToInfinity();

    PCollection<String> result = pipeline
        .apply(stream)
        .apply(ParDo.of(new TestDoFn()));
    PAssert.that(result).containsInAnyOrder("It works");

    pipeline.run().waitUntilFinish();
  }

  private static class TestDoFn extends DoFn<KV<String, String>, String> {
    @TimerId("timer")
    private final TimerSpec timer = TimerSpecs.timer(TimeDomain.EVENT_TIME);

    @ProcessElement
    public void process(ProcessContext c, @TimerId("timer") Timer timer) {
      timer.offset(Duration.standardMinutes(10)).setRelative();
      timer.offset(Duration.standardMinutes(30)).setRelative();
    }

    @OnTimer("timer")
    public void onTimer(OnTimerContext c, @TimerId("timer") Timer timer) {
      c.output("It works");
    }
  }
}
{code}
From inspection, this seems to be caused by the logic in 
[WatermarkManager|https://github.com/apache/beam/blob/master/runners/direct-java/src/main/java/org/apache/beam/runners/direct/WatermarkManager.java#L313],
which does the following if there are multiple timers for a key:
 # Adds the first timer to the `pendingTimers`, `keyTimers`, and 
`existingTimersForKey`.
 # Removes the first timer from `keyTimers`
 # Adds the second timer to `keyTimers` and `existingTimersForKey`.

This leads to inconsistencies since pendingTimers has only the first timer, 
keyTimers only the second, and existingTimers has both. This becomes more 
problematic since one of these lists is used for *firing* (and thus releasing 
holds) and the other is used for holds.
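As a toy illustration of the bookkeeping above (a standalone model with invented field names and string timer IDs, not the actual WatermarkManager code), the three steps leave the collections disagreeing:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of the three timer collections named in the report.
public class TimerBookkeepingSketch {
  static List<String> pendingTimers = new ArrayList<>();
  static List<String> keyTimers = new ArrayList<>();
  static List<String> existingTimersForKey = new ArrayList<>();

  static void setTwoTimersInOneBundle() {
    // Step 1: the first timer goes into all three collections.
    pendingTimers.add("t1");
    keyTimers.add("t1");
    existingTimersForKey.add("t1");
    // Step 2: the first timer is removed from keyTimers only.
    keyTimers.remove("t1");
    // Step 3: the second timer is added to keyTimers and existingTimersForKey.
    keyTimers.add("t2");
    existingTimersForKey.add("t2");
  }

  public static void main(String[] args) {
    setTwoTimersInOneBundle();
    System.out.println(pendingTimers);        // only the first timer
    System.out.println(keyTimers);            // only the second timer
    System.out.println(existingTimersForKey); // both timers
  }
}
```

After the bundle, the collection used for firing and the collection used for holds no longer agree, which is consistent with the observed hang.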



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3805) Python ULR Harness submits null pipeline_options field in PrepareJobRequest

2018-03-07 Thread Axel Magnuson (JIRA)
Axel Magnuson created BEAM-3805:
---

 Summary: Python ULR Harness submits null pipeline_options field in 
PrepareJobRequest
 Key: BEAM-3805
 URL: https://issues.apache.org/jira/browse/BEAM-3805
 Project: Beam
  Issue Type: Bug
  Components: sdk-py-harness
Reporter: Axel Magnuson
Assignee: Robert Bradshaw


The beam_job_api gRPC definition indicates that pipeline_options is a required 
field. However, the Python ULR currently does not set an options struct. This 
is trivially solved on the job-server side by instantiating a default; however, 
there may be cases where the job server assumes the field exists and falls 
over with an NPE during translation.
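A minimal sketch of the suggested job-server-side defaulting, with a plain Map standing in for the pipeline_options struct (the method name and types here are invented for illustration, not the real JobService code):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical defensive defaulting on the job-server side.
public class PrepareJobDefaults {
  public static Map<String, Object> effectivePipelineOptions(Map<String, Object> submitted) {
    // Treat a missing/null pipeline_options field as an empty options struct
    // so later translation never dereferences null.
    return submitted != null ? submitted : new HashMap<>();
  }

  public static void main(String[] args) {
    System.out.println(effectivePipelineOptions(null)); // empty map
  }
}
```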





[beam] annotated tag v2.4.0-RC2 created (now 1759c08)

2018-03-07 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to annotated tag v2.4.0-RC2
in repository https://gitbox.apache.org/repos/asf/beam.git.


  at 1759c08  (tag)
 tagging bf96804a277837e8e2b9a8818820f1a9c6907cfd (commit)
 replaces v2.4.0-RC1
  by Robert Bradshaw
  on Wed Mar 7 17:04:34 2018 -0800

- Log -
[maven-release-plugin] copy for tag v2.4.0-RC2
---

No new revisions were added by this update.

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[beam] branch release-2.4.0 updated: [maven-release-plugin] rollback changes from release preparation of v2.4.0-RC2

2018-03-07 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch release-2.4.0
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/release-2.4.0 by this push:
 new c6a7f17  [maven-release-plugin] rollback changes from release 
preparation of v2.4.0-RC2
c6a7f17 is described below

commit c6a7f17a414c638a336b91475826a32124077b45
Author: Robert Bradshaw 
AuthorDate: Wed Mar 7 17:04:50 2018 -0800

[maven-release-plugin] rollback changes from release preparation of 
v2.4.0-RC2
---
 examples/java/pom.xml   | 2 +-
 examples/pom.xml| 2 +-
 model/fn-execution/pom.xml  | 2 +-
 model/job-management/pom.xml| 2 +-
 model/pipeline/pom.xml  | 2 +-
 model/pom.xml   | 2 +-
 pom.xml | 4 ++--
 runners/apex/pom.xml| 2 +-
 runners/core-construction-java/pom.xml  | 2 +-
 runners/core-java/pom.xml   | 2 +-
 runners/direct-java/pom.xml | 2 +-
 runners/flink/pom.xml   | 2 +-
 runners/gcp/gcemd/pom.xml   | 2 +-
 runners/gcp/gcsproxy/pom.xml| 2 +-
 runners/gcp/pom.xml | 2 +-
 runners/gearpump/pom.xml| 2 +-
 runners/google-cloud-dataflow-java/pom.xml  | 2 +-
 runners/java-fn-execution/pom.xml   | 2 +-
 runners/local-artifact-service-java/pom.xml | 2 +-
 runners/local-java/pom.xml  | 2 +-
 runners/pom.xml | 2 +-
 runners/reference/java/pom.xml  | 2 +-
 runners/reference/job-server/pom.xml| 2 +-
 runners/reference/pom.xml   | 2 +-
 runners/spark/pom.xml   | 2 +-
 sdks/go/pom.xml | 2 +-
 sdks/java/build-tools/pom.xml   | 2 +-
 sdks/java/container/pom.xml | 2 +-
 sdks/java/core/pom.xml  | 2 +-
 sdks/java/extensions/google-cloud-platform-core/pom.xml | 2 +-
 sdks/java/extensions/jackson/pom.xml| 2 +-
 sdks/java/extensions/join-library/pom.xml   | 2 +-
 sdks/java/extensions/pom.xml| 2 +-
 sdks/java/extensions/protobuf/pom.xml   | 2 +-
 sdks/java/extensions/sketching/pom.xml  | 2 +-
 sdks/java/extensions/sorter/pom.xml | 2 +-
 sdks/java/extensions/sql/pom.xml| 2 +-
 sdks/java/fn-execution/pom.xml  | 2 +-
 sdks/java/harness/pom.xml   | 2 +-
 sdks/java/io/amazon-web-services/pom.xml| 2 +-
 sdks/java/io/amqp/pom.xml   | 2 +-
 sdks/java/io/cassandra/pom.xml  | 2 +-
 sdks/java/io/common/pom.xml | 2 +-
 sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/pom.xml  | 2 +-
 sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/pom.xml  | 2 +-
 sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/pom.xml | 2 +-
 sdks/java/io/elasticsearch-tests/pom.xml| 2 +-
 sdks/java/io/elasticsearch/pom.xml  | 2 +-
 sdks/java/io/file-based-io-tests/pom.xml| 2 +-
 sdks/java/io/google-cloud-platform/pom.xml  | 2 +-
 sdks/java/io/hadoop-common/pom.xml  | 2 +-
 sdks/java/io/hadoop-file-system/pom.xml | 2 +-
 sdks/java/io/hadoop-input-format/pom.xml| 2 +-
 sdks/java/io/hbase/pom.xml  | 2 +-
 sdks/java/io/hcatalog/pom.xml   | 2 +-
 sdks/java/io/jdbc/pom.xml   | 2 +-
 sdks/java/io/jms/pom.xml| 2 +-
 sdks/java/io/kafka/pom.xml   

[beam] branch release-2.4.0 updated: [maven-release-plugin] prepare release v2.4.0-RC2

2018-03-07 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch release-2.4.0
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/release-2.4.0 by this push:
 new bf96804  [maven-release-plugin] prepare release v2.4.0-RC2
bf96804 is described below

commit bf96804a277837e8e2b9a8818820f1a9c6907cfd
Author: Robert Bradshaw 
AuthorDate: Wed Mar 7 16:59:02 2018 -0800

[maven-release-plugin] prepare release v2.4.0-RC2
---
 examples/java/pom.xml   | 2 +-
 examples/pom.xml| 2 +-
 model/fn-execution/pom.xml  | 2 +-
 model/job-management/pom.xml| 2 +-
 model/pipeline/pom.xml  | 2 +-
 model/pom.xml   | 2 +-
 pom.xml | 4 ++--
 runners/apex/pom.xml| 2 +-
 runners/core-construction-java/pom.xml  | 2 +-
 runners/core-java/pom.xml   | 2 +-
 runners/direct-java/pom.xml | 2 +-
 runners/flink/pom.xml   | 2 +-
 runners/gcp/gcemd/pom.xml   | 2 +-
 runners/gcp/gcsproxy/pom.xml| 2 +-
 runners/gcp/pom.xml | 2 +-
 runners/gearpump/pom.xml| 2 +-
 runners/google-cloud-dataflow-java/pom.xml  | 2 +-
 runners/java-fn-execution/pom.xml   | 2 +-
 runners/local-artifact-service-java/pom.xml | 2 +-
 runners/local-java/pom.xml  | 2 +-
 runners/pom.xml | 2 +-
 runners/reference/java/pom.xml  | 2 +-
 runners/reference/job-server/pom.xml| 2 +-
 runners/reference/pom.xml   | 2 +-
 runners/spark/pom.xml   | 2 +-
 sdks/go/pom.xml | 2 +-
 sdks/java/build-tools/pom.xml   | 2 +-
 sdks/java/container/pom.xml | 2 +-
 sdks/java/core/pom.xml  | 2 +-
 sdks/java/extensions/google-cloud-platform-core/pom.xml | 2 +-
 sdks/java/extensions/jackson/pom.xml| 2 +-
 sdks/java/extensions/join-library/pom.xml   | 2 +-
 sdks/java/extensions/pom.xml| 2 +-
 sdks/java/extensions/protobuf/pom.xml   | 2 +-
 sdks/java/extensions/sketching/pom.xml  | 2 +-
 sdks/java/extensions/sorter/pom.xml | 2 +-
 sdks/java/extensions/sql/pom.xml| 2 +-
 sdks/java/fn-execution/pom.xml  | 2 +-
 sdks/java/harness/pom.xml   | 2 +-
 sdks/java/io/amazon-web-services/pom.xml| 2 +-
 sdks/java/io/amqp/pom.xml   | 2 +-
 sdks/java/io/cassandra/pom.xml  | 2 +-
 sdks/java/io/common/pom.xml | 2 +-
 sdks/java/io/elasticsearch-tests/elasticsearch-tests-2/pom.xml  | 2 +-
 sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/pom.xml  | 2 +-
 sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/pom.xml | 2 +-
 sdks/java/io/elasticsearch-tests/pom.xml| 2 +-
 sdks/java/io/elasticsearch/pom.xml  | 2 +-
 sdks/java/io/file-based-io-tests/pom.xml| 2 +-
 sdks/java/io/google-cloud-platform/pom.xml  | 2 +-
 sdks/java/io/hadoop-common/pom.xml  | 2 +-
 sdks/java/io/hadoop-file-system/pom.xml | 2 +-
 sdks/java/io/hadoop-input-format/pom.xml| 2 +-
 sdks/java/io/hbase/pom.xml  | 2 +-
 sdks/java/io/hcatalog/pom.xml   | 2 +-
 sdks/java/io/jdbc/pom.xml   | 2 +-
 sdks/java/io/jms/pom.xml| 2 +-
 sdks/java/io/kafka/pom.xml  | 2 +-
 sdks/java/io/kinesis/pom.xml 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #110

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 620.87 KB...]
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (3s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (4s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (5s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (6s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (7s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (8s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (9s) Current status: 
PENDING

  
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (10s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (11s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (12s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (13s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (14s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (15s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (16s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (17s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (18s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (19s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (20s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (21s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (22s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (23s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (24s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (25s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (26s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (27s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (28s) Current status: 
PENDING

   
Waiting on bqjob_r3711e10e729a3783_0162030fb832_1 ... (29s) Current status: 
PENDING

   
Waiting on 

[jira] [Closed] (BEAM-3780) Add a utility to instantiate a partially unknown coder

2018-03-07 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3780?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh closed BEAM-3780.
-
   Resolution: Duplicate
Fix Version/s: Not applicable

This is handled via the {{LengthPrefixUnknownCoders#forCoder(id, components, 
true)}} call, followed by an instantiation of the returned Coder. Due to the 
idempotence of the call, it can be used to construct the equivalent 
(length-prefixed byte array coder) of an SDK-side wire coder.
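A toy model of that idempotence, with string URNs standing in for coder protos (the URN strings and helper method are invented; the real LengthPrefixUnknownCoders operates on pipeline components, not strings):

```java
// Toy model: unknown coders are replaced with a length-prefixed byte-array
// coder, and re-applying the rewrite to an already-rewritten coder is a no-op.
public class LengthPrefixSketch {
  static final String LENGTH_PREFIXED_BYTES =
      "beam:coder:length_prefix:v1<beam:coder:bytes:v1>";

  static boolean isKnown(String urn) {
    // In this model, anything in the beam:coder: namespace is "known",
    // which includes the rewrite's own output.
    return urn.startsWith("beam:coder:");
  }

  public static String forCoder(String urn) {
    return isKnown(urn) ? urn : LENGTH_PREFIXED_BYTES;
  }

  public static void main(String[] args) {
    String once = forCoder("my:custom:coder");
    System.out.println(once.equals(forCoder(once))); // idempotent
  }
}
```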

> Add a utility to instantiate a partially unknown coder
> --
>
> Key: BEAM-3780
> URL: https://issues.apache.org/jira/browse/BEAM-3780
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Major
> Fix For: Not applicable
>
>
> Coders must be understood by the SDK harness that is encoding or decoding the 
> associated elements. However, the pipeline runner is capable of constructing 
> partial coders, where an unknown coder is replaced with a ByteArrayCoder. It 
> then can decompose the portions of elements it is aware of, without having to 
> understand the custom element encodings.
>  
> This should go in CoderTranslation, as an alternative to the full-fidelity 
> rehydration of a coder.





[jira] [Closed] (BEAM-3801) Test that LengthPrefixing Unknown Coders is Idempotent

2018-03-07 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3801?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh closed BEAM-3801.
-
   Resolution: Not A Problem
Fix Version/s: Not applicable

> Test that LengthPrefixing Unknown Coders is Idempotent
> --
>
> Key: BEAM-3801
> URL: https://issues.apache.org/jira/browse/BEAM-3801
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Minor
> Fix For: Not applicable
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>






[beam] branch release-2.4.0 updated (a454b66 -> d2aeb78)

2018-03-07 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to branch release-2.4.0
in repository https://gitbox.apache.org/repos/asf/beam.git.


from a454b66  Remove dev suffix from Python version.
 add 384f5bf  Fallback to byte equality in MutationDetection
 new d2aeb78  Merge pull request #4821 from tgroh/cherry_pick_4817

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache/beam/sdk/util/MutationDetectors.java| 15 -
 .../beam/sdk/util/MutationDetectorsTest.java   | 39 ++
 2 files changed, 53 insertions(+), 1 deletion(-)



[beam] 01/01: Merge pull request #4821 from tgroh/cherry_pick_4817

2018-03-07 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch release-2.4.0
in repository https://gitbox.apache.org/repos/asf/beam.git

commit d2aeb785da1d00b90e6195635b336f5b41cdb85c
Merge: a454b66 384f5bf
Author: Robert Bradshaw 
AuthorDate: Wed Mar 7 16:03:41 2018 -0800

Merge pull request #4821 from tgroh/cherry_pick_4817

[BEAM-3799] Fallback to byte equality in MutationDetection

 .../apache/beam/sdk/util/MutationDetectors.java| 15 -
 .../beam/sdk/util/MutationDetectorsTest.java   | 39 ++
 2 files changed, 53 insertions(+), 1 deletion(-)



[jira] [Created] (BEAM-3804) Build Go SDK container with Gradle

2018-03-07 Thread Henning Rohde (JIRA)
Henning Rohde created BEAM-3804:
---

 Summary: Build Go SDK container with Gradle
 Key: BEAM-3804
 URL: https://issues.apache.org/jira/browse/BEAM-3804
 Project: Beam
  Issue Type: Sub-task
  Components: sdk-go
Reporter: Henning Rohde
Assignee: Henning Rohde








[jira] [Comment Edited] (BEAM-3784) Enhance Apache Beam interpreter for Apache Zeppelin

2018-03-07 Thread Daniel Rosato (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390360#comment-16390360
 ] 

Daniel Rosato edited comment on BEAM-3784 at 3/7/18 11:30 PM:
--

Hello, I'm a graduate student of Politecnico di Milano, currently pursuing a 
M.Sc in Computer Science. I'm very interested in this topic and I believe I 
have the skills and experience required. I wanted to ask if you could provide 
me with some materials to further read about the context and goals of this 
feature so that I can build a proper proposal and delivery timeline. Any help 
would be greatly appreciated. Thank you!.

 


was (Author: danielrm88):
Hello, I'm a graduate student of Politecnico di Milano, currently pursuing a 
M.Sc in Computer Science. I'm very interested in this topic and I believe I 
have the skills and experience required. I wanted to ask if you could provide 
me with some materials to further read about the context and goals of this 
feature so that I can build a proper proposal and delivery timeline. Any help 
would be greatly appreciate it. Thank you!.

 

> Enhance Apache Beam interpreter for Apache Zeppelin
> ---
>
> Key: BEAM-3784
> URL: https://issues.apache.org/jira/browse/BEAM-3784
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-ideas
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Minor
>  Labels: SQL, bigdata, cloud, gsoc2018, java
>
> Apache Zeppelin includes an integration with Apache Beam: 
> https://zeppelin.apache.org/docs/0.7.0/interpreter/beam.html
> How well does this work for interactive exploration? Can this be enhanced to 
> support Beam SQL? What about unbounded data? Let's find out by exploring the 
> existing interpreter and enhancing it particularly for streaming SQL.
> This project will require the ability to read, write, and run Java and SQL. 
> You will come out of it with familiarity with two Apache big data projects 
> and lots of ideas!





[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390399#comment-16390399
 ] 

Kenneth Knowles commented on BEAM-3749:
---

I added the test here: https://github.com/apache/beam/pull/4826

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> Currently BeamSql use {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[jira] [Commented] (BEAM-3784) Enhance Apache Beam interpreter for Apache Zeppelin

2018-03-07 Thread Daniel Rosato (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390360#comment-16390360
 ] 

Daniel Rosato commented on BEAM-3784:
-

Hello, I'm a graduate student of Politecnico di Milano, currently pursuing a 
M.Sc in Computer Science. I'm very interested in this topic and I believe I 
have the skills and experience required. I wanted to ask if you could provide 
me with some materials to further read about the context and goals of this 
feature so that I can build a proper proposal and delivery timeline. Any help 
would be greatly appreciate it. Thank you!.

 

> Enhance Apache Beam interpreter for Apache Zeppelin
> ---
>
> Key: BEAM-3784
> URL: https://issues.apache.org/jira/browse/BEAM-3784
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-ideas
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Minor
>  Labels: SQL, bigdata, cloud, gsoc2018, java
>
> Apache Zeppelin includes an integration with Apache Beam: 
> https://zeppelin.apache.org/docs/0.7.0/interpreter/beam.html
> How well does this work for interactive exploration? Can this be enhanced to 
> support Beam SQL? What about unbounded data? Let's find out by exploring the 
> existing interpreter and enhancing it particularly for streaming SQL.
> This project will require the ability to read, write, and run Java and SQL. 
> You will come out of it with familiarity with two Apache big data projects 
> and lots of ideas!





Build failed in Jenkins: beam_PostCommit_Python_Verify #4385

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Fallback to byte equality in MutationDetection

--
[...truncated 1.02 MB...]
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #109

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 2.55 MB...]
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
at 
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1517)
at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1505)
at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1504)
at 
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at 
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1504)
at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at scala.Option.foreach(Option.scala:257)
at 
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1732)
at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1687)
at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1676)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at 
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2029)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2050)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2069)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2094)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
at 
org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
at 
org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
at 
org.apache.beam.runners.spark.io.SparkUnboundedSource$ReadReportDStream.compute(SparkUnboundedSource.java:202)
at 
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:342)
at 
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:342)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at 
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:341)
at 
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:341)
at 
org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
at 
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:336)
at 
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:334)
at scala.Option.orElse(Option.scala:289)
at 
org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:331)
at 
org.apache.spark.streaming.dstream.DStream.generateJob(DStream.scala:432)
at 
org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:122)
at 
org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:121)
at 
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at 
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at 
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at 
scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at 
org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:121)
at 
org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:249)
at 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1069

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Fallback to byte equality in MutaitonDetection

--
[...truncated 125.10 KB...]
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 306, in test_flattened_side_input
pipeline.run()
  File 
"
 line 102, in run
result = super(TestPipeline, self).run()
  File 
"
 line 369, in run
self.to_runner_api(), self.runner, self._options).run(False)
  File 
"
 line 375, in run
if self._options.view_as(SetupOptions).save_main_session:
  File 
"
 line 227, in view_as
view = cls(self._flags)
  File 
"
 line 150, in __init__
parser = _BeamArgumentParser()
  File "/usr/lib/python2.7/argparse.py", line 1602, in __init__
help=_('show this help message and exit'))
  File "/usr/lib/python2.7/gettext.py", line 581, in gettext
return dgettext(_current_domain, message)
  File "/usr/lib/python2.7/gettext.py", line 545, in dgettext
codeset=_localecodesets.get(domain))
  File "/usr/lib/python2.7/gettext.py", line 480, in translation
mofiles = find(domain, localedir, languages, all=1)
  File "/usr/lib/python2.7/gettext.py", line 456, in find
if os.path.exists(mofile):
  File 
"
 line 18, in exists
os.stat(path)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

==
ERROR: test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File 
"
 line 812, in run
test(orig)
  File 
"
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"
 line 133, in run
self.runTest(result)
  File 
"
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 178, in test_iterable_side_input
pipeline.run()
  File 
"
 line 102, in run
result = super(TestPipeline, self).run()
  File 
"
 line 369, in run
self.to_runner_api(), self.runner, self._options).run(False)
  File 
"
 line 597, in from_runner_api
context.transforms.get_by_id(root_transform_id)]
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 

Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Flink #5187

2018-03-07 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Xu Mingmin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390324#comment-16390324
 ] 

Xu Mingmin commented on BEAM-3749:
--

Did you run a test to verify that?

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3803) [Nexmark] Not all runners support attempted metrics

2018-03-07 Thread Andrew Pilloud (JIRA)
Andrew Pilloud created BEAM-3803:


 Summary: [Nexmark] Not all runners support attempted metrics
 Key: BEAM-3803
 URL: https://issues.apache.org/jira/browse/BEAM-3803
 Project: Beam
  Issue Type: Bug
  Components: runner-dataflow
Reporter: Andrew Pilloud
Assignee: Andrew Pilloud


The Dataflow runner supports only committed metrics for batch jobs and only 
attempted metrics for streaming jobs. Nexmark should use the best available 
metric source.
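One way to read "best available" here is a committed-first lookup with a fallback. A minimal standalone sketch of that selection logic (the {{MetricResult}} interface below is a hypothetical stand-in, not Beam's actual API):

```java
// Self-contained sketch, NOT Beam's real MetricResult API: prefer the
// committed metric value, fall back to the attempted one when the
// runner does not support committed metrics (e.g. streaming jobs).
class BestAvailableMetricDemo {
    interface MetricResult<T> {
        T committed();  // throws UnsupportedOperationException if unsupported
        T attempted();
    }

    static <T> T bestAvailable(MetricResult<T> result) {
        try {
            return result.committed();
        } catch (UnsupportedOperationException e) {
            return result.attempted();  // fallback for unsupported runners
        }
    }

    public static void main(String[] args) {
        // Model a runner that only supports attempted metrics.
        MetricResult<Long> streaming = new MetricResult<Long>() {
            public Long committed() { throw new UnsupportedOperationException(); }
            public Long attempted() { return 42L; }
        };
        System.out.println(bestAvailable(streaming));  // prints 42
    }
}
```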





[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390311#comment-16390311
 ] 

Kenneth Knowles commented on BEAM-3749:
---

I think this link will clarify: 
https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/Window.java#L333

The trigger is left alone if you use {{Window.into}} when the SQL sets up the 
window function. It will not set it to the {{DefaultTrigger}}.
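A sketch of the combination described above (illustrative only, not a compile-checked example; it assumes the BeamSql API discussed in this issue): set the trigger with {{Window.configure()}} and let the SQL's TUMBLE supply the WindowFn.

```java
// Illustrative sketch only: the trigger comes from Window.configure(),
// while the WindowFn is set up by the SQL's TUMBLE clause.
input
    .apply(Window.configure()
        .triggering(/* custom trigger */)
        .accumulatingFiredPanes()
        .withAllowedLateness(Duration.ZERO))
    .apply(BeamSql.query(
        "SELECT count(*) FROM PCOLLECTION "
            + "GROUP BY TUMBLE(CURRENT_TIMESTAMP, INTERVAL '1' MINUTE)"));
```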

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[jira] [Comment Edited] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390311#comment-16390311
 ] 

Kenneth Knowles edited comment on BEAM-3749 at 3/7/18 9:58 PM:
---

I think this link will clarify: 
[https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/Window.java#L333]

The trigger is left alone if you use {{Window.into}} when the SQL sets up the 
window function. It will not set it to the {{DefaultTrigger}}.

 The code that you showed will use tumbling windows with the configured trigger.


was (Author: kenn):
I think this link will clarify: 
https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/Window.java#L333

The trigger is left alone if you use {{Window.into}} when the SQL sets up the 
window function. It will not set it to the {{DefaultTrigger}}.

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Xu Mingmin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390271#comment-16390271
 ] 

Xu Mingmin commented on BEAM-3749:
--

I get your point; the code below doesn't use the trigger set in 
{{Window.configure().triggering}}, it uses a {{DefaultTrigger}}:
{code}
input.apply(Window.configure().triggering( ... )
    .accumulatingFiredPanes()
    .withAllowedLateness(Duration.ZERO))
  .apply(BeamSql.query(
    "SELECT count(*) from PCOLLECTION GROUP BY TUMBLE(CURRENT_TIMESTAMP, INTERVAL '1' MINUTE)"));
{code}

So I suppose {{Window.configure().triggering}} should always be used together 
with {{Window.into}}.

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[jira] [Commented] (BEAM-3254) Add Intellij hints to projects containing code generation tasks

2018-03-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390240#comment-16390240
 ] 

Kenneth Knowles commented on BEAM-3254:
---

I think I added enough of these that it mostly worked out of the box. I use 
Gradle + IntelliJ to develop Beam, and I have no hesitation about wiping out my 
project and importing fresh - I have done it many times.

> Add Intellij hints to projects containing code generation tasks
> ---
>
> Key: BEAM-3254
> URL: https://issues.apache.org/jira/browse/BEAM-3254
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system
>Reporter: Luke Cwik
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.4.0
>
>
> Several projects rely on proto/avro/java annotation processor generated files, 
> and these files are currently not known to IntelliJ when the project is 
> imported.





[jira] [Resolved] (BEAM-3254) Add Intellij hints to projects containing code generation tasks

2018-03-07 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-3254.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Add Intellij hints to projects containing code generation tasks
> ---
>
> Key: BEAM-3254
> URL: https://issues.apache.org/jira/browse/BEAM-3254
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system
>Reporter: Luke Cwik
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.4.0
>
>
> Several projects rely on proto/avro/java annotation processor generated files, 
> and these files are currently not known to IntelliJ when the project is 
> imported.





[jira] [Assigned] (BEAM-3254) Add Intellij hints to projects containing code generation tasks

2018-03-07 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3254:
-

Assignee: Kenneth Knowles

> Add Intellij hints to projects containing code generation tasks
> ---
>
> Key: BEAM-3254
> URL: https://issues.apache.org/jira/browse/BEAM-3254
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system
>Reporter: Luke Cwik
>Assignee: Kenneth Knowles
>Priority: Major
>
> Several projects rely on proto/avro/java annotation processor generated files, 
> and these files are currently not known to IntelliJ when the project is 
> imported.





[beam] 01/01: Merge pull request #4817: Fallback to byte equality in MutaitonDetection

2018-03-07 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit e8db6a81e9b03b5a0bbffc9117f3128ea23ff184
Merge: 94ef840 076f22e
Author: Thomas Groh 
AuthorDate: Wed Mar 7 12:58:24 2018 -0800

Merge pull request #4817: Fallback to byte equality in MutaitonDetection

[BEAM-3799] Fallback to byte equality in MutaitonDetection

 .../apache/beam/sdk/util/MutationDetectors.java| 15 -
 .../beam/sdk/util/MutationDetectorsTest.java   | 39 ++
 2 files changed, 53 insertions(+), 1 deletion(-)
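The idea behind the fallback can be shown with a standalone sketch (illustrative only, not Beam's actual MutationDetectors code): when equals() is unreliable for a type - StringBuilder, for example, only has identity equality - compare the element's encoded bytes from before and after.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Standalone illustration of byte-equality mutation detection (NOT
// Beam's MutationDetectors implementation): snapshot the encoded bytes
// when the element is output, then compare against a fresh encoding.
class ByteEqualityFallbackDemo {
    static byte[] encode(StringBuilder value) {
        // Stand-in for encoding the element with its Coder.
        return value.toString().getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        StringBuilder value = new StringBuilder("abc");
        byte[] snapshot = encode(value);  // bytes captured at output time
        value.append("d");                // illegal mutation after output
        boolean unchanged = Arrays.equals(snapshot, encode(value));
        System.out.println(unchanged);    // prints false: mutation detected
    }
}
```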

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.


[beam] branch master updated (94ef840 -> e8db6a8)

2018-03-07 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 94ef840  [BEAM-3621] Add Spring repo to build_rules to allow 
downloading penta…
 add 076f22e  Fallback to byte equality in MutaitonDetection
 new e8db6a8  Merge pull request #4817: Fallback to byte equality in 
MutaitonDetection

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache/beam/sdk/util/MutationDetectors.java| 15 -
 .../beam/sdk/util/MutationDetectorsTest.java   | 39 ++
 2 files changed, 53 insertions(+), 1 deletion(-)



[jira] [Created] (BEAM-3802) Should be able to call dataflow queryMetrics more than once

2018-03-07 Thread Andrew Pilloud (JIRA)
Andrew Pilloud created BEAM-3802:


 Summary: Should be able to call dataflow queryMetrics more than once
 Key: BEAM-3802
 URL: https://issues.apache.org/jira/browse/BEAM-3802
 Project: Beam
  Issue Type: Bug
  Components: runner-dataflow
Reporter: Andrew Pilloud
Assignee: Thomas Groh


When you call queryMetrics on the Dataflow runner against a batch job, you 
always get the set of metrics as filtered by the first call to queryMetrics.
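The reported behavior can be illustrated with a small standalone sketch (a hypothetical stand-in, not the actual Dataflow runner client): a client that caches the first filtered result keeps returning it for later, differently-filtered queries.

```java
import java.util.*;
import java.util.stream.*;

// Standalone sketch of the reported bug pattern (NOT the Dataflow
// runner's code): caching the first filtered query result returns
// stale data for later queries, while re-filtering each call is correct.
class QueryMetricsDemo {
    static final Map<String, Long> METRICS =
        Map.of("read/elements", 10L, "write/elements", 7L);

    private Map<String, Long> cached;  // buggy: remembers first result

    Map<String, Long> queryBuggy(String prefix) {
        if (cached == null) {
            cached = filter(prefix);
        }
        return cached;  // ignores the new prefix on later calls
    }

    Map<String, Long> queryFixed(String prefix) {
        return filter(prefix);  // re-applies the filter every call
    }

    private Map<String, Long> filter(String prefix) {
        return METRICS.entrySet().stream()
            .filter(e -> e.getKey().startsWith(prefix))
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }

    public static void main(String[] args) {
        QueryMetricsDemo demo = new QueryMetricsDemo();
        System.out.println(demo.queryBuggy("read").keySet());
        System.out.println(demo.queryBuggy("write").keySet());  // stale
        System.out.println(demo.queryFixed("write").keySet());
    }
}
```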





[jira] [Assigned] (BEAM-3802) Should be able to call dataflow queryMetrics more than once

2018-03-07 Thread Andrew Pilloud (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Pilloud reassigned BEAM-3802:


Assignee: Andrew Pilloud  (was: Thomas Groh)

> Should be able to call dataflow queryMetrics more than once
> ---
>
> Key: BEAM-3802
> URL: https://issues.apache.org/jira/browse/BEAM-3802
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>
> When you call queryMetrics on the Dataflow runner against a batch job, you 
> always get the set of metrics as filtered by the first call to queryMetrics.





Build failed in Jenkins: beam_PostCommit_Python_Verify #4384

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[alan.myrvold] [BEAM-3621] Add Spring repo to build_rules to allow downloading 
pentaho

--
[...truncated 1.02 MB...]
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying 

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT #2

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 23.04 KB...]
[INFO] os.detected.release.like.debian: true
[INFO] os.detected.classifier: linux-x86_64
[INFO] 
[INFO] 
[INFO] Building Apache Beam :: SDKs :: Java :: IO :: File-based-io-tests 
2.5.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-maven-version) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Adding ignore: module-info
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) 
@ beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:testResources (default-testResources) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 4 source files to 

[INFO] 
:
 Some input files use or override a deprecated API.
[INFO] 
:
 Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-checkstyle-plugin:3.0.0:check (default) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Starting audit...
Audit done.
[INFO] 
[INFO] --- maven-surefire-plugin:2.20.1:test (default-test) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:regex-properties 
(render-artifact-id) @ beam-sdks-java-io-file-based-io-tests ---
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:jar (default-jar) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-site-plugin:3.7:attach-descriptor (attach-descriptor) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Skipping because packaging 'jar' is not pom.
[INFO] 
[INFO] --- maven-jar-plugin:3.0.2:test-jar (default-test-jar) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-shade-plugin:3.1.0:shade (bundle-and-repackage) @ 
beam-sdks-java-io-file-based-io-tests ---
[INFO] Excluding com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-core:jar:2.8.9 from the 
shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 from 
the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 from the 
shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.7.25 from the shaded jar.
[INFO] Excluding joda-time:joda-time:jar:2.4 from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-runners-google-cloud-dataflow-java:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-google-cloud-platform-core:jar:2.5.0-SNAPSHOT
 from the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:gcsio:jar:1.4.5 from the shaded 
jar.
[INFO] Excluding 

[jira] [Comment Edited] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390129#comment-16390129
 ] 

Kenneth Knowles edited comment on BEAM-3749 at 3/7/18 8:13 PM:
---

What I mean is that the windowing itself is set up in the SQL using 
TUMBLE/HOP/SESSION. That is independent of the trigger. If the CLI sets up the 
trigger using {{Window.configure()}} and the SQL then sets up the WindowFn, it 
will work.


was (Author: kenn):
What I mean is that the windowing itself is set up in the SQL via using 
TUMBLE/HOP/SESSION. That is independent of the trigger.

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390129#comment-16390129
 ] 

Kenneth Knowles commented on BEAM-3749:
---

What I mean is that the windowing itself is set up in the SQL via using 
TUMBLE/HOP/SESSION. That is independent of the trigger.

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[jira] [Commented] (BEAM-3749) support customized trigger/accumulationMode in BeamSql

2018-03-07 Thread Xu Mingmin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390124#comment-16390124
 ] 

Xu Mingmin commented on BEAM-3749:
--

It doesn't. With {{Window.configure()}}, the code is:
{code}
input
  .apply(Window.into(FixedWindows.of(Duration.standardMinutes(2))))
  .apply(Window.configure().triggering( ... )
    .accumulatingFiredPanes()
    .withAllowedLateness(Duration.ZERO))
  .apply(BeamSql.query("SELECT count(*) from PCOLLECTION"));
{code}
 

> support customized trigger/accumulationMode in BeamSql
> --
>
> Key: BEAM-3749
> URL: https://issues.apache.org/jira/browse/BEAM-3749
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Xu Mingmin
>Assignee: Xu Mingmin
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currently BeamSql uses {{DefaultTrigger}} for aggregation operations. 
> By adding two options {{withTrigger(Trigger)}} and 
> {{withAccumulationMode(AccumulationMode)}}, developers can specify their own 
> aggregation strategies with BeamSql.
> [~xumingming] [~kedin] [~kenn] for any comments.





[beam] branch revert-4805-go-sdk updated (bea6536 -> 56515fa)

2018-03-07 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch revert-4805-go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git.


omit bea6536  Revert "Merge master into go-sdk"
 add 303efb1  Bump sdks/go/container.pom.xml to 2.5.0-SNAPSHOT
 add d562f65  Merge pull request #4809: Bump sdks/go/container/pom.xml to 
2.5.0-SNAPSHOT
 add a9a1fa9  [BEAM-3791] Update version number in build_rules.gradle
 add 5c1e49c  Merge pull request #4810 from alanmyrvold/alan-version
 add 788c83c  [BEAM-3753] Fix failing integration tests
 add f28d198  [BEAM-3753] Rename *ITCase.java tests files to *Test.java
 add 8ec9c81  Merge pull request #4767: [BEAM-3753] Fix Flink Runner 
integration tests
 add c6467d8  [BEAM-3043] Set user-specified PTransform names on Flink 
operators
 add d497bfa  Merge pull request #4759: [BEAM-3043] Set user-specified 
PTransform names on Flink operators
 add 24ff5b6  Make StateInternals short state method defaulting to the 
implementation all runners use to simplify the contract the user has to 
implement
 add 3ca6e8c  Merge pull request #4813: [BEAM-3794] Make StateInternals 
short state method defaulting to the implementation all runners use to simplify 
the contract the user has to implement
 new 56515fa  Revert "Merge master into go-sdk"

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (bea6536)
\
 N -- N -- N   refs/heads/revert-4805-go-sdk (56515fa)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 build_rules.gradle |  2 +-
 .../apex/translation/utils/ApexStateInternals.java |  6 ---
 .../beam/runners/core/InMemoryStateInternals.java  |  6 ---
 .../apache/beam/runners/core/StateInternals.java   |  5 ++-
 .../direct/CopyOnAccessInMemoryStateInternals.java |  5 ---
 runners/flink/build.gradle |  1 +
 runners/flink/pom.xml  |  6 +++
 .../flink/FlinkBatchTransformTranslators.java  | 40 +++-
 .../flink/FlinkStreamingTransformTranslators.java  | 43 --
 .../state/FlinkBroadcastStateInternals.java|  9 -
 .../state/FlinkKeyGroupStateInternals.java |  9 -
 .../streaming/state/FlinkSplitStateInternals.java  |  9 -
 .../streaming/state/FlinkStateInternals.java   |  9 -
 ...ingITCase.java => ReadSourceStreamingTest.java} | 14 +--
 .../{ReadSourceITCase.java => ReadSourceTest.java} |  4 +-
 ...nsITCase.java => TopWikipediaSessionsTest.java} | 15 ++--
 .../spark/stateful/SparkStateInternals.java|  6 ---
 17 files changed, 81 insertions(+), 108 deletions(-)
 rename 
runners/flink/src/test/java/org/apache/beam/runners/flink/{ReadSourceStreamingITCase.java
 => ReadSourceStreamingTest.java} (81%)
 rename 
runners/flink/src/test/java/org/apache/beam/runners/flink/{ReadSourceITCase.java
 => ReadSourceTest.java} (96%)
 rename 
runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/{TopWikipediaSessionsITCase.java
 => TopWikipediaSessionsTest.java} (93%)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1068

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[alan.myrvold] [BEAM-3621] Add Spring repo to build_rules to allow downloading 
pentaho

--
[...truncated 142.04 KB...]
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 842, in from_runner_api
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 842, in from_runner_api
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 847, in from_runner_api
for tag, id in proto.outputs.items()}
  File 
"
 line 847, in 
for tag, id in proto.outputs.items()}
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 152, in from_runner_api
element_type=pickler.loads(proto.coder_id),
  File 
"
 line 221, in loads
return dill.loads(s)
  File 
"
 line 277, in loads
return load(file)
  File 
"
 line 266, in load
obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 857, in load
key = read(1)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

==
ERROR: test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File 
"
 line 812, in run
test(orig)
  File 
"
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"
 line 133, in run
self.runTest(result)
  File 
"
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 178, in test_iterable_side_input
pipeline.run()
  File 
"
 line 102, in run
result = super(TestPipeline, self).run()
  File 

[jira] [Resolved] (BEAM-3621) HCatalog failing frequently on fetch of org.pentaho:pentaho-aggdesigner-algorithm:5.1.5-jhyde

2018-03-07 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik resolved BEAM-3621.
-
   Resolution: Fixed
Fix Version/s: Not applicable

> HCatalog failing frequently on fetch of 
> org.pentaho:pentaho-aggdesigner-algorithm:5.1.5-jhyde
> -
>
> Key: BEAM-3621
> URL: https://issues.apache.org/jira/browse/BEAM-3621
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-hcatalog
>Reporter: Kenneth Knowles
>Assignee: Alan Myrvold
>Priority: Critical
>  Labels: flake
> Fix For: Not applicable
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> The download seems to fail. There are three aspects to this:
>  - The Gradle build should have a local persistent cache so it isn't 
> re-fetching. But as I understand it, Gradle has a policy of still pinging the 
> repo even when cached.
>  - The Gradle build is probably re-fetching due to {{--rerun-tasks}} so we 
> need to stop doing that.
>  - The artifact is not in "Central" repository but in "Spring Plugins". That 
> is probably why all the failures are on this artifact and no other artifacts 
> cause failures.





[jira] [Assigned] (BEAM-3621) HCatalog failing frequently on fetch of org.pentaho:pentaho-aggdesigner-algorithm:5.1.5-jhyde

2018-03-07 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik reassigned BEAM-3621:
---

Assignee: Alan Myrvold

> HCatalog failing frequently on fetch of 
> org.pentaho:pentaho-aggdesigner-algorithm:5.1.5-jhyde
> -
>
> Key: BEAM-3621
> URL: https://issues.apache.org/jira/browse/BEAM-3621
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-hcatalog
>Reporter: Kenneth Knowles
>Assignee: Alan Myrvold
>Priority: Critical
>  Labels: flake
> Fix For: Not applicable
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> The download seems to fail. There are three aspects to this:
>  - The Gradle build should have a local persistent cache so it isn't 
> re-fetching. But as I understand it, Gradle has a policy of still pinging the 
> repo even when cached.
>  - The Gradle build is probably re-fetching due to {{--rerun-tasks}} so we 
> need to stop doing that.
>  - The artifact is not in "Central" repository but in "Spring Plugins". That 
> is probably why all the failures are on this artifact and no other artifacts 
> cause failures.





[jira] [Commented] (BEAM-3798) Performance tests flaky due to Dataflow transient errors

2018-03-07 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-3798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390090#comment-16390090
 ] 

Łukasz Gajowy commented on BEAM-3798:
-

I only have access to the Jenkins logs, and I'm not sure the Dataflow job ID 
is visible there. I reproduced the issue many times on our own Dataflow 
project. 

Example Jenkins logs of job that seems to have this issue: 
[https://builds.apache.org/view/A-D/view/Beam/job/beam_PerformanceTests_JDBC/291/console]
 

> Performance tests flaky due to Dataflow transient errors
> 
>
> Key: BEAM-3798
> URL: https://issues.apache.org/jira/browse/BEAM-3798
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Łukasz Gajowy
>Assignee: Thomas Groh
>Priority: Major
>
> Performance tests are flaky due to transient errors that happen during data 
> processing (e.g., a SocketTimeoutException while connecting to a DB). Currently, 
> exceptions that happen on the Dataflow runner but are retried successfully fail 
> the test regardless of the final job state (giving a false-negative result). 
> Possible solution for batch scenarios:
> We could "rethrow" exceptions caused by transient errors *only* if 
> the job status is other than DONE.
> Possible solution for streaming scenarios:
> (don't know yet)
> [Link to discussion on dev list 
> |https://lists.apache.org/thread.html/e480f8181913dc81d2d4cd1430557a646537473ccf29fe6390229098@%3Cdev.beam.apache.org%3E]
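The batch-mode proposal above can be sketched roughly as follows. This is a minimal illustration, not Beam code; `JobState` and `should_fail_test` are hypothetical names introduced for the sketch:

```python
from enum import Enum


class JobState(Enum):
    """Simplified stand-in for a Dataflow job's terminal state."""
    DONE = "DONE"
    FAILED = "FAILED"
    CANCELLED = "CANCELLED"


def should_fail_test(final_state, worker_errors):
    # If the job ultimately reached DONE, any worker-side exceptions
    # were transient and successfully retried: swallow them.
    if final_state is JobState.DONE:
        return False
    # Otherwise rethrow: either the job failed outright, or the
    # collected errors explain why it never completed.
    return True


# A retried SocketTimeoutException on a job that still completed: pass.
assert not should_fail_test(JobState.DONE, ["SocketTimeoutException"])
# The same exception on a job that ended FAILED: fail the test.
assert should_fail_test(JobState.FAILED, ["SocketTimeoutException"])
```

As the issue notes, this only covers batch jobs, which have a well-defined terminal state; streaming jobs need a different signal.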





Build failed in Jenkins: beam_PostRelease_NightlySnapshot #108

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] BEAM-3339 Mobile gaming automation for Java nightly snapshot

--
[...truncated 2.41 MB...]
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient.modifyAckDeadline(PubsubJsonClient.java:233)
at 
org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubReader.nackBatch(PubsubUnboundedSource.java:661)
at 
org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubCheckpoint.nackAll(PubsubUnboundedSource.java:355)
at 
org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource.createReader(PubsubUnboundedSource.java:1157)
at 
org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource$PubsubSource.createReader(PubsubUnboundedSource.java:1096)
at 
org.apache.beam.runners.spark.io.MicrobatchSource$ReaderLoader.call(MicrobatchSource.java:312)
at 
org.apache.beam.runners.spark.io.MicrobatchSource$ReaderLoader.call(MicrobatchSource.java:299)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4904)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at 
org.apache.beam.runners.spark.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at 
org.apache.beam.runners.spark.io.MicrobatchSource.getOrCreateReader(MicrobatchSource.java:131)
at 
org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:154)
at 
org.apache.beam.runners.spark.stateful.StateSpecFunctions$1.apply(StateSpecFunctions.java:105)
at 
org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:181)
at 
org.apache.spark.streaming.StateSpec$$anonfun$1.apply(StateSpec.scala:180)
at 
org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:57)
at 
org.apache.spark.streaming.rdd.MapWithStateRDDRecord$$anonfun$updateRecordWithData$1.apply(MapWithStateRDD.scala:55)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at 
org.apache.spark.streaming.rdd.MapWithStateRDDRecord$.updateRecordWithData(MapWithStateRDD.scala:55)
at 
org.apache.spark.streaming.rdd.MapWithStateRDD.compute(MapWithStateRDD.scala:159)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at 
org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038)
at 
org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
at 
org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
at 
org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Mar 07, 2018 7:17:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: 

[beam] branch master updated (3ca6e8c -> 94ef840)

2018-03-07 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 3ca6e8c  Merge pull request #4813: [BEAM-3794] Make StateInternals 
short state method defaulting to the implementation all runners use to simplify 
the contract the user has to implement
 add 9f28897  [BEAM-3621] Add Spring repo to build_rules to allow 
downloading pentaho dependency.
 new 94ef840  [BEAM-3621] Add Spring repo to build_rules to allow 
downloading penta…

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 build_rules.gradle | 3 +++
 1 file changed, 3 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] 01/01: [BEAM-3621] Add Spring repo to build_rules to allow downloading penta…

2018-03-07 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 94ef840c47d05487820e4aff8ba95095d49e7d79
Merge: 3ca6e8c 9f28897
Author: Lukasz Cwik 
AuthorDate: Wed Mar 7 11:16:03 2018 -0800

[BEAM-3621] Add Spring repo to build_rules to allow downloading penta…

 build_rules.gradle | 3 +++
 1 file changed, 3 insertions(+)




[jira] [Created] (BEAM-3801) Test that LengthPrefixing Unknown Coders is Idempotent

2018-03-07 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3801:
-

 Summary: Test that LengthPrefixing Unknown Coders is Idempotent
 Key: BEAM-3801
 URL: https://issues.apache.org/jira/browse/BEAM-3801
 Project: Beam
  Issue Type: Bug
  Components: runner-core
Reporter: Thomas Groh
Assignee: Thomas Groh








Build failed in Jenkins: beam_PostRelease_NightlySnapshot #107

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 563.29 KB...]
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 4 non-empty blocks out of 4 blocks
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 3 non-empty blocks out of 4 blocks
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8). 13584 bytes result sent to driver
Mar 07, 2018 6:57:38 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8) in 65 ms on localhost (executor 
driver) (1/4)
Mar 07, 2018 6:57:39 PM 
org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer f595b23b-86d4-4ad6-9566-cce6be4bdb24 for window 
[2015-11-17T00:00:00.000Z..2015-11-17T01:00:00.000Z) pane 
PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} 
destination null
Mar 07, 2018 6:57:39 PM 
org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer c1817075-5332-420e-b05c-2fd817609451 for window 
[2015-11-17T00:00:00.000Z..2015-11-17T01:00:00.000Z) pane 
PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} 
destination null
Mar 07, 2018 6:57:39 PM 
org.apache.beam.sdk.io.WriteFiles$WriteShardsIntoTempFilesFn processElement
INFO: Opening writer f79821a2-79f9-4748-8846-1a3abf425750 for window 
[2015-11-17T00:00:00.000Z..2015-11-17T01:00:00.000Z) pane 
PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, onTimeIndex=0} 
destination null
Mar 07, 2018 6:57:39 PM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file 
/tmp/groovy-generated-8987517876280218121-tmpdir/word-count-beam/.temp-beam-2018-03-07_18-57-28-0/c1817075-5332-420e-b05c-2fd817609451
Mar 07, 2018 6:57:39 PM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file 
/tmp/groovy-generated-8987517876280218121-tmpdir/word-count-beam/.temp-beam-2018-03-07_18-57-28-0/f79821a2-79f9-4748-8846-1a3abf425750
Mar 07, 2018 6:57:39 PM org.apache.beam.sdk.io.FileBasedSink$Writer close
INFO: Successfully wrote temporary file 
/tmp/groovy-generated-8987517876280218121-tmpdir/word-count-beam/.temp-beam-2018-03-07_18-57-28-0/f595b23b-86d4-4ad6-9566-cce6be4bdb24
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 2.0 (TID 9). 16368 bytes result sent to driver
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 2.0 (TID 11). 16411 bytes result sent to driver
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 2.0 (TID 9) in 275 ms on localhost (executor 
driver) (2/4)
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 2.0 (TID 11) in 276 ms on localhost (executor 
driver) (3/4)
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10). 16368 bytes result sent to driver
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10) in 280 ms on localhost (executor 
driver) (4/4)
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 2.0, whose tasks have all completed, from pool 
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 2 (repartition at GroupCombineFunctions.java:242) 
finished in 0.281 s
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ShuffleMapStage 3, ResultStage 4)
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ShuffleMapStage 3 (MapPartitionsRDD[70] at repartition at 
GroupCombineFunctions.java:242), which has no missing parents
Mar 07, 2018 6:57:39 PM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3 

[jira] [Commented] (BEAM-3798) Performance tests flaky due to Dataflow transient errors

2018-03-07 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16390022#comment-16390022
 ] 

Chamikara Jayalath commented on BEAM-3798:
--

Do you have the Dataflow job ID of a job that passes despite transient errors? 
(I couldn't find this in the Jenkins logs.)

> Performance tests flaky due to Dataflow transient errors
> 
>
> Key: BEAM-3798
> URL: https://issues.apache.org/jira/browse/BEAM-3798
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Łukasz Gajowy
>Assignee: Thomas Groh
>Priority: Major
>
> Performance tests are flaky due to transient errors that happen during data 
> processing (e.g., a SocketTimeoutException while connecting to a DB). Currently, 
> exceptions that happen on the Dataflow runner but are retried successfully fail 
> the test regardless of the final job state (giving a false-negative result). 
> Possible solution for batch scenarios:
> We could "rethrow" exceptions caused by transient errors *only* if 
> the job status is other than DONE.
> Possible solution for streaming scenarios:
> (don't know yet)
> [Link to discussion on dev list 
> |https://lists.apache.org/thread.html/e480f8181913dc81d2d4cd1430557a646537473ccf29fe6390229098@%3Cdev.beam.apache.org%3E]





[beam-site] 01/02: Update site to mention the 2 Python lint envs

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit c06a5a1d9f181c054f0b1af0ee36d8fd4bd20915
Author: Holden Karau 
AuthorDate: Thu Mar 1 18:37:10 2018 -0800

Update site to mention the 2 Python lint envs
---
 content/contribute/contribution-guide/index.html | 2 +-
 src/contribute/contribution-guide.md | 3 ++-
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/content/contribute/contribution-guide/index.html 
b/content/contribute/contribution-guide/index.html
index ec6addf..fa8ebe8 100644
--- a/content/contribute/contribution-guide/index.html
+++ b/content/contribute/contribution-guide/index.html
@@ -435,7 +435,7 @@ environment before testing your code.
 command.
 
 $ pip install tox
-$ tox -e lint
+$ tox -e lint_py2,lint_py3
 
 
 
diff --git a/src/contribute/contribution-guide.md 
b/src/contribute/contribution-guide.md
index 5a7f0b9..fcf7502 100644
--- a/src/contribute/contribution-guide.md
+++ b/src/contribute/contribution-guide.md
@@ -248,7 +248,8 @@ To Check for lint errors locally, install "tox" package and 
run following
 command.
 
 $ pip install tox
-$ tox -e lint
+$ tox -e lint_py2,lint_py3
+
 
 Beam supports running Python SDK tests using Maven. For this, navigate to root
 directory of your Apache Beam clone and execute following command. Currently

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] branch mergebot updated (fb71d85 -> 941d495)

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard fb71d85  This closes #396
 discard 2c9ff00  Update site to mention the 2 Python lint envs
 new c06a5a1  Update site to mention the 2 Python lint envs
 new 941d495  This closes #396

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (fb71d85)
            \
             N -- N -- N   refs/heads/mergebot (941d495)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:



[beam-site] 02/02: This closes #396

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 941d495b26eeda9a461fe54665254cb52c725ffc
Merge: d4609c5 c06a5a1
Author: Mergebot 
AuthorDate: Wed Mar 7 10:39:27 2018 -0800

This closes #396

 content/contribute/contribution-guide/index.html | 2 +-
 src/contribute/contribution-guide.md | 3 ++-
 2 files changed, 3 insertions(+), 2 deletions(-)



Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #4365

2018-03-07 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostRelease_NightlySnapshot #106

2018-03-07 Thread Apache Jenkins Server
See 


--
GitHub pull request #4788 of commit 16fd31864f2b7d6d28d74cde4c43420f81d112cc, 
no merge conflicts.
Setting status of 16fd31864f2b7d6d28d74cde4c43420f81d112cc to PENDING with url 
https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/106/ and 
message: 'Build started sha1 is merged.'
Using context: Jenkins: ./gradlew :release:runQuickstartsJava
[EnvInject] - Loading node environment variables.
Building remotely on beam3 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/4788/*:refs/remotes/origin/pr/4788/*
 > git rev-parse refs/remotes/origin/pr/4788/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/4788/merge^{commit} # timeout=10
Checking out Revision 48278d6a893581f633cb4bdde93353ab7eb4bb74 
(refs/remotes/origin/pr/4788/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 48278d6a893581f633cb4bdde93353ab7eb4bb74
Commit message: "Merge 16fd31864f2b7d6d28d74cde4c43420f81d112cc into 
3ca6e8c517fc0bd2b0fc7202b680a9f85bd7f597"
 > git rev-list --no-walk 334472674e50d3f7a9aa4ca5db311cea87ccdb23 # timeout=10
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
 
-Pver= -Prepourl= :release:runQuickstartsJava
Parallel execution with configuration on demand is an incubating feature.
Applying build_rules.gradle to src
Applying build_rules.gradle to apex
applyJavaNature with [artifactId:beam-runners-apex] for project apex
Applying build_rules.gradle to fn-execution
applyJavaNature with [enableFindbugs:false] for project fn-execution
applyGrpcNature with default configuration for project fn-execution
Applying build_rules.gradle to core-java
applyJavaNature with [artifactId:runners-core-java] for project core-java
Generating :runnullJavanull

FAILURE: Build failed with an exception.

* What went wrong:
Could not determine the dependencies of task ':release:runQuickstartsJava'.
> Task with path ':runners:apex:runMobileGamingJavaSpark' not found in project 
> ':release'.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 14s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_Spark #1440

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[grzegorz.kolakowski] [BEAM-3043] Set user-specified PTransform names on Flink 
operators

[grzegorz.kolakowski] [BEAM-3753] Fix failing integration tests

[grzegorz.kolakowski] [BEAM-3753] Rename *ITCase.java tests files to *Test.java

[rmannibucau] Make StateInternals short state method defaulting to the 
implementation

--
[...truncated 95.62 KB...]
'apache-beam-testing:bqjob_r79b3a5c9dd7319d0_016201b08299_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-07 18:20:17,868 e938099d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-07 18:20:43,201 e938099d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-07 18:20:45,659 e938099d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.45s,  CPU:0.27s,  MaxMemory:25484kb 
STDOUT: Upload complete.
Waiting on bqjob_rb4eed11defee83a_016201b1aa5a_1 ... (0s) Current status: 
RUNNING 
Waiting on bqjob_rb4eed11defee83a_016201b1aa5a_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_rb4eed11defee83a_016201b1aa5a_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-07 18:20:45,660 e938099d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.
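The repeated `bq load` failures above stem from schema autodetection: a raw epoch number in the NEWLINE_DELIMITED_JSON input is inferred as FLOAT, which conflicts with the table's existing TIMESTAMP column. A hedged illustration of the difference; the row layout here is hypothetical, not the actual pkb_results record:

```python
import datetime
import json

# A raw epoch float: bq's --autodetect infers FLOAT for this field.
row_epoch = {"metric": "load_time", "value": 2.45,
             "timestamp": 1520446817.868}

# An RFC-3339 string for the same instant is inferred as TIMESTAMP,
# matching the table schema and avoiding the "changed type" error.
ts = datetime.datetime.fromtimestamp(
    row_epoch["timestamp"], tz=datetime.timezone.utc)
row_iso = dict(row_epoch, timestamp=ts.isoformat())

print(json.dumps(row_iso))  # timestamp rendered as 2018-03-07T18:20:17...
```

Alternatively, supplying an explicit schema instead of `--autodetect` would sidestep the inference entirely.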

2018-03-07 18:21:02,133 e938099d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-07 18:21:04,499 e938099d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.35s,  CPU:0.31s,  MaxMemory:25456kb 
STDOUT: Upload complete.
Waiting on bqjob_r1d5353cf2db82900_016201b1f4c0_1 ... (0s) Current status: 
RUNNING 
 Waiting on bqjob_r1d5353cf2db82900_016201b1f4c0_1 ... (0s) 
Current status: DONE   
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r1d5353cf2db82900_016201b1f4c0_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: 
/usr/lib/google-cloud-sdk/platform/bq/third_party/oauth2client/contrib/gce.py:73:
 UserWarning: You have requested explicit scopes to be used with a GCE service 
account.
Using this argument will have no effect on the actual scopes for tokens
requested. These scopes are set at VM instance creation time and
can't be overridden in the request.

  warnings.warn(_SCOPES_WARNING)

2018-03-07 18:21:04,500 e938099d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-07 18:21:25,506 e938099d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-03-07 18:21:28,008 e938099d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1,  WallTime:0:02.49s,  CPU:0.25s,  MaxMemory:25452kb 
STDOUT: Upload complete.
Waiting on 
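The repeated load failures above all report the same root cause: with `--autodetect`, BigQuery re-infers column types on every load, so a numeric epoch value can flip between TIMESTAMP and FLOAT across runs. A hedged sketch of the usual fix is to pin an explicit schema instead; the field names below are assumptions for illustration and would need to match the real `pkb_results` layout.

```shell
# Pin column types with an explicit schema file instead of --autodetect.
# Field names here are hypothetical; adjust to the actual pkb_results table.
cat > pkb_schema.json <<'EOF'
[
  {"name": "timestamp", "type": "TIMESTAMP", "mode": "NULLABLE"},
  {"name": "value",     "type": "FLOAT",     "mode": "NULLABLE"}
]
EOF

# The load would then pass the schema file rather than --autodetect
# (left as a comment so this sketch stays runnable without bq installed):
# bq load --source_format=NEWLINE_DELIMITED_JSON \
#   --schema=pkb_schema.json beam_performance.pkb_results results.json
cat pkb_schema.json
```

With a fixed schema, a value that autodetect would sometimes read as FLOAT is consistently coerced to the declared TIMESTAMP type, so successive loads no longer trigger the "Invalid schema update" error.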

[jira] [Commented] (BEAM-1755) Python-SDK: Move build specific scripts to a dedicated folder


2018-03-07 Thread Ismael Solis Moreno (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1755?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16389924#comment-16389924
 ] 

Ismael Solis Moreno commented on BEAM-1755:
---

Hello,

I am just starting out and would like to contribute to this item. I would like 
to clearly understand the requirement.

Thanks.

> Python-SDK: Move build specific scripts to a dedicated folder
> -
>
> Key: BEAM-1755
> URL: https://issues.apache.org/jira/browse/BEAM-1755
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Tibor Kiss
>Priority: Minor
>  Labels: newbie, starter
>
> There are numerous build-related files (run_*.sh, generate_pydoc.sh, and most 
> recently findSupportedPython.groovy) located in the Python SDK's root.
> We should create a dedicated {{build_utils}} directory and relocate the 
> scripts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3799) Nexmark Query 10 breaks with direct runner

2018-03-07 Thread Robert Bradshaw (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16389910#comment-16389910
 ] 

Robert Bradshaw commented on BEAM-3799:
---

The direct runner does extra checking. It looks like the benchmark is 
incorrectly written.
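The "extra checking" referred to here is the direct runner's immutability enforcement: it snapshots the encoded bytes of each output element and re-encodes the element later; if a DoFn mutates the value after emitting it, the byte arrays differ and an IllegalMutationException is raised. Below is a toy, self-contained sketch of that idea only — it is not Beam's actual MutationDetectors code, and `OutputFile` here is a hypothetical stand-in for the Query10.OutputFile value in the stack trace.

```java
import java.util.Arrays;

public class MutationCheckSketch {
    // Hypothetical stand-in for a value emitted by a DoFn.
    static class OutputFile {
        String filename;
        OutputFile(String filename) { this.filename = filename; }
        // Toy "coder": a real Beam coder serializes the whole element.
        byte[] encode() { return filename.getBytes(); }
    }

    public static void main(String[] args) {
        OutputFile value = new OutputFile("shard-00003-of-00025");
        // Snapshot taken at the moment the element is output.
        byte[] snapshot = value.encode();

        // Illegal pattern: mutating the element after it was output.
        value.filename = "mutated-after-output";

        // Re-encoding no longer matches the snapshot, so the runner
        // would raise IllegalMutationException here.
        if (!Arrays.equals(snapshot, value.encode())) {
            System.out.println("mutation detected");
        }
    }
}
```

The fix on the pipeline side is to emit a defensive copy (e.g., construct a new OutputFile) and mutate only the copy, leaving already-output elements untouched.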

> Nexmark Query 10 breaks with direct runner
> --
>
> Key: BEAM-3799
> URL: https://issues.apache.org/jira/browse/BEAM-3799
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0, 2.5.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Fix For: 2.4.0
>
>
> While running query 10 with the direct runner like this:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pdirect-runner -Dexec.args="--runner=DirectRunner --query=10 
> --streaming=false --manageResources=false --monitorJobs=true 
> --enforceEncodability=true --enforceImmutability=true" -pl 'sdks/java/nexmark'
> {quote}
> I found that it breaks with the direct runner with the following exception (it 
> works fine with the other runners):
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: org.apache.beam.sdk.util.IllegalMutationException: PTransform 
> Query10/Query10.UploadEvents/ParMultiDo(Anonymous) mutated value KV{null, 
> 2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
> } after it was output (new value was KV{null, 2015-07-15T00:00:09.999Z 
> shard-3-of-00025 0 ON_TIME null
> }). Values must not be mutated in any way after being output.
>     at 
> org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.commit
>  (ImmutabilityCheckingBundleFactory.java:134)
>     at org.apache.beam.runners.direct.EvaluationContext.commitBundles 
> (EvaluationContext.java:212)
>     at org.apache.beam.runners.direct.EvaluationContext.handleResult 
> (EvaluationContext.java:152)
>     at 
> org.apache.beam.runners.direct.QuiescenceDriver$TimerIterableCompletionCallback.handleResult
>  (QuiescenceDriver.java:258)
>     at org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle 
> (DirectTransformExecutor.java:190)
>     at org.apache.beam.runners.direct.DirectTransformExecutor.run 
> (DirectTransformExecutor.java:127)
>     at java.util.concurrent.Executors$RunnableAdapter.call 
> (Executors.java:511)
>     at java.util.concurrent.FutureTask.run (FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker 
> (ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run 
> (ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: org.apache.beam.sdk.util.IllegalMutationException: Value KV{null, 
> 2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
> } mutated illegally, new value was KV{null, 2015-07-15T00:00:09.999Z 
> shard-3-of-00025 0 ON_TIME null
> }. Encoding was 
> rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ,
>  now 
> rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ.
>     at 
> org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.illegalMutation
>  (MutationDetectors.java:144)
>     at 
> org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.verifyUnmodifiedThrowingCheckedExceptions
>  

Build failed in Jenkins: beam_PerformanceTests_Python #996

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[grzegorz.kolakowski] [BEAM-3043] Set user-specified PTransform names on Flink 
operators

[grzegorz.kolakowski] [BEAM-3753] Fix failing integration tests

[grzegorz.kolakowski] [BEAM-3753] Rename *ITCase.java tests files to *Test.java

[rmannibucau] Make StateInternals short state method defaulting to the 
implementation

--
[...truncated 726 B...]
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3ca6e8c517fc0bd2b0fc7202b680a9f85bd7f597 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3ca6e8c517fc0bd2b0fc7202b680a9f85bd7f597
Commit message: "Merge pull request #4813: [BEAM-3794] Make StateInternals 
short state method defaulting to the implementation all runners use to simplify 
the contract the user has to implement"
 > git rev-list --no-walk 5c1e49ca441af36cda0ea98946c681dbc39c8bb0 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2155165313605885435.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5087978867120724017.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4697451845259205775.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5787646217715421413.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/ad/dc/fcced9ec3f2561c0cbe8eb6527eef7cf4f4919a2b3a07891a36e846635af/setuptools-38.5.2-py2.py3-none-any.whl#md5=abd3307cdce6fb543b5a4d0e3e98bdb6
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8071939376302462532.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3023591087110408483.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy==1.13.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: 

[beam] branch release-2.4.0 updated: Remove dev suffix from Python version.

2018-03-07 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch release-2.4.0
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/release-2.4.0 by this push:
 new a454b66  Remove dev suffix from Python version.
a454b66 is described below

commit a454b6664a01a58f9bde4d33dd101ee57cc7d47d
Author: Robert Bradshaw 
AuthorDate: Wed Mar 7 10:07:59 2018 -0800

Remove dev suffix from Python version.
---
 sdks/python/apache_beam/version.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/sdks/python/apache_beam/version.py 
b/sdks/python/apache_beam/version.py
index a089c86..aba722d 100644
--- a/sdks/python/apache_beam/version.py
+++ b/sdks/python/apache_beam/version.py
@@ -18,4 +18,4 @@
 """Apache Beam SDK version information and utilities."""
 
 
-__version__ = '2.4.0.dev'
+__version__ = '2.4.0'

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


Build failed in Jenkins: beam_PerformanceTests_TextIOIT #241

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[grzegorz.kolakowski] [BEAM-3043] Set user-specified PTransform names on Flink 
operators

[grzegorz.kolakowski] [BEAM-3753] Fix failing integration tests

[grzegorz.kolakowski] [BEAM-3753] Rename *ITCase.java tests files to *Test.java

[rmannibucau] Make StateInternals short state method defaulting to the 
implementation

--
[...truncated 25.13 KB...]
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.3 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #8

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[herohde] Initial sketches of a Go SDK

[herohde] Initial version of the direct style w/ direct runner. Incomplete.

[herohde] Add Data as UserFn context w/ immediate value.

[herohde] Added no-I/O wordcount for profiling.

[herohde] Fleshed out possible approach to generic transformations.

[herohde] Add “dag” example that use multiplexing and side input.

[herohde] Added a more complex DAG example.

[herohde] Add yatzy example with more complex construction-time setup

[herohde] Add proto for Fn API

[herohde] Add beam.Composite helper for the most common pattern to align with 
java

[herohde] Move pipeline-construction time errors into an accumulator

[herohde] Add Dataflow job and Fn API clients. Incomplete.

[herohde] Add binary cross-compile and upload to Dataflow runner. Incomplete.

[herohde] Add tentative runner indirection (default: local).

[herohde] Made data flow runner detect user main for cross-compilation.

[herohde] Remove error accumulation in favor of panic.

[herohde] Improve Dataflow translation of coders, side input and composite 
names.

[herohde] Fix name for AsView.

[herohde] Use 2 grpc endpoints in harness

[herohde] Add gRPC harness logging

[herohde] Flesh out harness and serialization further.

[herohde] Made the dataflow runner wait for job termination by default

[herohde] beam:

[herohde] beam:

[herohde] combinefn.go: fix compilation issues

[herohde] Improve dataflow serialization and execution. Incomplete.

[herohde] Sleep 30 sec in wordcap to allow logs to propagate to Cloud Logging.

[herohde] Move the 30s sleep for logging to the harness instead of in WordCap.

[herohde] Post-review updates.

[herohde] Doc updates.

[herohde] Flesh out coders. Incomplete.

[herohde] Added prototype implementation of more coders and the runner source.

[herohde] dofn: illustrates how dofns are written.

[herohde] beam: add viewfn and windowfn to side inputs match support Beam 1.0

[herohde] dofn: timers

[herohde] Complete revamp: coders, graph and execution use element-wise

[herohde] Fix coder encoding for Dataflow side input. Otherwise, the job is

[herohde] Added more godoc comments to graph types.

[herohde] Added more comments plus made local GBK use coder equality.

[herohde] Added Flatten support and “forest” example that uses it.

[herohde] Move bigqueryio to defunct

[herohde] Make forest example print less

[herohde] Add external struct registry and serialization.

[herohde] Updated comments in node.go.

[herohde] Replace real type with 'full type' since that's the current term.

[herohde] Refactor Fn API dependency.

[herohde] Added more comments to the runner/dataflow and runner/beamexec 
packages

[herohde] Fix most go vet issues

[herohde] Make core operations panic to cut down on the error propagation

[herohde] Add more comments to the graph package.

[herohde] Add DoFn wrapper to handle either function or (ptr to) struct

[herohde] Fix remaining go vet warnings.

[herohde] Code review for beam/graph/coder package.

[herohde] Code review of the runtime/graphx package.

[herohde] Remove Data options in favor of using a Fn struct

[herohde] Code review of the beam/graph/userfn package.

[herohde] Code review for beam/graph package.

[herohde] godoc for runtime/graphx

[herohde] Add support for []T and Combine functions

[herohde] Add adapted documentation from the Java SDK to the beam package

[herohde] Update snapshot of Fn API.

[herohde] Add experiments flag to the Dataflow runner

[herohde] Remove context arg from beamexec.Init

[herohde] Migration to Runner API.

[herohde] Add support for creating DOT graphs.

[herohde] Make pretty printing of types and coders more concise

[herohde] Add flexible Signature to aid type checking

[herohde] Adding unit testability to harness translation.

[herohde] Fix crash due to initialization order

[herohde] Add CreateValues and Impulse

[herohde] Add Runner API support for WindowingStrategy.

[herohde] Run goimports on baseline.

[herohde] Fix encoding of global window strategy.

[herohde] Ensure the windowed value is atomically encoded.

[herohde] Limit gRPC messages to max size.

[herohde] Developer conveniences for running jobs.

[herohde] Fix sends to not close the network channel.

[herohde] Add re-iterable side input

[herohde] Add per-key Combine

[herohde] Add Min

[herohde] Reorganize non-user-facing code into core

[herohde] Make type register reject unnamed or predeclared types

[herohde] Add type specialization tool

[herohde] Don't run grpc plugin in generate phase.

[herohde] Fix import reference path for runner API proto.

[herohde] Revamp runner registration as _ imports

[herohde] Add stats.Max and Mean

[herohde] Add global pipeline options

[herohde] Unify global and per-key combiners

[herohde] Add beam convenience wrapper for imports and runner selection

[herohde] Add session recording and 

Build failed in Jenkins: beam_PerformanceTests_JDBC #302

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[grzegorz.kolakowski] [BEAM-3043] Set user-specified PTransform names on Flink 
operators

[grzegorz.kolakowski] [BEAM-3753] Fix failing integration tests

[grzegorz.kolakowski] [BEAM-3753] Rename *ITCase.java tests files to *Test.java

[rmannibucau] Make StateInternals short state method defaulting to the 
implementation

--
[...truncated 1.70 KB...]
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins5035207810204200761.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins6506190783837837574.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4656884469047589601.sh
+ kubectl 
--kubeconfig=
 create namespace jdbcioit-1520437864457
namespace "jdbcioit-1520437864457" created
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins7440212698167001807.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=jdbcioit-1520437864457
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins226395241242724509.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins6718598203778854925.sh
+ rm -rf .env
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins4113185020290175616.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins7741588768413400988.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/ad/dc/fcced9ec3f2561c0cbe8eb6527eef7cf4f4919a2b3a07891a36e846635af/setuptools-38.5.2-py2.py3-none-any.whl#md5=abd3307cdce6fb543b5a4d0e3e98bdb6
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins2260763023136058826.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/jenkins3793943123610621438.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r 

[beam-site] 02/02: This closes #396

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit fb71d8565278d663a077f8817ad2bd0f6cb70833
Merge: d4609c5 2c9ff00
Author: Mergebot 
AuthorDate: Wed Mar 7 09:58:13 2018 -0800

This closes #396

 content/contribute/contribution-guide/index.html | 2 +-
 src/contribute/contribution-guide.md | 3 ++-
 2 files changed, 3 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] branch mergebot updated (d5209e3 -> fb71d85)

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard d5209e3  This closes #396
 discard 7802d5d  Update site to mention the 2 Python lint envs
 new 2c9ff00  Update site to mention the 2 Python lint envs
 new fb71d85  This closes #396

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (d5209e3)
\
 N -- N -- N   refs/heads/mergebot (fb71d85)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/02: Update site to mention the 2 Python lint envs

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 2c9ff001d0d19d2129306e75827726ef575055e9
Author: Holden Karau 
AuthorDate: Thu Mar 1 18:37:10 2018 -0800

Update site to mention the 2 Python lint envs
---
 content/contribute/contribution-guide/index.html | 2 +-
 src/contribute/contribution-guide.md | 3 ++-
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/content/contribute/contribution-guide/index.html 
b/content/contribute/contribution-guide/index.html
index ec6addf..fa8ebe8 100644
--- a/content/contribute/contribution-guide/index.html
+++ b/content/contribute/contribution-guide/index.html
@@ -435,7 +435,7 @@ environment before testing your code.
 command.
 
 $ pip install tox
-$ tox -e lint
+$ tox -e lint_py2,lint_py3
 
 
 
diff --git a/src/contribute/contribution-guide.md 
b/src/contribute/contribution-guide.md
index 5a7f0b9..fcf7502 100644
--- a/src/contribute/contribution-guide.md
+++ b/src/contribute/contribution-guide.md
@@ -248,7 +248,8 @@ To Check for lint errors locally, install "tox" package and 
run following
 command.
 
 $ pip install tox
-$ tox -e lint
+$ tox -e lint_py2,lint_py3
+
 
 Beam supports running Python SDK tests using Maven. For this, navigate to root
 directory of your Apache Beam clone and execute following command. Currently

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[jira] [Assigned] (BEAM-2588) FlinkRunner shim for serving Job API

2018-03-07 Thread Ben Sidhom (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2588?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ben Sidhom reassigned BEAM-2588:


Assignee: Axel Magnuson

> FlinkRunner shim for serving Job API
> 
>
> Key: BEAM-2588
> URL: https://issues.apache.org/jira/browse/BEAM-2588
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-flink
>Reporter: Kenneth Knowles
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>
> Whatever the result of https://s.apache.org/beam-job-api we will need a way 
> for the JVM-based FlinkRunner to receive and run pipelines authored in Python.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam-site] 02/02: This closes #396

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit d5209e3120c5ea79c394ada3fa5dd42e79a868e6
Merge: d4609c5 7802d5d
Author: Mergebot 
AuthorDate: Wed Mar 7 09:21:36 2018 -0800

This closes #396

 content/contribute/contribution-guide/index.html | 2 +-
 src/contribute/contribution-guide.md | 3 ++-
 2 files changed, 3 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/02: Update site to mention the 2 Python lint envs

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 7802d5d723a6dab4128252bd32f422623e29cc9f
Author: Holden Karau 
AuthorDate: Thu Mar 1 18:37:10 2018 -0800

Update site to mention the 2 Python lint envs
---
 content/contribute/contribution-guide/index.html | 2 +-
 src/contribute/contribution-guide.md | 3 ++-
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/content/contribute/contribution-guide/index.html 
b/content/contribute/contribution-guide/index.html
index ec6addf..fa8ebe8 100644
--- a/content/contribute/contribution-guide/index.html
+++ b/content/contribute/contribution-guide/index.html
@@ -435,7 +435,7 @@ environment before testing your code.
 command.
 
 $ pip install tox
-$ tox -e lint
+$ tox -e lint_py2,lint_py3
 
 
 
diff --git a/src/contribute/contribution-guide.md 
b/src/contribute/contribution-guide.md
index 5a7f0b9..fcf7502 100644
--- a/src/contribute/contribution-guide.md
+++ b/src/contribute/contribution-guide.md
@@ -248,7 +248,8 @@ To Check for lint errors locally, install "tox" package and 
run following
 command.
 
 $ pip install tox
-$ tox -e lint
+$ tox -e lint_py2,lint_py3
+
 
 Beam supports running Python SDK tests using Maven. For this, navigate to root
 directory of your Apache Beam clone and execute following command. Currently

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] branch mergebot updated (45345f8 -> d5209e3)

2018-03-07 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard 45345f8  This closes #396
 discard 158e256  Update site to mention the 2 Python lint envs
 new 7802d5d  Update site to mention the 2 Python lint envs
 new d5209e3  This closes #396

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (45345f8)
\
 N -- N -- N   refs/heads/mergebot (d5209e3)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[jira] [Created] (BEAM-3800) Set uids on Flink operators

2018-03-07 Thread JIRA
Grzegorz Kołakowski created BEAM-3800:
-

 Summary: Set uids on Flink operators
 Key: BEAM-3800
 URL: https://issues.apache.org/jira/browse/BEAM-3800
 Project: Beam
  Issue Type: Improvement
  Components: runner-flink
Reporter: Grzegorz Kołakowski
Assignee: Grzegorz Kołakowski


Flink operators should have unique ids assigned, which are in turn used for 
checkpointing stateful operators. Assigning operator ids is highly recommended 
by the Flink documentation.
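As a sketch of what this improvement asks for, uids can be attached with the DataStream API's `uid(String)` method. The fragment below is illustrative only; the source, function, and uid names are invented, not taken from the Beam translation code:

```java
// Illustrative fragment: explicit uids let Flink match operator state to
// operators when restoring a checkpoint or savepoint, even after the job
// graph changes. EventSource and StatefulEnrichFn are hypothetical names.
DataStream<String> events = env
        .addSource(new EventSource())
        .uid("event-source");            // stable id for the source's state

events
        .keyBy(e -> e)
        .map(new StatefulEnrichFn())
        .uid("stateful-enrich");         // stable id for the keyed map state
```

Without explicit uids, Flink generates ids from the graph structure, so inserting or removing an operator can silently orphan checkpointed state.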



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_Verify #4383

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] Make StateInternals short state method defaulting to the 
implementation

--
[...truncated 1.02 MB...]
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/logger_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/opcounters_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operation_specs.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operations.pxd -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/operations.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_main_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sdk_worker_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sideinputs.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/sideinputs_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_fast.pyx -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/statesampler_slow.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying 

[jira] [Commented] (BEAM-2831) Pipeline crashes due to Beam encoder breaking Flink memory management

2018-03-07 Thread Guillaume Balaine (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2831?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16389772#comment-16389772
 ] 

Guillaume Balaine commented on BEAM-2831:
-

The implication here is that, from 2.1 onwards, it is impossible to run any 
reasonably sized batch with the FlinkRunner using binary formats like Avro and 
Protobuf at the default block size of FileIO...

> Pipeline crashes due to Beam encoder breaking Flink memory management
> -
>
> Key: BEAM-2831
> URL: https://issues.apache.org/jira/browse/BEAM-2831
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.0.0, 2.1.0
> Environment: Flink 1.2.1 and 1.3.0, Java HotSpot and OpenJDK 8, macOS 
> 10.12.6 and unknown Linux
>Reporter: Reinier Kip
>Assignee: Aljoscha Krettek
>Priority: Major
>
> I’ve been running a Beam pipeline on Flink. Depending on the dataset size and 
> the heap memory configuration of the jobmanager and taskmanager, I may run 
> into an EOFException, which causes the job to fail.
> As [discussed on Flink's 
> mailinglist|http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/EOFException-related-to-memory-segments-during-run-of-Beam-pipeline-on-Flink-td15255.html]
>  (stacktrace enclosed), Flink catches these EOFExceptions and activates disk 
> spillover. Because Beam wraps these exceptions, this mechanism fails, the 
> exception travels up the stack, and the job aborts.
> Hopefully this is enough information and this is something that can be 
> adjusted for in Beam. I'd be glad to provide more information where needed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3799) Nexmark Query 10 breaks with direct runner

2018-03-07 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3799?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-3799:
---
Fix Version/s: 2.4.0

> Nexmark Query 10 breaks with direct runner
> --
>
> Key: BEAM-3799
> URL: https://issues.apache.org/jira/browse/BEAM-3799
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.4.0, 2.5.0
>Reporter: Ismaël Mejía
>Assignee: Thomas Groh
>Priority: Major
> Fix For: 2.4.0
>
>
> While running query 10 with the direct runner like this:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pdirect-runner -Dexec.args="--runner=DirectRunner --query=10 
> --streaming=false --manageResources=false --monitorJobs=true 
> --enforceEncodability=true --enforceImmutability=true" -pl 'sdks/java/nexmark'
> {quote}
> I found that it breaks with the direct runner with the following exception (it 
> works ok with the other runners):
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: org.apache.beam.sdk.util.IllegalMutationException: PTransform 
> Query10/Query10.UploadEvents/ParMultiDo(Anonymous) mutated value KV{null, 
> 2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
> } after it was output (new value was KV{null, 2015-07-15T00:00:09.999Z 
> shard-3-of-00025 0 ON_TIME null
> }). Values must not be mutated in any way after being output.
>     at 
> org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.commit
>  (ImmutabilityCheckingBundleFactory.java:134)
>     at org.apache.beam.runners.direct.EvaluationContext.commitBundles 
> (EvaluationContext.java:212)
>     at org.apache.beam.runners.direct.EvaluationContext.handleResult 
> (EvaluationContext.java:152)
>     at 
> org.apache.beam.runners.direct.QuiescenceDriver$TimerIterableCompletionCallback.handleResult
>  (QuiescenceDriver.java:258)
>     at org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle 
> (DirectTransformExecutor.java:190)
>     at org.apache.beam.runners.direct.DirectTransformExecutor.run 
> (DirectTransformExecutor.java:127)
>     at java.util.concurrent.Executors$RunnableAdapter.call 
> (Executors.java:511)
>     at java.util.concurrent.FutureTask.run (FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker 
> (ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run 
> (ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: org.apache.beam.sdk.util.IllegalMutationException: Value KV{null, 
> 2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
> } mutated illegally, new value was KV{null, 2015-07-15T00:00:09.999Z 
> shard-3-of-00025 0 ON_TIME null
> }. Encoding was 
> rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ,
>  now 
> rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ.
>     at 
> org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.illegalMutation
>  (MutationDetectors.java:144)
>     at 
> org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.verifyUnmodifiedThrowingCheckedExceptions
>  (MutationDetectors.java:139)
>     at 
> org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.verifyUnmodified
> 
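The immutability check behind the exception above can be sketched with plain Java serialization: snapshot the encoded form of a value when it is output, re-encode it at commit time, and fail if the bytes differ. This is a minimal sketch with invented names; Beam's real check lives in org.apache.beam.sdk.util.MutationDetectors and uses the element's Coder rather than Java serialization.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal sketch of the check behind IllegalMutationException: encode the
// value at output time, encode it again at commit time, and flag any
// difference. Class and method names here are hypothetical.
public class MutationCheckSketch {

    static byte[] encode(Serializable value) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(value);
        }
        return bytes.toByteArray();
    }

    /** Returns true if {@code mutation} changed the encoded form of {@code value}. */
    static boolean mutatedAfterOutput(Serializable value, Runnable mutation)
            throws IOException {
        byte[] atOutput = encode(value);  // snapshot taken when the value is output
        mutation.run();                   // user code running after output
        byte[] atCommit = encode(value);  // re-encoded when the bundle commits
        return !Arrays.equals(atOutput, atCommit);
    }

    public static void main(String[] args) throws IOException {
        ArrayList<String> record = new ArrayList<>(List.of("shard-00003-of-00025"));
        // Mutating the element after outputting it is exactly what the
        // direct runner rejects with IllegalMutationException.
        System.out.println(mutatedAfterOutput(record, () -> record.set(0, "changed")));
    }
}
```

Only the direct runner performs this comparison (and only with --enforceImmutability), which is why the same pipeline runs cleanly on the other runners.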

[jira] [Updated] (BEAM-3409) Unexpected behavior of DoFn teardown method running in unit tests

2018-03-07 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3409?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw updated BEAM-3409:
--
Fix Version/s: (was: 2.4.0)
   2.5.0

> Unexpected behavior of DoFn teardown method running in unit tests 
> --
>
> Key: BEAM-3409
> URL: https://issues.apache.org/jira/browse/BEAM-3409
> Project: Beam
>  Issue Type: Bug
>  Components: runner-direct
>Affects Versions: 2.3.0
>Reporter: Alexey Romanenko
>Assignee: Romain Manni-Bucau
>Priority: Blocker
>  Labels: test
> Fix For: 2.5.0
>
>  Time Spent: 5h 10m
>  Remaining Estimate: 0h
>
> Writing a unit test, I found strange behaviour of the Teardown method of a 
> DoFn implementation when running it in unit tests using TestPipeline.
> To be more precise, it doesn't wait until the teardown() method has finished; 
> it just exits from this method after about 1 sec (on my machine) even if it 
> should take longer (a very simple example: running an infinite loop inside 
> this method or putting the thread to sleep). At the same time, when I run the 
> same code from main() with an ordinary Pipeline and the direct runner, it 
> works as expected: the teardown() method runs to completion regardless of how 
> much time it takes.
> I created two test cases to reproduce this issue: the first runs with main() 
> and the second with junit. They use the same DoFn implementation (class 
> LongTearDownFn) and expect the teardown method to run for at least SLEEP_TIME 
> ms. When run as a junit test, this is not the case (see output log).
> - run with main()
> https://github.com/aromanenko-dev/beam-samples/blob/master/runners-tests/src/main/java/TearDown.java
> - run with junit
> https://github.com/aromanenko-dev/beam-samples/blob/master/runners-tests/src/test/java/TearDownTest.java



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #4364

2018-03-07 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PerformanceTests_JDBC #301

2018-03-07 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1067

2018-03-07 Thread Apache Jenkins Server
See 


Changes:

[rmannibucau] Make StateInternals short state method defaulting to the 
implementation

--
[...truncated 116.89 KB...]
  File 
"
 line 842, in from_runner_api
part = context.transforms.get_by_id(transform_id)
  File 
"
 line 69, in get_by_id
self._id_to_proto[id], self._pipeline_context)
  File 
"
 line 833, in from_runner_api
transform=ptransform.PTransform.from_runner_api(proto.spec, context),
  File 
"
 line 555, in from_runner_api
context)
  File 
"
 line 881, in from_runner_api_parameter
pardo_payload.do_fn.spec.payload)
  File 
"
 line 221, in loads
return dill.loads(s)
  File 
"
 line 277, in loads
return load(file)
  File 
"
 line 266, in load
obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
dispatch[key](self)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

==
ERROR: test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File 
"
 line 812, in run
test(orig)
  File 
"
 line 45, in __call__
return self.run(*arg, **kwarg)
  File 
"
 line 133, in run
self.runTest(result)
  File 
"
 line 151, in runTest
test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 395, in __call__
return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 331, in run
testMethod()
  File 
"
 line 178, in test_iterable_side_input
pipeline.run()
  File 
"
 line 102, in run
result = super(TestPipeline, self).run()
  File 
"
 line 369, in run
self.to_runner_api(), self.runner, self._options).run(False)
  File 
"
 line 382, in run
return self.runner.run_pipeline(self)
  File 
"
 line 285, in run_pipeline
return_context=True)
  File 
"
 line 580, in to_runner_api
root_transform_id = context.transforms.get_id(self._root_transform())
  File 

[jira] [Created] (BEAM-3799) Nexmark Query 10 breaks with direct runner

2018-03-07 Thread JIRA
Ismaël Mejía created BEAM-3799:
--

 Summary: Nexmark Query 10 breaks with direct runner
 Key: BEAM-3799
 URL: https://issues.apache.org/jira/browse/BEAM-3799
 Project: Beam
  Issue Type: Bug
  Components: runner-direct
Affects Versions: 2.4.0, 2.5.0
Reporter: Ismaël Mejía
Assignee: Thomas Groh


While running query 10 with the direct runner like this:
{quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
-Pdirect-runner -Dexec.args="--runner=DirectRunner --query=10 --streaming=false 
--manageResources=false --monitorJobs=true --enforceEncodability=true 
--enforceImmutability=true" -pl 'sdks/java/nexmark'
{quote}

I found that it breaks with the direct runner with the following exception (it 
works ok with the other runners):
{quote}[WARNING] 
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke 
(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke 
(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.beam.sdk.util.IllegalMutationException: PTransform 
Query10/Query10.UploadEvents/ParMultiDo(Anonymous) mutated value KV{null, 
2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
} after it was output (new value was KV{null, 2015-07-15T00:00:09.999Z 
shard-3-of-00025 0 ON_TIME null
}). Values must not be mutated in any way after being output.
    at 
org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.commit
 (ImmutabilityCheckingBundleFactory.java:134)
    at org.apache.beam.runners.direct.EvaluationContext.commitBundles 
(EvaluationContext.java:212)
    at org.apache.beam.runners.direct.EvaluationContext.handleResult 
(EvaluationContext.java:152)
    at 
org.apache.beam.runners.direct.QuiescenceDriver$TimerIterableCompletionCallback.handleResult
 (QuiescenceDriver.java:258)
    at org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle 
(DirectTransformExecutor.java:190)
    at org.apache.beam.runners.direct.DirectTransformExecutor.run 
(DirectTransformExecutor.java:127)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker 
(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run 
(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.beam.sdk.util.IllegalMutationException: Value KV{null, 
2015-07-15T00:00:09.999Z shard-3-of-00025 0 ON_TIME null
} mutated illegally, new value was KV{null, 2015-07-15T00:00:09.999Z 
shard-3-of-00025 0 ON_TIME null
}. Encoding was 
rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ,
 now 
rO0ABXNyADZvcmcuYXBhY2hlLmJlYW0uc2RrLm5leG1hcmsucXVlcmllcy5RdWVyeTEwJE91dHB1dEZpbGUWUg9rZM1SvgIABUoABWluZGV4TAAIZmlsZW5hbWV0ABJMamF2YS9sYW5nL1N0cmluZztMAAxtYXhUaW1lc3RhbXB0ABdMb3JnL2pvZGEvdGltZS9JbnN0YW50O0wABXNoYXJkcQB-AAFMAAZ0aW1pbmd0ADpMb3JnL2FwYWNoZS9iZWFtL3Nkay90cmFuc2Zvcm1zL3dpbmRvd2luZy9QYW5lSW5mbyRUaW1pbmc7eHAAAHBzcgAVb3JnLmpvZGEudGltZS5JbnN0YW50Lci-0MYOnM0CAAFKAAdpTWlsbGlzeHFOjwLrD3QAFHNoYXJkLTAwMDAzLW9mLTAwMDI1fnIAOG9yZy5hcGFjaGUuYmVhbS5zZGsudHJhbnNmb3Jtcy53aW5kb3dpbmcuUGFuZUluZm8kVGltaW5nAAASAAB4cgAOamF2YS5sYW5nLkVudW0AABIAAHhwdAAHT05fVElNRQ.
    at 
org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.illegalMutation
 (MutationDetectors.java:144)
    at 
org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.verifyUnmodifiedThrowingCheckedExceptions
 (MutationDetectors.java:139)
    at 
org.apache.beam.sdk.util.MutationDetectors$CodedValueMutationDetector.verifyUnmodified
 (MutationDetectors.java:123)
    at 
org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.commit
 (ImmutabilityCheckingBundleFactory.java:124)
    at org.apache.beam.runners.direct.EvaluationContext.commitBundles 
(EvaluationContext.java:212)
    at org.apache.beam.runners.direct.EvaluationContext.handleResult 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4382

2018-03-07 Thread Apache Jenkins Server
See 


--
[...truncated 1.02 MB...]
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/test/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/test
copying apache_beam/runners/worker/__init__.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/bundle_processor.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/data_plane_test.py -> 
apache-beam-2.5.0.dev0/apache_beam/runners/worker
copying apache_beam/runners/worker/log_handler.py -> 
