[jira] [Assigned] (BEAM-3538) Remove (or merge) Java 8 specific tests module into the main one.

2018-02-01 Thread Alexey Romanenko (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3538?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexey Romanenko reassigned BEAM-3538:
--------------------------------------

Assignee: Alexey Romanenko

> Remove (or merge) Java 8 specific tests module into the main one.
> -----------------------------------------------------------------
>
> Key: BEAM-3538
> URL: https://issues.apache.org/jira/browse/BEAM-3538
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core, testing
>Reporter: Ismaël Mejía
>Assignee: Alexey Romanenko
>Priority: Minor
>  Labels: newbie, starter
>
> The module beam-sdks-java-java8tests contains tests for some core 
> transforms written in Java 8 syntax. Now that Beam has moved to Java 8, 
> this module probably no longer makes sense: all tests can use Java 8 
> syntax, so the tests in this module should be merged into the main test 
> suite, or the module removed if they are already covered there.
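For readers unfamiliar with why these tests were ever separate: the split was purely syntactic. A minimal stdlib-only sketch of the pre-Java-8 vs. Java 8 style difference at issue (illustrative only, not code from the beam-sdks-java-java8tests module):

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class Java8SyntaxDemo {
    // Pre-Java-8 style: an anonymous class implementing a single-method interface.
    static final Function<Integer, Integer> DOUBLE_OLD = new Function<Integer, Integer>() {
        @Override
        public Integer apply(Integer x) {
            return x * 2;
        }
    };

    // Java 8 style: the same function as a lambda. Code like this could not
    // compile in the pre-Java-8 main module, hence the separate tests module.
    static final Function<Integer, Integer> DOUBLE_NEW = x -> x * 2;

    static List<Integer> doubleAll(List<Integer> in) {
        return in.stream().map(DOUBLE_NEW).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(doubleAll(List.of(1, 2, 3))); // prints [2, 4, 6]
    }
}
```

Once the whole build targets Java 8, both styles compile everywhere, which is why the module loses its reason to exist.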



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #864

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ekirpichov] Introduces the Wait transform

----------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam6 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9cf86bcebcdbd8d5a84777cf2871597f0ba1b951 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9cf86bcebcdbd8d5a84777cf2871597f0ba1b951
Commit message: "Merge pull request #4301: Introduces the Wait transform"
 > git rev-list e34fee13a12ad1bc8694c7f8d1bdedc9afb88e17 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8125196431367118470.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8894134476788846354.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3038674210411406039.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3256991282615249445.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/75/d1/5abca4ccf61a7ab86c255dd315fb96e566fbf9b5d3a480e72c93e8ec2802/setuptools-38.4.0-py2.py3-none-any.whl#md5=a5c6620a59f19f2d5d32bdca18c7b47e
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1656602075560595163.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2718874454610923649.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #92

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ekirpichov] Introduces the Wait transform

----------------------------------------
[...truncated 694.10 KB...]
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.4.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.4.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev355-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.1 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_TFRecordIOIT #91

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ekirpichov] Introduces the Wait transform

----------------------------------------
[...truncated 10.01 KB...]
[beam_PerformanceTests_TFRecordIOIT] $ /bin/bash -xe 
/tmp/jenkins2788142014973754649.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Requirement already satisfied: avro<2.0.0,>=1.8.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: crcmod<2.0,>=1.7 in 
/usr/lib/python2.7/dist-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: dill==0.2.6 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: grpcio<2,>=1.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: httplib2<0.10,>=0.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: oauth2client<5,>=2.0.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: six<1.12,>=1.9 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: futures<4.0.0,>=3.1.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: hdfs3<0.4.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: google-apitools<=0.5.20,>=0.5.18 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 
in /home/jenkins/.local/lib/python2.7/site-packages (from 
apache-beam==2.4.0.dev0)
Requirement already satisfied: googledatastore==7.0.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: google-cloud-pubsub==0.26.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: google-cloud-bigquery==0.25.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: enum34>=1.0.4 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
grpcio<2,>=1.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
mock<3.0.0,>=1.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: pbr>=0.11 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
mock<3.0.0,>=1.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: pyasn1>=0.1.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: pyasn1-modules>=0.0.5 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: rsa>=3.1.4 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from protobuf<4,>=3.5.0.post1->apache-beam==2.4.0.dev0)
Requirement already satisfied: fasteners>=0.14 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-apitools<=0.5.20,>=0.5.18->apache-beam==2.4.0.dev0)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: google-cloud-core<0.26dev,>=0.25.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-cloud-pubsub==0.26.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 

[jira] [Updated] (BEAM-3601) Switch to Java 8 futures

2018-02-01 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3601:
--------------------------------------
Summary: Switch to Java 8 futures  (was: Switch portability support code to 
Java 8 futures)

> Switch to Java 8 futures
> ------------------------
>
> Key: BEAM-3601
> URL: https://issues.apache.org/jira/browse/BEAM-3601
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Major
>
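In practice, "Java 8 futures" means java.util.concurrent.CompletableFuture, which can stand in for Guava's ListenableFuture/Futures combinators with no extra dependency. A minimal stdlib-only sketch of the pattern (illustrative, not code from the Beam repository):

```java
import java.util.concurrent.CompletableFuture;

public class FuturesDemo {
    // Chain and combine asynchronous results without any Guava dependency.
    static CompletableFuture<Integer> sumAsync(int a, int b) {
        CompletableFuture<Integer> fa = CompletableFuture.supplyAsync(() -> a);
        CompletableFuture<Integer> fb = CompletableFuture.supplyAsync(() -> b);
        // thenCombine plays the role Guava's Futures combinators play for
        // ListenableFuture: run a function once both inputs complete.
        return fa.thenCombine(fb, Integer::sum);
    }

    public static void main(String[] args) {
        System.out.println(sumAsync(2, 3).join()); // prints 5
    }
}
```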




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3601) Switch portability support code to Java 8 futures

2018-02-01 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3601:
--------------------------------------
Issue Type: Sub-task  (was: Improvement)
Parent: BEAM-3606

> Switch portability support code to Java 8 futures
> -------------------------------------------------
>
> Key: BEAM-3601
> URL: https://issues.apache.org/jira/browse/BEAM-3601
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3606) Improve our relationship, or lack thereof, with Guava

2018-02-01 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-3606:
---------------------------------

 Summary: Improve our relationship, or lack thereof, with Guava
 Key: BEAM-3606
 URL: https://issues.apache.org/jira/browse/BEAM-3606
 Project: Beam
  Issue Type: Improvement
  Components: runner-core, sdk-java-core, sdk-java-gcp
Reporter: Kenneth Knowles
Assignee: Ismaël Mejía


This is an umbrella task for things that might move us off Guava, such as 
replicating our own little utilities or moving to Java 8 features.
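For illustration, many small Guava utilities have direct JDK 8 equivalents, which is the kind of replacement this umbrella covers. A hedged sketch (the wrapper method names here are made up for the example; only the JDK calls are real):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

public class GuavaFreeDemo {
    // Guava Preconditions.checkNotNull -> JDK Objects.requireNonNull.
    static <T> T checkNotNull(T value, String message) {
        return Objects.requireNonNull(value, message);
    }

    // Guava Joiner.on(", ").join(parts) -> JDK String.join.
    static String joinParts(List<String> parts) {
        return String.join(", ", parts);
    }

    // Guava Lists.newArrayList(src) -> a plain ArrayList copy constructor.
    static List<String> copyOf(List<String> src) {
        return new ArrayList<>(src);
    }

    public static void main(String[] args) {
        System.out.println(joinParts(List.of("a", "b", "c"))); // prints a, b, c
    }
}
```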



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch master updated (e34fee1 -> 9cf86bc)

2018-02-01 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from e34fee1  Merge pull request #4471: [BEAM-3099] Split out 
BufferedReader and BufferedWriter from gcsio.
 add 884f3e6  Introduces the Wait transform
 new 9cf86bc  Merge pull request #4301: Introduces the Wait transform

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../org/apache/beam/sdk/testing/TestStream.java|   2 +-
 .../java/org/apache/beam/sdk/transforms/Wait.java  | 120 
 .../org/apache/beam/sdk/transforms/WaitTest.java   | 304 +
 3 files changed, 425 insertions(+), 1 deletion(-)
 create mode 100644 
sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Wait.java
 create mode 100644 
sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/WaitTest.java

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[beam] 01/01: Merge pull request #4301: Introduces the Wait transform

2018-02-01 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 9cf86bcebcdbd8d5a84777cf2871597f0ba1b951
Merge: e34fee1 884f3e6
Author: Kenn Knowles 
AuthorDate: Thu Feb 1 20:26:55 2018 -0800

Merge pull request #4301: Introduces the Wait transform

 .../org/apache/beam/sdk/testing/TestStream.java|   2 +-
 .../java/org/apache/beam/sdk/transforms/Wait.java  | 120 
 .../org/apache/beam/sdk/transforms/WaitTest.java   | 304 +
 3 files changed, 425 insertions(+), 1 deletion(-)


-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[jira] [Updated] (BEAM-3604) MqttIOTest testReadNoClientId failure timeout

2018-02-01 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3604:
--------------------------------------
Summary: MqttIOTest testReadNoClientId failure timeout  (was: MqttIOTest 
flaky timeout)

> MqttIOTest testReadNoClientId failure timeout
> ---------------------------------------------
>
> Key: BEAM-3604
> URL: https://issues.apache.org/jira/browse/BEAM-3604
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Jean-Baptiste Onofré
>Priority: Critical
>  Labels: flake
>
> I've seen failures a bit today. Here is one:
> [https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.mqtt/MqttIOTest/testReadNoClientId/]
> Filing all flakes as "Critical" priority so we can sickbay or fix.
> Since that build will get GC'd, here is the Standard Error. It looks like 
> from that perspective everything went as planned, but perhaps the test has a 
> race condition or something?
> {code}
> Feb 01, 2018 11:28:01 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
> INFO: Finding free network port
> Feb 01, 2018 11:28:01 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
> INFO: Starting ActiveMQ brokerService on 57986
> Feb 01, 2018 11:28:03 PM org.apache.activemq.broker.BrokerService 
> doStartPersistenceAdapter
> INFO: Using Persistence Adapter: MemoryPersistenceAdapter
> Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.BrokerService 
> doStartBroker
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) is 
> starting
> Feb 01, 2018 11:28:04 PM 
> org.apache.activemq.transport.TransportServerThreadSupport doStart
> INFO: Listening for connections at: mqtt://localhost:57986
> Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.TransportConnector start
> INFO: Connector mqtt://localhost:57986 started
> Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.BrokerService 
> doStartBroker
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) started
> Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.BrokerService 
> doStartBroker
> INFO: For help or more information please see: http://activemq.apache.org
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService stop
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) is 
> shutting down
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.TransportConnector stop
> INFO: Connector mqtt://localhost:57986 stopped
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService stop
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) uptime 
> 24.039 seconds
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService stop
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) is 
> shutdown
> Feb 01, 2018 11:28:26 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
> INFO: Finding free network port
> Feb 01, 2018 11:28:26 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
> INFO: Starting ActiveMQ brokerService on 46799
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService 
> doStartPersistenceAdapter
> INFO: Using Persistence Adapter: MemoryPersistenceAdapter
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService 
> doStartBroker
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) is 
> starting
> Feb 01, 2018 11:28:26 PM 
> org.apache.activemq.transport.TransportServerThreadSupport doStart
> INFO: Listening for connections at: mqtt://localhost:46799
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.TransportConnector start
> INFO: Connector mqtt://localhost:46799 started
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService 
> doStartBroker
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) started
> Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService 
> doStartBroker
> INFO: For help or more information please see: http://activemq.apache.org
> Feb 01, 2018 11:28:28 PM org.apache.beam.sdk.io.mqtt.MqttIOTest 
> lambda$testRead$1
> INFO: Waiting pipeline connected to the MQTT broker before sending messages 
> ...
> Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.BrokerService stop
> INFO: Apache ActiveMQ 5.13.1 (localhost, 
> ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) is 
> shutting down
> Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.TransportConnector stop
> INFO: Connector mqtt://localhost:46799 stopped
> Feb 01, 2018 

[jira] [Updated] (BEAM-3605) Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure

2018-02-01 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3605?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3605:
--------------------------------------
Labels: flake  (was: )

> Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure
> --------------------------------------------------------------------
>
> Key: BEAM-3605
> URL: https://issues.apache.org/jira/browse/BEAM-3605
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Paweł Kaczmarczyk
>Priority: Critical
>  Labels: flake
>
> Here's one:
> https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.kinesis/ShardReadersPoolTest/shouldForgetClosedShardIterator/
> Filing all test failures as "Critical" so we can sickbay or fix.
> The Jenkins build will get GC'd so here is the error:
> {code}
> java.lang.AssertionError: 
> Expecting:
>   <["shard1", "shard2"]>
> to contain only:
>   <["shard2"]>
> but the following elements were unexpected:
>   <["shard1"]>
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldForgetClosedShardIterator(ShardReadersPoolTest.java:270)
> {code}
> and stderr
> {code}
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
> reached end of the shard: streamName=null, shardId=shard1
>   at 
> org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard2 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
> reached end of the shard: streamName=null, shardId=shard2
>   at 
> org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool stop
> INFO: Closing shard iterators pool
> Feb 01, 2018 11:24:24 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:24 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop

[jira] [Updated] (BEAM-3604) MqttIOTest testReadNoClientId failure timeout

2018-02-01 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3604:
--------------------------------------
Labels: flake  (was: )

> MqttIOTest testReadNoClientId failure timeout
> ---------------------------------------------
>
> Key: BEAM-3604
> URL: https://issues.apache.org/jira/browse/BEAM-3604
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Jean-Baptiste Onofré
>Priority: Critical
>  Labels: flake
>
> I've seen failures a bit today. Here is one:
> [https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.mqtt/MqttIOTest/testReadNoClientId/]
> Filing all flakes as "Critical" priority so we can sickbay or fix.
> Since that build will get GC'd, here is the Standard Error. It looks like 
> from that perspective everything went as planned, but perhaps the test has a 
> race condition or something?
> shutting down
> Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.TransportConnector stop
> INFO: Connector mqtt://localhost:46799 stopped
> Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.BrokerService stop
> INFO: 

[jira] [Created] (BEAM-3605) Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure

2018-02-01 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-3605:
-

 Summary: Kinesis ShardReadersPoolTest 
shouldForgetClosedShardIterator failure
 Key: BEAM-3605
 URL: https://issues.apache.org/jira/browse/BEAM-3605
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Kenneth Knowles
Assignee: Paweł Kaczmarczyk


Here's one:

https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.kinesis/ShardReadersPoolTest/shouldForgetClosedShardIterator/

Filing all test failures as "Critical" so we can sickbay or fix.

The Jenkins build will get GC'd, so here is the error:

{code}
java.lang.AssertionError: 
Expecting:
  <["shard1", "shard2"]>
to contain only:
  <["shard2"]>
but the following elements were unexpected:
  <["shard1"]>

at 
org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldForgetClosedShardIterator(ShardReadersPoolTest.java:270)
{code}
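The assertion is an AssertJ-style "contains only" check: the pool still tracks shard1 even though its iterator was closed. A minimal self-contained sketch of the set arithmetic behind such a check (plain Java; the class and variable names here are illustrative, not the test's actual code):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class ContainsOnlyCheck {
    public static void main(String[] args) {
        // Shards the pool still tracks after shard1's iterator closed.
        Set<String> actual = new HashSet<>(Arrays.asList("shard1", "shard2"));
        // The test expects only shard2 to remain.
        Set<String> expectedOnly = new HashSet<>(Arrays.asList("shard2"));

        // A "contains only" check fails when anything is left over here.
        Set<String> unexpected = new HashSet<>(actual);
        unexpected.removeAll(expectedOnly);

        System.out.println(unexpected); // prints [shard1], so the assertion fails
    }
}
```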

and stderr

{code}
Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard1 shard is closed, finishing the read loop
org.apache.beam.sdk.io.kinesis.KinesisShardClosedException

Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Kinesis Shard read loop has finished
Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard1 shard is closed, finishing the read loop
org.apache.beam.sdk.io.kinesis.KinesisShardClosedException

Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Kinesis Shard read loop has finished
Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard1 shard is closed, finishing the read loop
org.apache.beam.sdk.io.kinesis.KinesisShardClosedException

Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Kinesis Shard read loop has finished
Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard1 shard is closed, finishing the read loop
org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
reached end of the shard: streamName=null, shardId=shard1
at 
org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
at 
org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
at 
org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Kinesis Shard read loop has finished
Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard2 shard is closed, finishing the read loop
org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
reached end of the shard: streamName=null, shardId=shard2
at 
org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
at 
org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
at 
org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool stop
INFO: Closing shard iterators pool
Feb 01, 2018 11:24:24 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Kinesis Shard read loop has finished
Feb 01, 2018 11:24:24 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard1 shard is closed, finishing the read loop
org.apache.beam.sdk.io.kinesis.KinesisShardClosedException

Feb 01, 2018 11:24:24 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
WARNING: Transient exception occurred.
org.apache.beam.sdk.io.kinesis.TransientKinesisException

Feb 01, 2018 11:24:24 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
readLoop
INFO: Shard iterator for shard1 shard is closed, finishing the read loop

[jira] [Created] (BEAM-3604) MqttIOTest flaky timeout

2018-02-01 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-3604:
-

 Summary: MqttIOTest flaky timeout
 Key: BEAM-3604
 URL: https://issues.apache.org/jira/browse/BEAM-3604
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Kenneth Knowles
Assignee: Jean-Baptiste Onofré


I've seen a few failures today. Here is one:

[https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.mqtt/MqttIOTest/testReadNoClientId/]

Filing all flakes as "Critical" priority so we can sickbay or fix.

Since that build will get GC'd, here is the Standard Error. From that 
perspective it looks like everything went as planned, but perhaps the test has 
a race condition or something?

{code}
Feb 01, 2018 11:28:01 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
INFO: Finding free network port
Feb 01, 2018 11:28:01 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
INFO: Starting ActiveMQ brokerService on 57986
Feb 01, 2018 11:28:03 PM org.apache.activemq.broker.BrokerService 
doStartPersistenceAdapter
INFO: Using Persistence Adapter: MemoryPersistenceAdapter
Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.BrokerService doStartBroker
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) is starting
Feb 01, 2018 11:28:04 PM 
org.apache.activemq.transport.TransportServerThreadSupport doStart
INFO: Listening for connections at: mqtt://localhost:57986
Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.TransportConnector start
INFO: Connector mqtt://localhost:57986 started
Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.BrokerService doStartBroker
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) started
Feb 01, 2018 11:28:04 PM org.apache.activemq.broker.BrokerService doStartBroker
INFO: For help or more information please see: http://activemq.apache.org
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService stop
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) is shutting 
down
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.TransportConnector stop
INFO: Connector mqtt://localhost:57986 stopped
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService stop
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) uptime 
24.039 seconds
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService stop
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:1) is shutdown
Feb 01, 2018 11:28:26 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
INFO: Finding free network port
Feb 01, 2018 11:28:26 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
INFO: Starting ActiveMQ brokerService on 46799
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService 
doStartPersistenceAdapter
INFO: Using Persistence Adapter: MemoryPersistenceAdapter
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService doStartBroker
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) is starting
Feb 01, 2018 11:28:26 PM 
org.apache.activemq.transport.TransportServerThreadSupport doStart
INFO: Listening for connections at: mqtt://localhost:46799
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.TransportConnector start
INFO: Connector mqtt://localhost:46799 started
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService doStartBroker
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) started
Feb 01, 2018 11:28:26 PM org.apache.activemq.broker.BrokerService doStartBroker
INFO: For help or more information please see: http://activemq.apache.org
Feb 01, 2018 11:28:28 PM org.apache.beam.sdk.io.mqtt.MqttIOTest 
lambda$testRead$1
INFO: Waiting pipeline connected to the MQTT broker before sending messages ...
Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.BrokerService stop
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) is shutting 
down
Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.TransportConnector stop
INFO: Connector mqtt://localhost:46799 stopped
Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.BrokerService stop
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) uptime 
9.430 seconds
Feb 01, 2018 11:28:35 PM org.apache.activemq.broker.BrokerService stop
INFO: Apache ActiveMQ 5.13.1 (localhost, 
ID:115.98.154.104.bc.googleusercontent.com-38646-1517527683931-0:2) is shutdown
Feb 01, 2018 11:28:35 PM org.apache.beam.sdk.io.mqtt.MqttIOTest startBroker
INFO: Finding free network port
Feb 01, 2018 

[jira] [Commented] (BEAM-3547) [SQL] Nested Query Generates Incompatible Trigger

2018-02-01 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3547?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349676#comment-16349676
 ] 

Anton Kedin commented on BEAM-3547:
---

Trying this query with both inputs unbounded, or with one unbounded and one 
bounded input, works as expected on top of https://github.com/apache/beam/pull/4546

> [SQL] Nested Query Generates Incompatible Trigger
> -
>
> Key: BEAM-3547
> URL: https://issues.apache.org/jira/browse/BEAM-3547
> Project: Beam
>  Issue Type: Bug
>  Components: dsl-sql
>Reporter: Anton Kedin
>Assignee: Anton Kedin
>Priority: Major
>
> From 
> [https://stackoverflow.com/questions/48335383/nested-queries-in-beam-sql] :
>  
> SQL:
> {code:java}
> PCollection Query_Output = Query.apply(
> BeamSql.queryMulti("Select Orders.OrderID From Orders Where 
> Orders.CustomerID IN (Select Customers.CustomerID From Customers WHERE 
> Customers.CustomerID = 2)"));{code}
>  
> Error:
> {code:java}
> org.apache.beam.sdk.extensions.sql.impl.planner.BeamQueryPlanner 
> validateAndConvert
> INFO: SQL:
> SELECT `Orders`.`OrderID`
> FROM `Orders` AS `Orders`
> WHERE `Orders`.`CustomerID` IN (SELECT `Customers`.`CustomerID`
> FROM `Customers` AS `Customers`
> WHERE `Customers`.`CustomerID` = 2)
> Jan 19, 2018 11:56:36 AM 
> org.apache.beam.sdk.extensions.sql.impl.planner.BeamQueryPlanner 
> convertToBeamRel
> INFO: SQLPlan>
> LogicalProject(OrderID=[$0])
>   LogicalJoin(condition=[=($1, $3)], joinType=[inner])
> LogicalTableScan(table=[[Orders]])
> LogicalAggregate(group=[{0}])
>   LogicalProject(CustomerID=[$0])
> LogicalFilter(condition=[=($0, 2)])
>   LogicalTableScan(table=[[Customers]])
> Exception in thread "main" java.lang.IllegalStateException: 
> java.lang.IllegalStateException: Inputs to Flatten had incompatible triggers: 
> DefaultTrigger, Repeatedly.forever(AfterWatermark.pastEndOfWindow())
> at 
> org.apache.beam.sdk.extensions.sql.BeamSql$QueryTransform.expand(BeamSql.java:165)
> at 
> org.apache.beam.sdk.extensions.sql.BeamSql$QueryTransform.expand(BeamSql.java:116)
> at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:533)
> at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:465)
> at 
> org.apache.beam.sdk.values.PCollectionTuple.apply(PCollectionTuple.java:160)
> at com.bitwise.cloud.ExampleOfJoins.main(ExampleOfJoins.java:91)
> Caused by: java.lang.IllegalStateException: Inputs to Flatten had 
> incompatible triggers: DefaultTrigger, 
> Repeatedly.forever(AfterWatermark.pastEndOfWindow())
> at 
> org.apache.beam.sdk.transforms.Flatten$PCollections.expand(Flatten.java:123)
> at 
> org.apache.beam.sdk.transforms.Flatten$PCollections.expand(Flatten.java:101)
> at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:533)
> at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:465)
> at 
> org.apache.beam.sdk.values.PCollectionList.apply(PCollectionList.java:182)
> at 
> org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:124)
> at 
> org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:74)
> at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:533)
> at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:465)
> at 
> org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:107)
> at org.apache.beam.sdk.extensions.joinlibrary.Join.innerJoin(Join.java:59)
> at 
> org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel.standardJoin(BeamJoinRel.java:217)
> at 
> org.apache.beam.sdk.extensions.sql.impl.rel.BeamJoinRel.buildBeamPipeline(BeamJoinRel.java:161)
> at 
> org.apache.beam.sdk.extensions.sql.impl.rel.BeamProjectRel.buildBeamPipeline(BeamProjectRel.java:68)
> at 
> org.apache.beam.sdk.extensions.sql.BeamSql$QueryTransform.expand(BeamSql.java:163)
> ... 5 more{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3225) Non deterministic behaviour of AfterProcessingTime trigger with multiple group by transformations

2018-02-01 Thread Eugene Kirpichov (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3225?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Kirpichov reassigned BEAM-3225:
--

Assignee: Aljoscha Krettek  (was: Eugene Kirpichov)

> Non deterministic behaviour of AfterProcessingTime trigger with multiple 
> group by transformations
> -
>
> Key: BEAM-3225
> URL: https://issues.apache.org/jira/browse/BEAM-3225
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Aljoscha Krettek
>Priority: Critical
>
> *Context*
> I ran my [test 
> code|https://gist.github.com/pbartoszek/fe6b1d7501333a2a4b385a1424f6bd80] 
> against different triggers and runners. My original problem was that, when 
> writing to a file sink, files weren't always produced in a deterministic way. 
> Please refer to 
> [BEAM-3151|https://issues.apache.org/jira/browse/BEAM-3151]. When I started 
> looking at the WriteFiles class I noticed that the file sink implementation 
> includes multiple GroupByKey transformations. Knowing that, I wrote test code 
> that uses multiple GroupByKey transformations, and concluded that GroupByKey's 
> support of After(Synchronised)ProcessingTime triggers is a bit buggy(?), 
> which also affects the file sink. When I ran my job using the 
> Dataflow runner I got the expected output.
> *About test code*
> The job counts how many A and B elements it receives within 30-second 
> windows using Count.perElement. Then I use GroupByKey to fire every time 
> the count increases.
> Below I outlined the expected standard output: 
> {code:java}
> Let's assume all events are received in the same 30 seconds window.
> A -> After count KV{A, 1} -> Final group by KV{A, [1]}
> A -> After count KV{A, 2} -> Final group by KV{A, [1,2]}
> A -> After count KV{A, 3} -> Final group by KV{A, [1,2,3]}
> B -> After count KV{B, 1} -> Final group by KV{B, [1]}
> With my trigger configuration I would expect that, for every new element, 
> 'After count' is printed with the new value, followed by 'Final group by' 
> with the new counter. 'Final group by' then represents the history of counters.{code}
>  
> *Problem*
> The 'Final group by' trigger doesn't always fire, although the trigger setup 
> suggests it should. This behaviour differs between runners and Beam 
> versions. 
>  
> *My observations when using Pubsub*
> Trigger configuration
> {code:java}
> Window.into(FixedWindows.of(standardSeconds(30)))
> 
> .triggering(Repeatedly.forever(AfterProcessingTime.pastFirstElementInPane())) 
>  
> .withAllowedLateness(standardSeconds(5), 
> Window.ClosingBehavior.FIRE_IF_NON_EMPTY)
> .accumulatingFiredPanes())
> {code}
>  
>  Beam 2.0 Flink Runner
> {code:java}
> 2017-11-16T14:51:44.294Z After count KV{A, 1} 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 2017-11-16T14:51:53.036Z Received Element A 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 2017-11-16T14:51:53.143Z After count KV{A, 2} 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 2017-11-16T14:51:53.246Z Final group by KV{A, [1, 2]} 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 2017-11-16T14:52:03.522Z Received Element A 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:03.629Z After count KV{A, 1} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:03.732Z Final group by KV{A, [1]} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:07.270Z Received Element A 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:07.372Z After count KV{A, 2} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:07.476Z Final group by KV{A, [1, 2]} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:10.394Z Received Element A 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:10.501Z After count KV{A, 3} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:10.602Z Final group by KV{A, [1, 2, 3]} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:13.296Z Received Element A 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:13.402Z After count KV{A, 4} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:13.507Z Final group by KV{A, [1, 2, 3, 4]} 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z)
> 2017-11-16T14:52:14.951Z Received Element A 
> [2017-11-16T14:52:00.000Z..2017-11-16T14:52:30.000Z) <--- Expected to see 
> 'After count' after this
> 2017-11-16T14:52:35.320Z Received Element A 
> 

[jira] [Commented] (BEAM-3225) Non deterministic behaviour of AfterProcessingTime trigger with multiple group by transformations

2018-02-01 Thread Eugene Kirpichov (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349669#comment-16349669
 ] 

Eugene Kirpichov commented on BEAM-3225:


Thanks for the thorough investigation!

Another comment here: triggers *are* non-deterministic, in the sense that they 
only *unblock* output from being produced, but don't cause it to be produced 
immediately. Of course, runners generally try to produce it quickly, but e.g. 
with AfterPane.elementCountAtLeast(1) a runner is definitely allowed to process 
several elements and then emit a single pane containing all of them, not 
necessarily firing one pane per element (hence "element count at least 1" rather 
than "exactly 1"). With something like AfterProcessingTime it is even vaguer, 
as clocks are approximate - by the time the trigger fires, more elements 
could have arrived, so we can emit all of them.

To do what you want (full exact history of count changes) you'll need to write 
a stateful DoFn with a per-key counter and manually emit everything you want. 
That is what Beam does under the hood anyway (triggers are implemented using 
per-key state and timers), just not with the semantics you want.
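As a rough self-contained illustration of that per-key-counter semantics (plain Java, with a HashMap standing in for Beam's per-key state; in a real pipeline this would be a ValueState inside a stateful DoFn, and the names here are illustrative, not Beam API):

```java
import java.util.HashMap;
import java.util.Map;

public class PerKeyCounter {
    // Per-key state; in a real stateful DoFn this would live in a ValueState.
    private final Map<String, Long> state = new HashMap<>();

    // Emit the updated count for every single element, giving the full
    // exact history of count changes for that key.
    public long process(String key) {
        long next = state.getOrDefault(key, 0L) + 1;
        state.put(key, next);
        return next;
    }

    public static void main(String[] args) {
        PerKeyCounter counter = new PerKeyCounter();
        System.out.println(counter.process("A")); // 1
        System.out.println(counter.process("A")); // 2
        System.out.println(counter.process("B")); // 1
    }
}
```

Unlike a triggered GroupByKey, this emits exactly one update per input element, because the emission happens in the processing path rather than being unblocked by a trigger.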

That explains all of your examples up until "Why Final group by is triggered 
only after allowed lateness at 11:00:35?". As for that one, I'm not sure I 
understand what the issue is. 'After count' for A fires after the first time A 
arrives (causing 'Final group by' to fire), and fires another time when the 
window closes, i.e. after the allowed lateness (causing 'Final group by' to fire 
one more time) - that seems as expected?

As for "allowed lateness configuration dictates that only non empty panes 
should be trigger!!!" - yes, this seems like a bug in the Flink runner; 
[~aljoscha] could you take a look?

> Non deterministic behaviour of AfterProcessingTime trigger with multiple 
> group by transformations
> -
>
> Key: BEAM-3225
> URL: https://issues.apache.org/jira/browse/BEAM-3225
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Eugene Kirpichov
>Priority: Critical
>
> *Context*
> I ran my [test 
> code|https://gist.github.com/pbartoszek/fe6b1d7501333a2a4b385a1424f6bd80] 
> against different triggers and runners. My original problem was that, when 
> writing to a file sink, files weren't always produced in a deterministic way. 
> Please refer to 
> [BEAM-3151|https://issues.apache.org/jira/browse/BEAM-3151]. When I started 
> looking at the WriteFiles class I noticed that the file sink implementation 
> includes multiple GroupByKey transformations. Knowing that, I wrote test code 
> that uses multiple GroupByKey transformations, and concluded that GroupByKey's 
> support of After(Synchronised)ProcessingTime triggers is a bit buggy(?), 
> which also affects the file sink. When I ran my job using the 
> Dataflow runner I got the expected output.
> *About test code*
> The job counts how many A and B elements it receives within 30-second 
> windows using Count.perElement. Then I use GroupByKey to fire every time 
> the count increases.
> Below I outlined the expected standard output: 
> {code:java}
> Let's assume all events are received in the same 30 seconds window.
> A -> After count KV{A, 1} -> Final group by KV{A, [1]}
> A -> After count KV{A, 2} -> Final group by KV{A, [1,2]}
> A -> After count KV{A, 3} -> Final group by KV{A, [1,2,3]}
> B -> After count KV{B, 1} -> Final group by KV{B, [1]}
> With my trigger configuration I would expect that, for every new element, 
> 'After count' is printed with the new value, followed by 'Final group by' 
> with the new counter. 'Final group by' then represents the history of counters.{code}
>  
> *Problem*
> The 'Final group by' trigger doesn't always fire, although the trigger setup 
> suggests it should. This behaviour differs between runners and Beam 
> versions. 
>  
> *My observations when using Pubsub*
> Trigger configuration
> {code:java}
> Window.into(FixedWindows.of(standardSeconds(30)))
> 
> .triggering(Repeatedly.forever(AfterProcessingTime.pastFirstElementInPane())) 
>  
> .withAllowedLateness(standardSeconds(5), 
> Window.ClosingBehavior.FIRE_IF_NON_EMPTY)
> .accumulatingFiredPanes())
> {code}
>  
>  Beam 2.0 Flink Runner
> {code:java}
> 2017-11-16T14:51:44.294Z After count KV{A, 1} 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 2017-11-16T14:51:53.036Z Received Element A 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 2017-11-16T14:51:53.143Z After count KV{A, 2} 
> [2017-11-16T14:51:30.000Z..2017-11-16T14:52:00.000Z)
> 

Build failed in Jenkins: beam_PerformanceTests_Python #863

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Split out buffered read and write code from gcsio.

[github] import logging for line 1163

[dkulp] [BEAM-3562] Update to Checkstyle 8.7

[klk] Encourage a good description in a good spot on a PR description.

[tgroh] Add QueryablePipeline

[gene] Changing FileNaming to public to allow for usage in lambdas/inheritance

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam8 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e34fee13a12ad1bc8694c7f8d1bdedc9afb88e17 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e34fee13a12ad1bc8694c7f8d1bdedc9afb88e17
Commit message: "Merge pull request #4471: [BEAM-3099] Split out BufferedReader 
and BufferedWriter from gcsio."
 > git rev-list 51da92cce831755866e5c017802532929fdfd872 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5273086330297496473.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins9159302210169260110.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4553115145016364535.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8831732620183427392.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/75/d1/5abca4ccf61a7ab86c255dd315fb96e566fbf9b5d3a480e72c93e8ec2802/setuptools-38.4.0-py2.py3-none-any.whl#md5=a5c6620a59f19f2d5d32bdca18c7b47e
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins510103526591437663.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins403324489383911251.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 

Build failed in Jenkins: beam_PerformanceTests_TFRecordIOIT #90

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Split out buffered read and write code from gcsio.

[github] import logging for line 1163

[dkulp] [BEAM-3562] Update to Checkstyle 8.7

[klk] Encourage a good description in a good spot on a PR description.

[tgroh] Add QueryablePipeline

[gene] Changing FileNaming to public to allow for usage in lambdas/inheritance

--
[...truncated 19.26 KB...]
2018-02-02 01:42:06,790 a4f84ca1 MainThread beam_integration_benchmark(1/1) 
INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.tfrecord.TFRecordIOIT -DskipITs=false -pl 
sdks/java/io/file-based-io-tests -Pio-it -Pdataflow-runner -Dfilesystem=gcs 
-DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=100","--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TFRecordIOIT/90/","--runner=TestDataflowRunner"]
2018-02-02 01:42:36,113 a4f84ca1 MainThread beam_integration_benchmark(1/1) 
INFO Ran /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.tfrecord.TFRecordIOIT -DskipITs=false -pl 
sdks/java/io/file-based-io-tests -Pio-it -Pdataflow-runner -Dfilesystem=gcs 
-DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=100","--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_TFRecordIOIT/90/","--runner=TestDataflowRunner"].
 Got return code (1).
STDOUT: [INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.beam:beam-runners-java-fn-execution:jar:2.4.0-SNAPSHOT
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must 
be unique: com.google.protobuf:protobuf-java:jar -> duplicate declaration of 
version (?) @ org.apache.beam:beam-runners-java-fn-execution:[unknown-version], 

 line 66, column 17
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must 
be unique: org.apache.beam:beam-runners-core-construction-java:jar -> duplicate 
declaration of version (?) @ 
org.apache.beam:beam-runners-java-fn-execution:[unknown-version], 

 line 115, column 17
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten 
the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support 
building such malformed projects.
[WARNING] 
[INFO] 
[INFO] Detecting the operating system and CPU architecture
[INFO] 
[INFO] os.detected.name: linux
[INFO] os.detected.arch: x86_64
[INFO] os.detected.version: 4.4
[INFO] os.detected.version.major: 4
[INFO] os.detected.version.minor: 4
[INFO] os.detected.release: ubuntu
[INFO] os.detected.release.version: 14.04
[INFO] os.detected.release.like.ubuntu: true
[INFO] os.detected.release.like.debian: true
[INFO] os.detected.classifier: linux-x86_64
[INFO] 
[INFO] 
[INFO] Building Apache Beam :: SDKs :: Java :: IO :: File-based-io-tests 
2.4.0-SNAPSHOT
[INFO] 
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/3.0.0/maven-checkstyle-plugin-3.0.0.pom
Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/3.0.0/maven-checkstyle-plugin-3.0.0.pom
 (15 kB at 17 kB/s)
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/3.0.0/maven-checkstyle-plugin-3.0.0.jar

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #4861

2018-02-01 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #91

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Split out buffered read and write code from gcsio.

[github] import logging for line 1163

[dkulp] [BEAM-3562] Update to Checkstyle 8.7

[klk] Encourage a good description in a good spot on a PR description.

[tgroh] Add QueryablePipeline

[gene] Changing FileNaming to public to allow for usage in lambdas/inheritance

--
[...truncated 8.47 KB...]
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins1271898262445100550.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Requirement already satisfied: avro<2.0.0,>=1.8.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: crcmod<2.0,>=1.7 in 
/usr/lib/python2.7/dist-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: dill==0.2.6 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: grpcio<2,>=1.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: httplib2<0.10,>=0.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: oauth2client<5,>=2.0.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: six<1.12,>=1.9 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: futures<4.0.0,>=3.1.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: hdfs3<0.4.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: google-apitools<=0.5.20,>=0.5.18 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 
in /home/jenkins/.local/lib/python2.7/site-packages (from 
apache-beam==2.4.0.dev0)
Requirement already satisfied: googledatastore==7.0.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: google-cloud-pubsub==0.26.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: google-cloud-bigquery==0.25.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in 
/home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.4.0.dev0)
Requirement already satisfied: enum34>=1.0.4 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
grpcio<2,>=1.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
mock<3.0.0,>=1.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: pbr>=0.11 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
mock<3.0.0,>=1.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: pyasn1>=0.1.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: pyasn1-modules>=0.0.5 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: rsa>=3.1.4 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
oauth2client<5,>=2.0.1->apache-beam==2.4.0.dev0)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from protobuf<4,>=3.5.0.post1->apache-beam==2.4.0.dev0)
Requirement already satisfied: fasteners>=0.14 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
google-apitools<=0.5.20,>=0.5.18->apache-beam==2.4.0.dev0)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in 
/home/jenkins/.local/lib/python2.7/site-packages (from 
proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.4.0.dev0)
Requirement already satisfied: 

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT #93

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Split out buffered read and write code from gcsio.

[github] import logging for line 1163

[dkulp] [BEAM-3562] Update to Checkstyle 8.7

[klk] Encourage a good description in a good spot on a PR description.

[tgroh] Add QueryablePipeline

[gene] Changing FileNaming to public to allow for usage in lambdas/inheritance

--
[...truncated 20.27 KB...]
2018-02-02 01:34:59,487 7422ce4b MainThread beam_integration_benchmark(1/1) 
INFO Provisioning resources for benchmark beam_integration_benchmark
2018-02-02 01:34:59,490 7422ce4b MainThread beam_integration_benchmark(1/1) 
INFO Preparing benchmark beam_integration_benchmark
2018-02-02 01:34:59,491 7422ce4b MainThread beam_integration_benchmark(1/1) 
INFO Running: git clone https://github.com/apache/beam.git
2018-02-02 01:35:07,919 7422ce4b MainThread beam_integration_benchmark(1/1) 
INFO Running benchmark beam_integration_benchmark
2018-02-02 01:35:07,928 7422ce4b MainThread beam_integration_benchmark(1/1) 
INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.avro.AvroIOIT -DskipITs=false -pl 
sdks/java/io/file-based-io-tests -Pio-it -Pdataflow-runner -Dfilesystem=gcs 
-DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=100","--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_AvroIOIT/93/","--runner=TestDataflowRunner"]
2018-02-02 01:35:41,503 7422ce4b MainThread beam_integration_benchmark(1/1) 
INFO Ran /home/jenkins/tools/maven/latest/bin/mvn -e verify 
-Dit.test=org.apache.beam.sdk.io.avro.AvroIOIT -DskipITs=false -pl 
sdks/java/io/file-based-io-tests -Pio-it -Pdataflow-runner -Dfilesystem=gcs 
-DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=100","--filenamePrefix=gs://temp-storage-for-perf-tests/beam_PerformanceTests_AvroIOIT/93/","--runner=TestDataflowRunner"].
 Got return code (1).
STDOUT: [INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.beam:beam-runners-java-fn-execution:jar:2.4.0-SNAPSHOT
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must 
be unique: com.google.protobuf:protobuf-java:jar -> duplicate declaration of 
version (?) @ org.apache.beam:beam-runners-java-fn-execution:[unknown-version], 

 line 66, column 17
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must 
be unique: org.apache.beam:beam-runners-core-construction-java:jar -> duplicate 
declaration of version (?) @ 
org.apache.beam:beam-runners-java-fn-execution:[unknown-version], 

 line 115, column 17
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten 
the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support 
building such malformed projects.
[WARNING] 
[INFO] 
[INFO] Detecting the operating system and CPU architecture
[INFO] 
[INFO] os.detected.name: linux
[INFO] os.detected.arch: x86_64
[INFO] os.detected.version: 4.4
[INFO] os.detected.version.major: 4
[INFO] os.detected.version.minor: 4
[INFO] os.detected.release: ubuntu
[INFO] os.detected.release.version: 14.04
[INFO] os.detected.release.like.ubuntu: true
[INFO] os.detected.release.like.debian: true
[INFO] os.detected.classifier: linux-x86_64
[INFO] 
[INFO] 
[INFO] Building Apache Beam :: SDKs :: Java :: IO :: File-based-io-tests 
2.4.0-SNAPSHOT
[INFO] 
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/3.0.0/maven-checkstyle-plugin-3.0.0.pom
Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/3.0.0/maven-checkstyle-plugin-3.0.0.pom
 (15 kB at 23 kB/s)
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/3.0.0/maven-checkstyle-plugin-3.0.0.jar

[jira] [Created] (BEAM-3603) Add a ReadAll transform to tfrecordio

2018-02-01 Thread Chamikara Jayalath (JIRA)
Chamikara Jayalath created BEAM-3603:


 Summary: Add a ReadAll transform to tfrecordio
 Key: BEAM-3603
 URL: https://issues.apache.org/jira/browse/BEAM-3603
 Project: Beam
  Issue Type: Improvement
  Components: sdk-py-core
Affects Versions: 2.4.0
Reporter: Chamikara Jayalath
Assignee: Chamikara Jayalath


We currently have ReadAll transforms for textio and avroio. We should add one 
for tfrecordio as well. 
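The ReadAll pattern the ticket asks for can be sketched in plain Python. This is illustrative only: `read_all` and `fake_reader` are hypothetical stand-ins, not Beam API; the point is that filenames arrive as *data* and a per-file reader is flat-mapped over them.

```python
def read_all(filenames, read_records):
    # ReadAll-style expansion: the filenames are themselves elements of
    # the input collection, and a per-file reader is flat-mapped over
    # them, so the set of files need not be known at pipeline construction.
    for name in filenames:
        for record in read_records(name):
            yield record

# Hypothetical per-file reader standing in for a TFRecord parser.
def fake_reader(name):
    return ["%s:record%d" % (name, i) for i in range(2)]

records = list(read_all(["a.tfrecord", "b.tfrecord"], fake_reader))
```

In Beam terms this corresponds to the existing `ReadAllFromText`-style transforms in textio and avroio, which the ticket proposes mirroring for tfrecordio.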



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3602) Add source set for generated Java gRPC code

2018-02-01 Thread Ben Sidhom (JIRA)
Ben Sidhom created BEAM-3602:


 Summary: Add source set for generated Java gRPC code
 Key: BEAM-3602
 URL: https://issues.apache.org/jira/browse/BEAM-3602
 Project: Beam
  Issue Type: Improvement
  Components: build-system
Reporter: Ben Sidhom
Assignee: Ben Sidhom


Generated code is not currently exposed to IDEs (either in source or .class 
form). This leads to spurious compile errors in editors.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3601) Switch portability support code to Java 8 futures

2018-02-01 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-3601:
-

 Summary: Switch portability support code to Java 8 futures
 Key: BEAM-3601
 URL: https://issues.apache.org/jira/browse/BEAM-3601
 Project: Beam
  Issue Type: Improvement
  Components: runner-core
Reporter: Kenneth Knowles
Assignee: Kenneth Knowles






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #4860

2018-02-01 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-3600) Do not ignore FileSystem errors and document expected behavior

2018-02-01 Thread Udi Meiri (JIRA)
Udi Meiri created BEAM-3600:
---

 Summary: Do not ignore FileSystem errors and document expected 
behavior
 Key: BEAM-3600
 URL: https://issues.apache.org/jira/browse/BEAM-3600
 Project: Beam
  Issue Type: Bug
  Components: sdk-py-core
Reporter: Udi Meiri
Assignee: Ahmet Altay


copy/rename:
 * it should be an error if the dst file exists
 * it should be an error if the src file doesn't exist

delete:
 * it should be an error if the file/dir doesn't exist

FileBasedSink.finalize_write:
 * should check (src, dst) pairs for existence:
 ** src only - regular rename
 ** dst only - skip rename
 ** both src and dst - if files don't match on metadata (checksum or size), 
delete dst then rename, otherwise delete src and skip rename
 ** neither exist - return error: bad state
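The pair-checking logic for `finalize_write` described above can be sketched as a plain decision function. A sketch only: the parameter names are hypothetical, and the metadata comparison is reduced to a size check for brevity.

```python
def plan_rename(src_exists, dst_exists, src_size=None, dst_size=None):
    """Decide what finalize_write should do for one (src, dst) pair."""
    if src_exists and not dst_exists:
        return 'rename'                      # src only: regular rename
    if dst_exists and not src_exists:
        return 'skip'                        # dst only: rename already done
    if src_exists and dst_exists:
        if src_size != dst_size:             # metadata mismatch: redo rename
            return 'delete_dst_then_rename'
        return 'delete_src_and_skip'         # same file: drop the leftover src
    raise RuntimeError('bad state: neither src nor dst exists')
```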

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3600) Do not ignore FileSystem errors and document expected behavior

2018-02-01 Thread Udi Meiri (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Udi Meiri reassigned BEAM-3600:
---

Assignee: Udi Meiri  (was: Ahmet Altay)

> Do not ignore FileSystem errors and document expected behavior
> --
>
> Key: BEAM-3600
> URL: https://issues.apache.org/jira/browse/BEAM-3600
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Udi Meiri
>Assignee: Udi Meiri
>Priority: Major
>
> copy/rename:
>  * it should be an error if the dst file exists
>  * it should be an error if the src file doesn't exist
> delete:
>  * it should be an error if the file/dir doesn't exist
> FileBasedSink.finalize_write:
>  * should check (src, dst) pairs for existence:
>  ** src only - regular rename
>  ** dst only - skip rename
>  ** both src and dst - if files don't match on metadata (checksum or size), 
> delete dst then rename, otherwise delete src and skip rename
>  ** neither exist - return error: bad state
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3599) kinesis.ShardReadersPoolTest.shouldInterruptKinesisReadingAndStopShortly is flaky

2018-02-01 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3599:
-

 Summary: 
kinesis.ShardReadersPoolTest.shouldInterruptKinesisReadingAndStopShortly is 
flaky
 Key: BEAM-3599
 URL: https://issues.apache.org/jira/browse/BEAM-3599
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Thomas Groh
Assignee: Reuven Lax


This appears to be unavoidable due to the construction of the test: the amount 
of real time that the thread waits is asserted to be less than one second, but 
as the class doesn't consume a {{Ticker}} (or a related way to track elapsed 
time), we have no way of deterministically controlling that value within the 
test.

 

Example failure: 
[https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1738/testReport/junit/org.apache.beam.sdk.io.kinesis/ShardReadersPoolTest/shouldInterruptKinesisReadingAndStopShortly/]

 

{{java.lang.AssertionError: Expecting: <4169L> to be less than: <1000L> at 
org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldInterruptKinesisReadingAndStopShortly(ShardReadersPoolTest.java:159)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)}} 

 

The test could also do with a more precise error message (the current message 
suggests that too many elements were received, rather than the actual "took 
too long" failure).
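The clock-injection idea behind the suggested {{Ticker}} fix can be sketched as follows (in Python purely to illustrate the pattern; the real fix would inject a Guava `Ticker` into the Java class, and `ShardStopper` here is a hypothetical stand-in for the class under test):

```python
class FakeClock:
    """Deterministic clock: each call to now() returns the next tick (ms)."""
    def __init__(self, ticks):
        self._ticks = iter(ticks)

    def now(self):
        return next(self._ticks)

class ShardStopper:
    """Hypothetical class under test: measures how long stopping took,
    via an injected clock rather than wall time."""
    def __init__(self, clock):
        self._clock = clock

    def stop(self):
        start = self._clock.now()
        # ... interrupt the shard readers here ...
        return self._clock.now() - start

# The elapsed time is now fully controlled by the test, so the
# "less than one second" assertion cannot flake under load.
elapsed = ShardStopper(FakeClock([0, 500])).stop()
```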



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3598) kinesis.ShardReadersPoolTest.shouldStopReadersPoolAlsoWhenExceptionsOccurDuringStopping is flaky

2018-02-01 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3598:
-

 Summary: 
kinesis.ShardReadersPoolTest.shouldStopReadersPoolAlsoWhenExceptionsOccurDuringStopping
 is flaky
 Key: BEAM-3598
 URL: https://issues.apache.org/jira/browse/BEAM-3598
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Thomas Groh
Assignee: Reuven Lax


shouldStopReadersPoolAlsoWhenExceptionsOccurDuringStopping fails due to 
incomplete interactions with the mock.

{{Wanted but not invoked: firstIterator.readNextBatch(); -> at 
org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldStopReadersPoolAlsoWhenExceptionsOccurDuringStopping(ShardReadersPoolTest.java:244)
 However, there were other interactions with this mock: -> at 
org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldStopReadersPoolAlsoWhenExceptionsOccurDuringStopping(ShardReadersPoolTest.java:241)}}

 

[https://builds.apache.org/job/beam_PreCommit_Java_MavenInstall/org.apache.beam$beam-sdks-java-io-kinesis/17390/testReport/junit/org.apache.beam.sdk.io.kinesis/ShardReadersPoolTest/shouldStopReadersPoolAlsoWhenExceptionsOccurDuringStopping/]

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4471: [BEAM-3099] Split out BufferedReader and BufferedWriter from gcsio.

2018-02-01 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit e34fee13a12ad1bc8694c7f8d1bdedc9afb88e17
Merge: 2b242fe fe2de5e
Author: Chamikara Jayalath 
AuthorDate: Thu Feb 1 14:25:32 2018 -0800

Merge pull request #4471: [BEAM-3099] Split out BufferedReader and 
BufferedWriter from gcsio.

 sdks/python/apache_beam/io/filesystemio.py  | 264 +
 sdks/python/apache_beam/io/filesystemio_test.py | 185 +
 sdks/python/apache_beam/io/gcp/gcsio.py | 481 
 sdks/python/apache_beam/io/gcp/gcsio_test.py|  48 +--
 4 files changed, 528 insertions(+), 450 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] branch master updated (2b242fe -> e34fee1)

2018-02-01 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 2b242fe  Merge pull request #4530: Add QueryablePipeline
 add fe2de5e  Split out buffered read and write code from gcsio.
 new e34fee1  Merge pull request #4471: [BEAM-3099] Split out 
BufferedReader and BufferedWriter from gcsio.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/io/filesystemio.py  | 264 +
 sdks/python/apache_beam/io/filesystemio_test.py | 185 +
 sdks/python/apache_beam/io/gcp/gcsio.py | 481 
 sdks/python/apache_beam/io/gcp/gcsio_test.py|  48 +--
 4 files changed, 528 insertions(+), 450 deletions(-)
 create mode 100644 sdks/python/apache_beam/io/filesystemio.py
 create mode 100644 sdks/python/apache_beam/io/filesystemio_test.py

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] branch master updated (e053eb5 -> 2b242fe)

2018-02-01 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from e053eb5  Merge pull request #4568: Changing FileIO.Write.FileNaming 
Interface to public
 add 9a2d2a6  Add QueryablePipeline
 new 2b242fe  Merge pull request #4530: Add QueryablePipeline

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../core/construction/graph/PipelineNode.java  |  55 +++
 .../core/construction/graph/QueryablePipeline.java | 281 +++
 .../core/construction/graph}/package-info.java |   6 +-
 .../construction/graph/QueryablePipelineTest.java  | 389 +
 4 files changed, 727 insertions(+), 4 deletions(-)
 create mode 100644 
runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/graph/PipelineNode.java
 create mode 100644 
runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/graph/QueryablePipeline.java
 copy runners/{core-java/src/main/java/org/apache/beam/runners/core => 
core-construction-java/src/main/java/org/apache/beam/runners/core/construction/graph}/package-info.java
 (90%)
 create mode 100644 
runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/graph/QueryablePipelineTest.java

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.


[beam] 01/01: Merge pull request #4530: Add QueryablePipeline

2018-02-01 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 2b242fe30caa7f64c914f9e453cc962d7d3fff6a
Merge: e053eb5 9a2d2a6
Author: Thomas Groh 
AuthorDate: Thu Feb 1 14:21:29 2018 -0800

Merge pull request #4530: Add QueryablePipeline

[BEAM-3565] Add QueryablePipeline

 .../core/construction/graph/PipelineNode.java  |  55 +++
 .../core/construction/graph/QueryablePipeline.java | 281 +++
 .../core/construction/graph/package-info.java  |  24 ++
 .../construction/graph/QueryablePipelineTest.java  | 389 +
 4 files changed, 749 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.


[jira] [Created] (BEAM-3597) Add function registration in Go SDK to avoid symbol table lookups

2018-02-01 Thread Henning Rohde (JIRA)
Henning Rohde created BEAM-3597:
---

 Summary: Add function registration in Go SDK to avoid symbol table 
lookups
 Key: BEAM-3597
 URL: https://issues.apache.org/jira/browse/BEAM-3597
 Project: Beam
  Issue Type: Improvement
  Components: sdk-go
Reporter: Henning Rohde
Assignee: Henning Rohde


We should allow optional function registration at init-time to avoid reading 
symbols from the binary, which is slow. It would also allow runtime 
environments where we do not have that ability.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3596) Remove Java 7 references on the website

2018-02-01 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-3596:
-

 Summary: Remove Java 7 references on the website
 Key: BEAM-3596
 URL: https://issues.apache.org/jira/browse/BEAM-3596
 Project: Beam
  Issue Type: Bug
  Components: website
Affects Versions: 2.3.0
Reporter: Thomas Groh
Assignee: Reuven Lax


Beam requires Java 8 as of 2.3.0, so we no longer need to disclaim Java 7 
compatibility for things like the Mobile Gaming Example.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #5839

2018-02-01 Thread Apache Jenkins Server
See 




[beam] 01/01: Merge pull request #4568: Changing FileIO.Write.FileNaming Interface to public

2018-02-01 Thread jkff
This is an automated email from the ASF dual-hosted git repository.

jkff pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit e053eb564d008abd2bfe0b1d61400fdca7400fa2
Merge: ec7e098 ece8709
Author: Eugene Kirpichov 
AuthorDate: Thu Feb 1 12:42:37 2018 -0800

Merge pull request #4568: Changing FileIO.Write.FileNaming Interface to 
public

Changing FileIO.Write.FileNaming Interface to public

 sdks/java/core/src/main/java/org/apache/beam/sdk/io/FileIO.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
j...@apache.org.


[beam] branch master updated (ec7e098 -> e053eb5)

2018-02-01 Thread jkff
This is an automated email from the ASF dual-hosted git repository.

jkff pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from ec7e098  Merge pull request #4522: [BEAM-3562] Update to Checkstyle 8.7
 add ece8709  Changing FileNaming to public to allow for usage in 
lambdas/inheritance outside of the package.
 new e053eb5  Merge pull request #4568: Changing FileIO.Write.FileNaming 
Interface to public

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/java/core/src/main/java/org/apache/beam/sdk/io/FileIO.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
j...@apache.org.


[jira] [Created] (BEAM-3595) Normalize URNs across SDKs and runners.

2018-02-01 Thread Robert Bradshaw (JIRA)
Robert Bradshaw created BEAM-3595:
-

 Summary: Normalize URNs across SDKs and runners.
 Key: BEAM-3595
 URL: https://issues.apache.org/jira/browse/BEAM-3595
 Project: Beam
  Issue Type: Bug
  Components: beam-model
Reporter: Robert Bradshaw
Assignee: Kenneth Knowles






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3579) Go textio.Write does not work on non-direct runners

2018-02-01 Thread Henning Rohde (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349183#comment-16349183
 ] 

Henning Rohde commented on BEAM-3579:
-

Instead of a side input, we should just do a GBK with a fixed key and write 
the file in ProcessElement.
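The fixed-key GBK approach can be sketched in plain Python (illustrative only; `group_by_key` is a minimal stand-in for GroupByKey, and `write_file` stands in for the hypothetical ProcessElement body that writes and closes the object):

```python
from itertools import groupby

def group_by_key(pairs):
    # Minimal GroupByKey stand-in: group (key, value) pairs by key.
    pairs = sorted(pairs, key=lambda kv: kv[0])
    return [(k, [v for _, v in g])
            for k, g in groupby(pairs, key=lambda kv: kv[0])]

def write_file(lines):
    # Hypothetical ProcessElement body: one call sees the whole group,
    # so it can write the object and explicitly flush/close it.
    return "\n".join(lines)

records = ["a", "b", "c"]
keyed = [(0, r) for r in records]      # fixed key: everything in one group
(_, group), = group_by_key(keyed)
content = write_file(group)
```

Because every element shares the key, the grouped value contains the entire output, and the write happens in a single, explicitly-finalized call rather than relying on Teardown or FinishBundle semantics.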

> Go textio.Write does not work on non-direct runners
> ---
>
> Key: BEAM-3579
> URL: https://issues.apache.org/jira/browse/BEAM-3579
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Major
>
> The content is not flushed/closed, so for GCS the object is never created. We 
> need to write the object via a side input instead, or something similar. We 
> may never call Teardown for non-local runners and FinishBundle doesn't have 
> the right granularity.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #5838

2018-02-01 Thread Apache Jenkins Server
See 




[jira] [Assigned] (BEAM-3579) Go textio.Write does not work on non-direct runners

2018-02-01 Thread Henning Rohde (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Henning Rohde reassigned BEAM-3579:
---

Assignee: Henning Rohde

> Go textio.Write does not work on non-direct runners
> ---
>
> Key: BEAM-3579
> URL: https://issues.apache.org/jira/browse/BEAM-3579
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Henning Rohde
>Assignee: Henning Rohde
>Priority: Major
>
> The content is not flushed/closed, so for GCS the object is never created. We 
> need to write the object via a side input instead, or something similar. We 
> may never call Teardown for non-local runners and FinishBundle doesn't have 
> the right granularity.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch master updated (ee99265 -> ec7e098)

2018-02-01 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from ee99265  Merge pull request #4566: Encourage a good description in a 
good spot on a PR description.
 add 9897be0  [BEAM-3562] Update to Checkstyle 8.7
 new ec7e098  Merge pull request #4522: [BEAM-3562] Update to Checkstyle 8.7

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 build_rules.gradle |  3 +++
 pom.xml| 11 +--

[beam] 01/01: Merge pull request #4522: [BEAM-3562] Update to Checkstyle 8.7

2018-02-01 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit ec7e098a4f3e0f38fca2299c30ba1a784ff5c07e
Merge: ee99265 9897be0
Author: Kenn Knowles 
AuthorDate: Thu Feb 1 11:44:00 2018 -0800

Merge pull request #4522: [BEAM-3562] Update to Checkstyle 8.7

 build_rules.gradle |  3 +++
 pom.xml| 11 +--
 .../translation/types/CoderTypeSerializerTest.java |  1 -
 .../runners/gearpump/GearpumpPipelineOptions.java  |  3 ---
 .../beam/runners/gearpump/GearpumpRunner.java  |  4 
 .../beam/runners/gearpump/TestGearpumpRunner.java  |  1 -
 .../CreateGearpumpPCollectionViewTranslator.java   |  1 -
 .../translators/GearpumpPipelineTranslator.java|  3 ---
 .../translators/ParDoMultiOutputTranslator.java|  2 --
 .../translators/ReadUnboundedTranslator.java   |  1 -
 .../gearpump/translators/TransformTranslator.java  |  1 -
 .../gearpump/translators/TranslationContext.java   |  3 ---
 .../translators/WindowAssignTranslator.java|  2 --
 .../translators/functions/DoFnFunction.java|  4 
 .../translators/io/BoundedSourceWrapper.java   |  1 -
 .../gearpump/translators/io/GearpumpSource.java|  2 --
 .../translators/io/UnboundedSourceWrapper.java |  1 -
 .../translators/utils/DoFnRunnerFactory.java   |  1 -
 .../translators/utils/NoOpStepContext.java |  1 -
 .../translators/utils/TranslatorUtils.java |  3 ---
 .../beam/runners/gearpump/PipelineOptionsTest.java |  3 +--
 .../FlattenPCollectionsTranslatorTest.java |  1 -
 .../translators/GroupByKeyTranslatorTest.java  |  2 --
 .../translators/io/GearpumpSourceTest.java |  2 --
 .../gearpump/translators/io/ValueSoureTest.java|  2 --
 .../translators/utils/TranslatorUtilsTest.java |  2 --
 .../beam/runners/dataflow/util/PackageUtil.java|  4 ++--
 .../src/main/resources/beam/checkstyle.xml | 22 +++---
 .../src/main/resources/beam/suppressions.xml   | 12 ++--
 sdks/java/core/pom.xml |  9 -
 .../java/org/apache/beam/sdk/util/CoderUtils.java  |  2 --
 .../org/apache/beam/sdk/io/FileBasedSinkTest.java  |  4 ++--
 .../beam/sdk/util/SerializableUtilsTest.java   |  1 -
 .../extensions/joinlibrary/OuterFullJoinTest.java  |  1 -
 .../extensions/sketching/SketchFrequencies.java|  2 --
 .../sketching/SketchFrequenciesTest.java   |  1 -
 sdks/java/extensions/sql/pom.xml   | 14 --
 .../sql/impl/interpreter/BeamSqlFnExecutor.java|  1 -
 .../interpreter/operator/BeamSqlPrimitive.java |  1 -
 .../date/BeamSqlDatetimeMinusExpression.java   |  2 --
 .../date/BeamSqlDatetimePlusExpression.java|  2 --
 .../date/BeamSqlIntervalMultiplyExpression.java|  2 --
 .../BeamSqlTimestampMinusIntervalExpression.java   |  1 -
 .../BeamSqlTimestampMinusTimestampExpression.java  |  1 -
 .../interpreter/operator/date/TimeUnitUtils.java   |  1 -
 .../reinterpret/BeamSqlReinterpretExpression.java  |  1 -
 .../reinterpret/ReinterpretConversion.java |  2 --
 .../extensions/sql/impl/utils/SqlTypeUtils.java|  2 --
 .../operator/BeamSqlReinterpretExpressionTest.java |  1 -
 .../date/BeamSqlDateExpressionTestBase.java|  1 -
 .../date/BeamSqlDatetimeMinusExpressionTest.java   |  1 -
 .../date/BeamSqlDatetimePlusExpressionTest.java|  1 -
 .../BeamSqlIntervalMultiplyExpressionTest.java |  1 -
 ...eamSqlTimestampMinusIntervalExpressionTest.java |  1 -
 ...amSqlTimestampMinusTimestampExpressionTest.java |  1 -
 .../operator/date/TimeUnitUtilsTest.java   | 19 +--
 .../DatetimeReinterpretConversionsTest.java|  1 -
 .../reinterpret/ReinterpretConversionTest.java |  2 --
 .../operator/reinterpret/ReinterpreterTest.java|  1 -
 .../BeamSqlDateFunctionsIntegrationTest.java   |  1 -
 .../apache/beam/sdk/io/amqp/AmqpMessageCoder.java  |  2 --
 .../io/amqp/AmqpMessageCoderProviderRegistrar.java |  2 --
 .../beam/sdk/io/amqp/AmqpMessageCoderTest.java |  2 --
 .../beam/sdk/io/cassandra/CassandraService.java|  1 -
 .../sdk/io/cassandra/CassandraServiceImplTest.java |  2 --
 .../sdk/io/cassandra/CassandraTestDataSet.java |  1 -
 .../beam/sdk/io/gcp/spanner/MutationGroup.java |  1 -
 .../io/gcp/bigtable/BigtableServiceImplTest.java   |  1 -
 .../beam/sdk/io/hbase/SerializableScanTest.java|  1 -
 .../java/org/apache/beam/sdk/io/jms/JmsIO.java |  1 +
 .../java/org/apache/beam/sdk/io/jms/JmsIOTest.java |  1 -
 .../beam/sdk/io/kafka/KafkaCheckpointMark.java |  1 -
 .../org/apache/beam/sdk/io/kafka/ProducerSpEL.java |  1 -
 .../beam/sdk/io/kinesis/AWSClientsProvider.java|  1 -
 .../sdk/io/kinesis/DynamicCheckpointGenerator.java |  2 --
 .../apache/beam/sdk/io/kinesis/KinesisReader.java  |  

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Dataflow #4858

2018-02-01 Thread Apache Jenkins Server
See 




[beam] branch master updated (a2bf73f -> ee99265)

2018-02-01 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from a2bf73f  Merge pull request #4560 [lint] import logging
 add 34eadc5  Encourage a good description in a good spot on a PR 
description.
 new ee99265  Merge pull request #4566: Encourage a good description in a 
good spot on a PR description.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/PULL_REQUEST_TEMPLATE.md | 13 ++---
 1 file changed, 10 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[jira] [Commented] (BEAM-3542) SamzaPipelineOptions probably shouldn't need maxSourceParallelism

2018-02-01 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349084#comment-16349084
 ] 

Kenneth Knowles commented on BEAM-3542:
---

[~chamikara] commented on the PR, and this is a good place to continue the 
discussion. Have you seen the thread about "hints" on the dev@ list? This 
sounds like it might be one of the use cases for that sort of thing.

> SamzaPipelineOptions probably shouldn't need maxSourceParallelism
> -
>
> Key: BEAM-3542
> URL: https://issues.apache.org/jira/browse/BEAM-3542
> Project: Beam
>  Issue Type: Bug
>  Components: runner-samza
>Reporter: Kenneth Knowles
>Priority: Minor
>
> Let's continue to examine and make sure the runner is using unbounded sources 
> in a Beam-ish consistent way. If it is necessary, that is OK too, but it 
> seemed there might be things to clarify since the code review.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4566: Encourage a good description in a good spot on a PR description.

2018-02-01 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit ee99265a445087256b6643f65c48ddef162fae3c
Merge: a2bf73f 34eadc5
Author: Kenn Knowles 
AuthorDate: Thu Feb 1 11:02:00 2018 -0800

Merge pull request #4566: Encourage a good description in a good spot on a 
PR description.

 .github/PULL_REQUEST_TEMPLATE.md | 13 ++---
 1 file changed, 10 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[jira] [Created] (BEAM-3594) Explain FlatMap

2018-02-01 Thread JIRA
María GH created BEAM-3594:
--

 Summary: Explain FlatMap
 Key: BEAM-3594
 URL: https://issues.apache.org/jira/browse/BEAM-3594
 Project: Beam
  Issue Type: Improvement
  Components: website
Reporter: María GH
Assignee: Reuven Lax


[https://beam.apache.org/documentation/programming-guide] needs to explain what 
FlatMap does. The first time it is used in a snippet, it happens without any 
previous explanation.

Right after, there is a snippet to replace it with beam.Map. 

Two things needed:

1- Preface the FlatMap with an explanation of its purpose.

2- Give an example where it is not replaceable by Map.
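A minimal plain-Python sketch of the distinction the ticket asks for (Beam's FlatMap flattens an iterable of outputs per element, while Map emits exactly one output per element; this is ordinary list-comprehension Python, not Beam API calls):

```python
lines = ["hello world", "beam"]

# Map-like: exactly one output element per input element, so splitting
# each line yields nested lists rather than individual words.
mapped = [line.split() for line in lines]

# FlatMap-like: each input may yield zero or more outputs, and the
# results are flattened into a single collection of words.
flattened = [word for line in lines for word in line.split()]
```

This is the case where FlatMap is not replaceable by Map: `beam.Map` with a splitting function would produce the nested form, which is presumably why the guide's replacement snippet only applies to one-output-per-input functions.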



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Comment Edited] (BEAM-3581) [SQL] Support for Non-ASCII chars is flaky

2018-02-01 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349074#comment-16349074
 ] 

Anton Kedin edited comment on BEAM-3581 at 2/1/18 6:37 PM:
---

Next step after [fixing the tests|https://github.com/apache/beam/pull/4564] we 
should document the current behavior:
 - Beam SQL parser tries to use UTF16;
 - unless system properties were overridden;
 - or unless you used some Calcite classes before using Beam which loaded the 
default charset;
 - except for Beam tests, which set the system properties to UTF16;

Then we need to upgrade to Calcite version that reads saffron.properties from 
resources, and use that instead of system properties.

And keep this jira open, or create another one for the follow up.


was (Author: kedin):
Next step after [fixing the tests|https://github.com/apache/beam/pull/4564] we 
should document the current behavior:
 - Beam SQL parser tries to use UTF16;
 - unless system properties were overridden;
 - or unless you used some Calcite classes before using Beam which loaded the 
default charset;
 - except for Beam tests, which set the system properties to UTF16;

Then we need to upgrade to Calcite version that reads saffron.properties from 
resources, and use that instead of system properties.

> [SQL] Support for Non-ASCII chars is flaky
> --
>
> Key: BEAM-3581
> URL: https://issues.apache.org/jira/browse/BEAM-3581
> Project: Beam
>  Issue Type: Bug
>  Components: dsl-sql
>Reporter: Anton Kedin
>Assignee: Anton Kedin
>Priority: Major
>
> Beam SQL overrides default charset that Calcite uses and sets it to UTF16. It 
> is done via system properties.
> Problem is that we do this only when it hasn't been set yet. So if system 
> property has been set to ISO-8859-1 (Calcite's default), then test runs will 
> fail when trying to encode characters not supported in that encoding.
> Solution:
>  - because it's a system property, we don't want to force override it;
>  - for the same reason we cannot set it for a specific query execution;
>  - we can expose a static method on BeamSql to override these properties if 
> explicitly requested;
>  - affected tests will explicitly override it;
>  - otherwise behavior will stay unchanged and we will respect defaults and 
> user settings;
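The set-only-if-unset behavior described in the bullets above, plus the explicit override the solution proposes, can be sketched in plain Python (the names here are hypothetical; Beam SQL's actual implementation is Java and uses JVM system properties):

```python
def apply_default_charset(props, charset="UTF-16"):
    # Only set the default when the user has not already chosen one,
    # mirroring "we don't want to force override a system property".
    props.setdefault("saffron.default.charset", charset)

def force_charset(props, charset):
    # Explicit opt-in override, as the proposed static method on BeamSql
    # would provide for the affected tests.
    props["saffron.default.charset"] = charset

# A user (or an earlier Calcite class load) already picked Calcite's default:
props = {"saffron.default.charset": "ISO-8859-1"}
apply_default_charset(props)    # respected: value stays ISO-8859-1
force_charset(props, "UTF-16")  # explicit override: value becomes UTF-16
```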



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3581) [SQL] Support for Non-ASCII chars is flaky

2018-02-01 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349074#comment-16349074
 ] 

Anton Kedin commented on BEAM-3581:
---

Next step after [fixing the tests|https://github.com/apache/beam/pull/4564] we 
should document the current behavior:
 - Beam SQL parser tries to use UTF16;
 - unless system properties were overridden;
 - or unless you used some Calcite classes before using Beam which loaded the 
default charset;
 - except for Beam tests, which set the system properties to UTF16;

Then we need to upgrade to Calcite version that reads saffron.properties from 
resources, and use that instead of system properties.

> [SQL] Support for Non-ASCII chars is flaky
> --
>
> Key: BEAM-3581
> URL: https://issues.apache.org/jira/browse/BEAM-3581
> Project: Beam
>  Issue Type: Bug
>  Components: dsl-sql
>Reporter: Anton Kedin
>Assignee: Anton Kedin
>Priority: Major
>
> Beam SQL overrides default charset that Calcite uses and sets it to UTF16. It 
> is done via system properties.
> Problem is that we do this only when it hasn't been set yet. So if system 
> property has been set to ISO-8859-1 (Calcite's default), then test runs will 
> fail when trying to encode characters not supported in that encoding.
> Solution:
>  - because it's a system property, we don't want to force override it;
>  - for the same reason we cannot set it for a specific query execution;
>  - we can expose a static method on BeamSql to override these properties if 
> explicitly requested;
>  - affected tests will explicitly override it;
>  - otherwise behavior will stay unchanged and we will respect defaults and 
> user settings;



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #862

2018-02-01 Thread Apache Jenkins Server
See 


Changes:

[github] Fix undefined names: exc_info --> self.exc_info

[lcwik] Change info to debug statement

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam4 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 51da92cce831755866e5c017802532929fdfd872 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 51da92cce831755866e5c017802532929fdfd872
Commit message: "Merge pull request #4559 from cclauss/patch-1"
 > git rev-list 82e5e944ba143c23acf377bfd7a850046be68ea7 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8399523810663267945.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6785706637888403514.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6583095736507094117.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1548845153372327499.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/75/d1/5abca4ccf61a7ab86c255dd315fb96e566fbf9b5d3a480e72c93e8ec2802/setuptools-38.4.0-py2.py3-none-any.whl#md5=a5c6620a59f19f2d5d32bdca18c7b47e
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins17278353873698400.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4052711741288962951.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to 

Jenkins build is back to normal : beam_PostCommit_Python_Verify #4118

2018-02-01 Thread Apache Jenkins Server
See 




[beam] branch master updated (51da92c -> a2bf73f)

2018-02-01 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 51da92c  Merge pull request #4559 from cclauss/patch-1
 add a71042a  import logging for line 1163
 new a2bf73f  Merge pull request #4560 [lint] import logging

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/transforms/trigger.py | 1 +
 1 file changed, 1 insertion(+)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[jira] [Commented] (BEAM-3542) SamzaPipelineOptions probably shouldn't need maxSourceParallelism

2018-02-01 Thread Xinyu Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16349013#comment-16349013
 ] 

Xinyu Liu commented on BEAM-3542:
-

Let me add more details about this: maxSourceParallelism is currently used 
directly as the desiredNumSplits to split an UnboundedSource (close to 
FlinkPipelineOptions.parallelism). Based on the number of splits, Samza will 
create the same number of tasks, which is the logical parallelism in Samza. The 
runtime parallelism in Samza is defined by container count, thread pool size 
and max concurrency within a task. So I picked the name "maxSourceParallelism" to 
distinguish it from the runtime parallelism. I put "max" as a prefix since the 
actual number of splits is equal to or lower than this number; e.g., splitting a 
Kafka topic of 10 partitions with 1000 desiredNumSplits will still yield 10 
splits. Not sure whether the naming is misleading. We can call it 
desiredParallelism or maxParallelism if that's better.

On the other hand, we do hope Beam can support the use case of having default 
splits without the need for desiredNumSplits. For most LinkedIn use cases this is 
very helpful, since users do not need to know the exact partition count of each 
of their topics in order to specify the desiredNumSplits. They can simply rely on 
the default splits to get the expected parallelism (assuming the default split 
count for a Kafka source is the number of partitions). Can we support this in 
Beam (I can create a ticket for it)?
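The clamping behavior described above (the desired split count can never exceed the number of partitions) reduces to a one-liner; `num_splits` is an illustrative name, not a Samza or Beam API:

```python
def num_splits(num_partitions, desired_num_splits):
    # Splitting a Kafka topic of 10 partitions with desiredNumSplits=1000
    # still produces only 10 splits, hence the "max" prefix in the option name.
    return min(num_partitions, desired_num_splits)
```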

> SamzaPipelineOptions probably shouldn't need maxSourceParallelism
> -
>
> Key: BEAM-3542
> URL: https://issues.apache.org/jira/browse/BEAM-3542
> Project: Beam
>  Issue Type: Bug
>  Components: runner-samza
>Reporter: Kenneth Knowles
>Priority: Minor
>
> Let's continue to examine and make sure the runner is using unbounded sources 
> in a Beam-ish consistent way. If it is necessary, that is OK too, but it 
> seemed there might be things to clarify since the code review.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3593) Remove methods that just call super()

2018-02-01 Thread Colm O hEigeartaigh (JIRA)
Colm O hEigeartaigh created BEAM-3593:
-

 Summary: Remove methods that just call super()
 Key: BEAM-3593
 URL: https://issues.apache.org/jira/browse/BEAM-3593
 Project: Beam
  Issue Type: Improvement
  Components: beam-model
Reporter: Colm O hEigeartaigh
Assignee: Colm O hEigeartaigh
 Fix For: 2.4.0






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch go-sdk updated (010272b -> f7e4e41)

2018-02-01 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a change to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 010272b  Merge master into go-sdk
 add 1f33d31  Fix beam.Combine to combine globally
 new f7e4e41  Merge pull request #4556: Fix beam.Combine to combine globally

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/go/pkg/beam/combine.go | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.


[beam] 01/01: Merge pull request #4556: Fix beam.Combine to combine globally

2018-02-01 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a commit to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git

commit f7e4e41041a2bac584770f38ec3b6f7c09ba2582
Merge: 010272b 1f33d31
Author: Thomas Groh 
AuthorDate: Thu Feb 1 09:48:02 2018 -0800

Merge pull request #4556: Fix beam.Combine to combine globally

[BEAM-3586] Fix beam.Combine to combine globally

 sdks/go/pkg/beam/combine.go | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.


[jira] [Resolved] (BEAM-3591) Undefined name: exc_info

2018-02-01 Thread Ahmet Altay (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ahmet Altay resolved BEAM-3591.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Undefined name: exc_info
> 
>
> Key: BEAM-3591
> URL: https://issues.apache.org/jira/browse/BEAM-3591
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: cclauss
>Assignee: Ahmet Altay
>Priority: Minor
> Fix For: 2.4.0
>
>   Original Estimate: 336h
>  Remaining Estimate: 336h
>
> __exc_info__ is an undefined name which might result in NameError being 
> raised instead of the desired error.  Proposed fix is 
> https://github.com/apache/beam/pull/4559
> flake8 testing of https://github.com/apache/beam
> $ __flake8 . --count --select=E901,E999,F821,F822,F823 --show-source 
> --statistics__
> ```
> ./sdks/python/apache_beam/runners/worker/data_plane.py:185:19: F821 undefined 
> name 'exc_info'
>     raise exc_info[0], exc_info[1], exc_info[2]
>   ^
> ./sdks/python/apache_beam/runners/worker/data_plane.py:185:32: F821 undefined 
> name 'exc_info'
>     raise exc_info[0], exc_info[1], exc_info[2]
>    ^
> ./sdks/python/apache_beam/runners/worker/data_plane.py:185:45: F821 undefined 
> name 'exc_info'
>     raise exc_info[0], exc_info[1], exc_info[2]
>     ^
> ```
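The pattern behind the fix: `sys.exc_info()` is captured on one code path and stored as `self.exc_info`, then re-raised later, so the bare local name `exc_info` in the `raise` statement was undefined. A hedged Python 3 sketch of that pattern with hypothetical names (the original Python 2 code used the three-argument `raise exc_info[0], exc_info[1], exc_info[2]` form):

```python
import sys

class Channel:
    """Captures an exception from one step and re-raises it for the caller."""

    def __init__(self):
        self.exc_info = None  # the fix: refer to the attribute, not a local

    def run(self, fn):
        try:
            fn()
        except Exception:
            self.exc_info = sys.exc_info()

    def check(self):
        if self.exc_info is not None:
            # Python 3 equivalent of the old `raise t, v, tb` form:
            raise self.exc_info[1].with_traceback(self.exc_info[2])

ch = Channel()
ch.run(lambda: 1 / 0)
try:
    ch.check()
    reraised = False
except ZeroDivisionError:
    reraised = True
```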



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3591) Undefined name: exc_info

2018-02-01 Thread Ahmet Altay (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348982#comment-16348982
 ] 

Ahmet Altay commented on BEAM-3591:
---

Thank you for noticing and fixing this. I merged 
https://github.com/apache/beam/pull/4559

> Undefined name: exc_info
> 
>
> Key: BEAM-3591
> URL: https://issues.apache.org/jira/browse/BEAM-3591
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: cclauss
>Assignee: Ahmet Altay
>Priority: Minor
>   Original Estimate: 336h
>  Remaining Estimate: 336h
>
> __exc_info__ is an undefined name which might result in NameError being 
> raised instead of the desired error.  Proposed fix is 
> https://github.com/apache/beam/pull/4559
> flake8 testing of https://github.com/apache/beam
> $ __flake8 . --count --select=E901,E999,F821,F822,F823 --show-source 
> --statistics__
> ```
> ./sdks/python/apache_beam/runners/worker/data_plane.py:185:19: F821 undefined 
> name 'exc_info'
>     raise exc_info[0], exc_info[1], exc_info[2]
>   ^
> ./sdks/python/apache_beam/runners/worker/data_plane.py:185:32: F821 undefined 
> name 'exc_info'
>     raise exc_info[0], exc_info[1], exc_info[2]
>    ^
> ./sdks/python/apache_beam/runners/worker/data_plane.py:185:45: F821 undefined 
> name 'exc_info'
>     raise exc_info[0], exc_info[1], exc_info[2]
>     ^
> ```



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch master updated (afa7e86 -> 51da92c)

2018-02-01 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from afa7e86  Change info to debug statement
 add 2fe7169  Fix undefined names: exc_info --> self.exc_info
 new 51da92c  Merge pull request #4559 from cclauss/patch-1

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/runners/worker/data_plane.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[beam] 01/01: Merge pull request #4559 from cclauss/patch-1

2018-02-01 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 51da92cce831755866e5c017802532929fdfd872
Merge: afa7e86 2fe7169
Author: Ahmet Altay 
AuthorDate: Thu Feb 1 09:40:27 2018 -0800

Merge pull request #4559 from cclauss/patch-1

Fix undefined names: exc_info --> self.exc_info

 sdks/python/apache_beam/runners/worker/data_plane.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Apex #3357

2018-02-01 Thread Apache Jenkins Server
See 




[beam] branch master updated: Change info to debug statement

2018-02-01 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new e1b6fb7  Change info to debug statement
 new afa7e86  Change info to debug statement
e1b6fb7 is described below

commit e1b6fb7f7cd3a8bdbfa1de0e9b091ef8ff155a35
Author: keithmcneill 
AuthorDate: Wed Jan 31 20:57:31 2018 -0500

Change info to debug statement

This log line creates a huge amount of noise in the logs. In addition, with 
Google charging for Stackdriver, I believe this will cost a lot of $$.
---
 .../src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIO.java   | 2 +-
 .../test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java   | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIO.java
index 1de30de..63138bb 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIO.java
@@ -716,7 +716,7 @@ public class BigtableIO {
   public void finishBundle() throws Exception {
 bigtableWriter.flush();
 checkForFailures();
-LOG.info("Wrote {} records", recordsWritten);
+LOG.debug("Wrote {} records", recordsWritten);
   }
 
   @Teardown
diff --git 
a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java
 
b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java
index efe6292..e1fab40 100644
--- 
a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java
+++ 
b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableIOTest.java
@@ -825,7 +825,7 @@ public class BigtableIOTest {
 .apply("write", defaultWrite.withTableId(table));
 p.run();
 
-logged.verifyInfo("Wrote 1 records");
+logged.verifyDebug("Wrote 1 records");
 
 assertEquals(1, service.tables.size());
 assertNotNull(service.getTable(table));
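The effect of the change, sketched with Python's logging module (the actual code above is Java/SLF4J): at the default INFO level, a per-bundle `debug` call is suppressed, so the record count no longer floods the logs while remaining available when debug logging is enabled.

```python
import io
import logging

stream = io.StringIO()
logging.basicConfig(stream=stream, level=logging.INFO, force=True)
log = logging.getLogger("BigtableIO")

records_written = 1_000_000
log.debug("Wrote %d records", records_written)  # suppressed at INFO level
log.info("finishBundle complete")               # still emitted

output = stream.getvalue()
```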

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[jira] [Commented] (BEAM-3471) Create a callback triggered at the end of a batch in spark runner

2018-02-01 Thread Etienne Chauchot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3471?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348768#comment-16348768
 ] 

Etienne Chauchot commented on BEAM-3471:


I tested using the {{SourceRdd.close()}} method for that, but the method does 
not seem to be called, at least for this test 
https://github.com/apache/beam/pull/4548/files#diff-3b688d41decddb7ef75c71683c261a59

> Create a callback triggered at the end of a batch in spark runner
> -
>
> Key: BEAM-3471
> URL: https://issues.apache.org/jira/browse/BEAM-3471
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-spark
>Reporter: Etienne Chauchot
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>
> In the future we might add new features to the runners for which we need to 
> do some processing at the end of a batch. Currently there is no single place 
> (a callback) where this processing can be added.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
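The ticket asks for a single hook that fires once per batch. A minimal sketch of such a callback registry in plain Java is below; the BatchRunner and BatchListener names are hypothetical and are not the Spark runner's actual classes.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the requested callback: one place where the runner
// fires registered hooks once a batch finishes. Illustrative only.
public class BatchRunner {
    public interface BatchListener {
        void onBatchCompleted(long batchId);
    }

    private final List<BatchListener> listeners = new ArrayList<>();

    public void register(BatchListener l) { listeners.add(l); }

    // The runner would call this from the single point that knows a batch ended.
    public void runBatch(long batchId, Runnable work) {
        work.run();
        for (BatchListener l : listeners) {
            l.onBatchCompleted(batchId);
        }
    }

    public static void main(String[] args) {
        BatchRunner runner = new BatchRunner();
        final long[] completed = {-1};
        runner.register(id -> completed[0] = id);
        runner.runBatch(42, () -> {});
        if (completed[0] != 42) throw new AssertionError("callback not fired");
        System.out.println("batch 42 completed");
    }
}
```

Concentrating end-of-batch work behind one interface avoids scattering cleanup logic across translators, which is the gap the ticket describes.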


Build failed in Jenkins: beam_PostCommit_Python_Verify #4117

2018-02-01 Thread Apache Jenkins Server
See 


--
[...truncated 1.45 MB...]
  "kind": "ParallelDo", 
  "name": "s14", 
  "properties": {
"display_data": [
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": ""
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {}, 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}, 
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}
  ], 
  "is_pair_like": true
}, 
{
  "@type": "kind:global_window"
}
  ], 
  "is_wrapper": true
}, 
"output_name": "out", 
"user_name": "write/Write/WriteImpl/Extract.out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s13"
}, 
"serialized_fn": "", 
"user_name": "write/Write/WriteImpl/Extract"
  }
}, 
{
  "kind": "CollectionToSingleton", 
  "name": "SideInput-s15", 
  "properties": {
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}, 
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}
  ], 
  "is_pair_like": true
}, 
{
  "@type": "kind:global_window"
}
  ], 
  "is_wrapper": true
}
  ]
}, 
"output_name": "out", 
"user_name": 
"write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s8"
}, 
"user_name": 
"write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)"
  }
}, 
{
  "kind": "CollectionToSingleton", 
  "name": "SideInput-s16", 
  "properties": {
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 

[jira] [Updated] (BEAM-3186) In-flight data loss when restoring from savepoint

2018-02-01 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré updated BEAM-3186:
---
Fix Version/s: 2.3.0

> In-flight data loss when restoring from savepoint
> -
>
> Key: BEAM-3186
> URL: https://issues.apache.org/jira/browse/BEAM-3186
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Dawid Wysakowicz
>Priority: Blocker
> Fix For: 2.3.0
>
> Attachments: restore_no_trigger.png, restore_with_trigger.png, 
> restore_with_trigger_b.png
>
>
> *The context:*
> I want to count how many events of a given type (A, B, etc.) I receive every 
> minute, using 1-minute windows and an AfterWatermark trigger with allowed 
> lateness of 1 minute.
> *Data loss case*
> In the case below, if at least one A element with an event time in the 
> window 14:00-14:01 is read from the Kinesis stream after the job is restored 
> from the savepoint, data loss will not be observed for this key and window.
> !restore_no_trigger.png!
> *No data loss case*
> However, if no new A element is read from the Kinesis stream, then data loss 
> is observable.
> !restore_with_trigger.png!
> *Workaround*
> As a workaround we could configure early firings every X seconds, which gives 
> up to X seconds of data loss per key on restore.
> *My guess where the issue might be*
> I believe this is a Beam-Flink integration layer bug. From my investigation I 
> don't think the KinesisReader is at fault or that it failed to advance the 
> watermark: after restoring from the savepoint I sent records for a different 
> key (B) in the same window shown in the pictures (14:00-14:01) without seeing 
> the trigger fire for the restored window and key A.
> My guess is that, after the job is restored, Beam doesn't register a Flink 
> event-time timer for the restored window unless a new element (key) arrives 
> for the restored window.
> Please refer to [this 
> gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
> test job that shows this behaviour.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
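The workaround described in the ticket (early firings every X seconds) corresponds to a Beam windowing configuration along these lines. This is a sketch against the Beam Java SDK, not the reporter's actual job; the 30-second delay is an illustrative choice of X, and `input` stands for whatever keyed PCollection the job windows.

```java
// Sketch of the ticket's workaround: 1-minute event-time windows with an
// AfterWatermark trigger, plus early firings every X seconds so that at most
// X seconds of data per key are lost on restore. Requires the Beam Java SDK.
input.apply(Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardMinutes(1)))
    .triggering(AfterWatermark.pastEndOfWindow()
        .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane()
            .plusDelayOf(Duration.standardSeconds(30))))  // X = 30s, illustrative
    .withAllowedLateness(Duration.standardMinutes(1))
    .accumulatingFiredPanes());
```

Early firings trade extra output panes for a bound on how much in-flight data a restore can drop, which is exactly the trade-off the reporter describes.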


[jira] [Updated] (BEAM-3186) In-flight data loss when restoring from savepoint

2018-02-01 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré updated BEAM-3186:
---
Priority: Blocker  (was: Major)

> In-flight data loss when restoring from savepoint
> -
>
> Key: BEAM-3186
> URL: https://issues.apache.org/jira/browse/BEAM-3186
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Dawid Wysakowicz
>Priority: Blocker
> Fix For: 2.3.0
>
> Attachments: restore_no_trigger.png, restore_with_trigger.png, 
> restore_with_trigger_b.png
>
>
> *The context:*
> I want to count how many events of a given type (A, B, etc.) I receive every 
> minute, using 1-minute windows and an AfterWatermark trigger with allowed 
> lateness of 1 minute.
> *Data loss case*
> In the case below, if at least one A element with an event time in the 
> window 14:00-14:01 is read from the Kinesis stream after the job is restored 
> from the savepoint, data loss will not be observed for this key and window.
> !restore_no_trigger.png!
> *No data loss case*
> However, if no new A element is read from the Kinesis stream, then data loss 
> is observable.
> !restore_with_trigger.png!
> *Workaround*
> As a workaround we could configure early firings every X seconds, which gives 
> up to X seconds of data loss per key on restore.
> *My guess where the issue might be*
> I believe this is a Beam-Flink integration layer bug. From my investigation I 
> don't think the KinesisReader is at fault or that it failed to advance the 
> watermark: after restoring from the savepoint I sent records for a different 
> key (B) in the same window shown in the pictures (14:00-14:01) without seeing 
> the trigger fire for the restored window and key A.
> My guess is that, after the job is restored, Beam doesn't register a Flink 
> event-time timer for the restored window unless a new element (key) arrives 
> for the restored window.
> Please refer to [this 
> gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
> test job that shows this behaviour.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3531) Nexmark failed with NPE with DEFAULT suite

2018-02-01 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3531?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-3531:
---
Summary: Nexmark failed with NPE with DEFAULT suite  (was: nexmark failed 
with NPE with DEFAULT suite)

> Nexmark failed with NPE with DEFAULT suite
> --
>
> Key: BEAM-3531
> URL: https://issues.apache.org/jira/browse/BEAM-3531
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0
>Reporter: Qi Cui
>Assignee: Qi Cui
>Priority: Major
> Fix For: 2.4.0
>
>
> @:~/src/beam/sdks/java/nexmark$ mvn exec:java 
> -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main -Pdirect-runner 
> -Dexec.args="--runner=DirectRunner \
> > --suite=DEFAULT --streaming=false --manageResources=false 
> > --monitorJobs=true --enforceEncodability=true --enforceImmutability=true"
> [INFO] Scanning for projects...
> [WARNING] 
> [WARNING] Some problems were encountered while building the effective model 
> for org.apache.beam:beam-sdks-java-nexmark:jar:2.3.0-SNAPSHOT
> [WARNING] The expression ${parent.version} is deprecated. Please use 
> ${project.parent.version} instead.
> [WARNING] 
> [WARNING] It is highly recommended to fix these problems because they 
> threaten the stability of your build.
> [WARNING] 
> [WARNING] For this reason, future Maven versions might no longer support 
> building such malformed projects.
> [WARNING] 
> [INFO] 
> 
> [INFO] Detecting the operating system and CPU architecture
> [INFO] 
> 
> [INFO] os.detected.name: linux
> [INFO] os.detected.arch: x86_64
> [INFO] os.detected.version: 4.13
> [INFO] os.detected.version.major: 4
> [INFO] os.detected.version.minor: 13
> [INFO] os.detected.release: ubuntu
> [INFO] os.detected.release.version: 17.10
> [INFO] os.detected.release.like.ubuntu: true
> [INFO] os.detected.release.like.debian: true
> [INFO] os.detected.classifier: linux-x86_64
> [WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, 
> no dependency information available
> [WARNING] Failed to retrieve plugin descriptor for 
> org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin 
> org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not 
> be resolved: Failure to find org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 in 
> https://repo.maven.apache.org/maven2 was cached in the local repository, 
> resolution will not be reattempted until the update interval of central has 
> elapsed or updates are forced
> [INFO] 
> [INFO] 
> 
> [INFO] Building Apache Beam :: SDKs :: Java :: Nexmark 2.3.0-SNAPSHOT
> [INFO] 
> 
> [WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, 
> no dependency information available
> [WARNING] Failed to retrieve plugin descriptor for 
> org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin 
> org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not 
> be resolved: Failure to find org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 in 
> https://repo.maven.apache.org/maven2 was cached in the local repository, 
> resolution will not be reattempted until the update interval of central has 
> elapsed or updates are forced
> [INFO] 
> [INFO] --- exec-maven-plugin:1.5.0:java (default-cli) @ 
> beam-sdks-java-nexmark ---
> 2018-01-25T02:08:13.056Z Running query:0
> ==
> Run started 2018-01-25T02:08:12.987Z and ran for PT0.249S
> Default configuration:
> {"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL","pubSubMode":"COMBINED","numEvents":10,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":1,"nextEventRate":1,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":10,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
> Configurations:
>  Conf Description
>   query:0
> Performance:
>  Conf Runtime(sec) (Baseline) Events(/sec) (Baseline) Results (Baseline)
>   *** not run ***
> 

[jira] [Created] (BEAM-3592) Spark-runner profile is broken on Nexmark after move to Spark 2.x

2018-02-01 Thread JIRA
Ismaël Mejía created BEAM-3592:
--

 Summary: Spark-runner profile is broken on Nexmark after move to 
Spark 2.x
 Key: BEAM-3592
 URL: https://issues.apache.org/jira/browse/BEAM-3592
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Affects Versions: 2.3.0
Reporter: Ismaël Mejía
Assignee: Ismaël Mejía


I found this issue while testing release 2.3.0 with Nexmark. It breaks because 
of a version conflict in the netty dependency.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3589) Flink runner breaks with ClassCastException on UnboundedSource

2018-02-01 Thread Aljoscha Krettek (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348631#comment-16348631
 ] 

Aljoscha Krettek commented on BEAM-3589:


[~grzegorz_kolakowski] I think I'm too tired today, but does 
https://github.com/apache/beam/pull/4558 actually remove that unnecessary 
{{(Read.Unbounded)}} cast? I couldn't find it anymore. 

> Flink runner breaks with ClassCastException on UnboundedSource
> --
>
> Key: BEAM-3589
> URL: https://issues.apache.org/jira/browse/BEAM-3589
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.3.0
>Reporter: Ismaël Mejía
>Assignee: Grzegorz Kołakowski
>Priority: Blocker
>
> When you execute a pipeline that uses an unbounded source and an empty 
> transform, it produces a ClassCastException:
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: java.lang.ClassCastException: 
> org.apache.beam.runners.core.construction.AutoValue_PTransformTranslation_UnknownRawPTransform
>  cannot be cast to org.apache.beam.sdk.io.Read$Unbounded
>     at 
> org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode
>  (FlinkStreamingTransformTranslators.java:256)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform
>  (FlinkStreamingPipelineTranslator.java:139)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform
>  (FlinkStreamingPipelineTranslator.java:118)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:670)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:662)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 
> (TransformHierarchy.java:311)
>     at org.apache.beam.sdk.runners.TransformHierarchy.visit 
> (TransformHierarchy.java:245)
>     at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:458)
>     at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate 
> (FlinkPipelineTranslator.java:38)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate 
> (FlinkStreamingPipelineTranslator.java:70)
>     at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate 
> (FlinkPipelineExecutionEnvironment.java:113)
>     at org.apache.beam.runners.flink.FlinkRunner.run (FlinkRunner.java:110)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:311)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:297)
>     at org.apache.beam.sdk.nexmark.NexmarkLauncher.run 
> (NexmarkLauncher.java:1139)
>     at org.apache.beam.sdk.nexmark.Main.runAll (Main.java:69)
>     at org.apache.beam.sdk.nexmark.Main.main (Main.java:301)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> {quote}
> You can reproduce it quickly by running this command from the nexmark 
> directory:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pflink-runner "-Dexec.args=--runner=FlinkRunner --query=2 --streaming=true 
> --manageResources=false --monitorJobs=true"
> {quote}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
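The failing line casts the transform to Read.Unbounded unconditionally, so an UnknownRawPTransform blows up with a ClassCastException. A guarded translation would check the runtime type first and fail with a descriptive error. The sketch below uses simplified stand-in classes, not Beam's real hierarchy.

```java
// Illustrative guard for the failing cast in ReadSourceTranslator: check the
// transform's runtime type before casting instead of casting unconditionally.
// PTransform, UnknownRawPTransform, and ReadUnbounded are stand-ins here.
public class TranslatorGuard {
    public interface PTransform {}
    public static class UnknownRawPTransform implements PTransform {}
    public static class ReadUnbounded implements PTransform {}

    public static String translate(PTransform transform) {
        if (!(transform instanceof ReadUnbounded)) {
            // Fail with a descriptive message rather than a ClassCastException.
            throw new IllegalArgumentException(
                "expected Read.Unbounded but got " + transform.getClass().getSimpleName());
        }
        ReadUnbounded read = (ReadUnbounded) transform;
        return "translated " + read.getClass().getSimpleName();
    }

    public static void main(String[] args) {
        System.out.println(translate(new ReadUnbounded()));
        try {
            translate(new UnknownRawPTransform());
            throw new AssertionError("should have rejected the unknown transform");
        } catch (IllegalArgumentException expected) {
            System.out.println("rejected: " + expected.getMessage());
        }
    }
}
```

A check like this turns the opaque ClassCastException in the report into an error that names the offending transform.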


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow #4857

2018-02-01 Thread Apache Jenkins Server
See 


--
[...truncated 23.49 MB...]
[INFO] 2018-02-01T13:42:44.689Z: (710339aae0f9e798): Executing operation 
PAssert$362/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$362/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$362/GroupGlobally/GroupDummyAndContents/Reify+PAssert$362/GroupGlobally/GroupDummyAndContents/Write
[INFO] 2018-02-01T13:42:44.722Z: (6cdafa3e6cdcef18): Executing operation 
PAssert$362/GroupGlobally/GatherAllOutputs/GroupByKey/Read+PAssert$362/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow+PAssert$362/GroupGlobally/GatherAllOutputs/Values/Values/Map+PAssert$362/GroupGlobally/RewindowActuals/Window.Assign+PAssert$362/GroupGlobally/KeyForDummy/AddKeys/Map+PAssert$362/GroupGlobally/GroupDummyAndContents/Reify+PAssert$362/GroupGlobally/GroupDummyAndContents/Write
[INFO] Success result for Dataflow job 
2018-02-01_05_39_54-17662899275506557860. Found 1 success, 0 failures out of 1 
expected assertions.
[INFO] 2018-02-01T13:42:47.062Z: (96dc4a2934b7c58f): Autoscaling: Raised the 
number of workers to 0 based on the rate of progress in the currently running 
step(s).
[INFO] 2018-02-01T13:42:46.978Z: (5fea8bcf45b9a1bf): Autoscaling: Raised the 
number of workers to 0 based on the rate of progress in the currently running 
step(s).
[INFO] 2018-02-01T13:42:47.683Z: (bb97d6414ee6b08e): Workers have started 
successfully.
[INFO] Job 2018-02-01_05_39_37-720619087247016162 finished with status DONE.
[INFO] Success result for Dataflow job 2018-02-01_05_39_37-720619087247016162. 
Found 2 success, 0 failures out of 2 expected assertions.
[INFO] Staging files complete: 0 files cached, 133 files newly uploaded
[INFO] Adding Create.Values/Read(CreateSource) as step s1
[INFO] Adding 
Distinct.WithRepresentativeValues/KeyByRepresentativeValue/AddKeys/Map as step 
s2
[INFO] Adding Distinct.WithRepresentativeValues/OneValuePerKey/GroupByKey as 
step s3
[INFO] Adding 
Distinct.WithRepresentativeValues/OneValuePerKey/Combine.GroupedValues as step 
s4
[INFO] Adding Distinct.WithRepresentativeValues/KeepFirstPane as step s5
[INFO] Adding PAssert$364/GroupGlobally/Window.Into()/Window.Assign as step s6
[INFO] Adding 
PAssert$364/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) as 
step s7
[INFO] Adding PAssert$364/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map 
as step s8
[INFO] Adding 
PAssert$364/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign as step 
s9
[INFO] Adding PAssert$364/GroupGlobally/GatherAllOutputs/GroupByKey as step s10
[INFO] Adding PAssert$364/GroupGlobally/GatherAllOutputs/Values/Values/Map as 
step s11
[INFO] Adding PAssert$364/GroupGlobally/RewindowActuals/Window.Assign as step 
s12
[INFO] Adding PAssert$364/GroupGlobally/KeyForDummy/AddKeys/Map as step s13
[INFO] Adding 
PAssert$364/GroupGlobally/RemoveActualsTriggering/Flatten.PCollections as step 
s14
[INFO] Adding PAssert$364/GroupGlobally/Create.Values/Read(CreateSource) as 
step s15
[INFO] Adding PAssert$364/GroupGlobally/WindowIntoDummy/Window.Assign as step 
s16
[INFO] Adding 
PAssert$364/GroupGlobally/RemoveDummyTriggering/Flatten.PCollections as step s17
[INFO] Adding PAssert$364/GroupGlobally/FlattenDummyAndContents as step s18
[INFO] Adding PAssert$364/GroupGlobally/NeverTrigger/Flatten.PCollections as 
step s19
[INFO] Adding PAssert$364/GroupGlobally/GroupDummyAndContents as step s20
[INFO] Adding PAssert$364/GroupGlobally/Values/Values/Map as step s21
[INFO] Adding PAssert$364/GroupGlobally/ParDo(Concat) as step s22
[INFO] Adding PAssert$364/GetPane/Map as step s23
[INFO] Adding PAssert$364/RunChecks as step s24
[INFO] Adding PAssert$364/VerifyAssertions/ParDo(DefaultConclude) as step s25
[INFO] Staging pipeline description to 
gs://temp-storage-for-validates-runner-tests//distincttest0testdistinctwithrepresentativevalue-jenkins-0201134148-e74d9dd6/output/results/staging/
[INFO] Uploading <53139 bytes, hash knbWjK5qNfDLTz4tOSFvUQ> to 
gs://temp-storage-for-validates-runner-tests//distincttest0testdistinctwithrepresentativevalue-jenkins-0201134148-e74d9dd6/output/results/staging/pipeline-knbWjK5qNfDLTz4tOSFvUQ.pb
[INFO] To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-02-01_05_42_52-18303468637152664280?project=apache-beam-testing
[INFO] To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
> --region=us-central1 2018-02-01_05_42_52-18303468637152664280
[INFO] Running Dataflow job 2018-02-01_05_42_52-18303468637152664280 with 1 
expected assertions.
[INFO] 2018-02-01T13:42:54.872Z: (710339aae0f9e2ac): Executing operation 
PAssert$362/GroupGlobally/GroupDummyAndContents/Close
[INFO] 2018-02-01T13:42:54.934Z: (6cdafa3e6cdce26a): Executing operation 

[jira] [Commented] (BEAM-3589) Flink runner breaks with ClassCastException on UnboundedSource

2018-02-01 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-3589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348582#comment-16348582
 ] 

Ismaël Mejía commented on BEAM-3589:


Great work [~grzegorz_kolakowski], your fix works perfectly. We should, however, 
get it merged and cherry-pick it into version 2.3.0. I will leave the ticket 
open until this is done.

> Flink runner breaks with ClassCastException on UnboundedSource
> --
>
> Key: BEAM-3589
> URL: https://issues.apache.org/jira/browse/BEAM-3589
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.3.0
>Reporter: Ismaël Mejía
>Assignee: Grzegorz Kołakowski
>Priority: Blocker
>
> When you execute a pipeline that uses an unbounded source and an empty 
> transform, it produces a ClassCastException:
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: java.lang.ClassCastException: 
> org.apache.beam.runners.core.construction.AutoValue_PTransformTranslation_UnknownRawPTransform
>  cannot be cast to org.apache.beam.sdk.io.Read$Unbounded
>     at 
> org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode
>  (FlinkStreamingTransformTranslators.java:256)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform
>  (FlinkStreamingPipelineTranslator.java:139)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform
>  (FlinkStreamingPipelineTranslator.java:118)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:670)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:662)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 
> (TransformHierarchy.java:311)
>     at org.apache.beam.sdk.runners.TransformHierarchy.visit 
> (TransformHierarchy.java:245)
>     at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:458)
>     at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate 
> (FlinkPipelineTranslator.java:38)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate 
> (FlinkStreamingPipelineTranslator.java:70)
>     at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate 
> (FlinkPipelineExecutionEnvironment.java:113)
>     at org.apache.beam.runners.flink.FlinkRunner.run (FlinkRunner.java:110)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:311)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:297)
>     at org.apache.beam.sdk.nexmark.NexmarkLauncher.run 
> (NexmarkLauncher.java:1139)
>     at org.apache.beam.sdk.nexmark.Main.runAll (Main.java:69)
>     at org.apache.beam.sdk.nexmark.Main.main (Main.java:301)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> {quote}
> You can reproduce it quickly by running this command from the nexmark 
> directory:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pflink-runner "-Dexec.args=--runner=FlinkRunner --query=2 --streaming=true 
> --manageResources=false --monitorJobs=true"
> {quote}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3186) In-flight data loss when restoring from savepoint

2018-02-01 Thread Pawel Bartoszek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pawel Bartoszek updated BEAM-3186:
--
Description: 
*The context:*

I want to count how many events of a given type (A, B, etc.) I receive every 
minute, using 1-minute windows and an AfterWatermark trigger with allowed 
lateness of 1 minute.

*Data loss case*
In the case below, if at least one A element with an event time in the window 
14:00-14:01 is read from the Kinesis stream after the job is restored from the 
savepoint, data loss will not be observed for this key and window.
!restore_with_trigger.png!

*No data loss case*
However, if no new A element is read from the Kinesis stream, then data loss is 
observable.

!restore_no_trigger.png!

*Workaround*
As a workaround we could configure early firings every X seconds, which gives up 
to X seconds of data loss per key on restore.

*My guess where the issue might be*

I believe this is a Beam-Flink integration layer bug. From my investigation I 
don't think the KinesisReader is at fault or that it failed to advance the 
watermark: after restoring from the savepoint I sent records for a different key 
(B) in the same window shown in the pictures (14:00-14:01) without seeing the 
trigger fire for the restored window and key A.

My guess is that, after the job is restored, Beam doesn't register a Flink 
event-time timer for the restored window unless a new element (key) arrives for 
the restored window.

Please refer to [this 
gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
test job that shows this behaviour.


  was:
*The context:*

I want to count how many events of a given type (A, B, etc.) I receive every 
minute, using 1-minute windows and an AfterWatermark trigger with allowed 
lateness of 1 minute.

*Data loss case*
In the case below, if at least one A element with an event time in the window 
14:00-14:01 is read from the Kinesis stream after the job is restored from the 
savepoint, data loss will not be observed for this key and window.
!restore_no_trigger.png!

*No data loss case*
However, if no new A element is read from the Kinesis stream, then data loss is 
observable.
!restore_with_trigger.png!

*Workaround*
As a workaround we could configure early firings every X seconds, which gives up 
to X seconds of data loss per key on restore.

*My guess where the issue might be*

I believe this is a Beam-Flink integration layer bug. From my investigation I 
don't think the KinesisReader is at fault or that it failed to advance the 
watermark: after restoring from the savepoint I sent records for a different key 
(B) in the same window shown in the pictures (14:00-14:01) without seeing the 
trigger fire for the restored window and key A.

My guess is that, after the job is restored, Beam doesn't register a Flink 
event-time timer for the restored window unless a new element (key) arrives for 
the restored window.

Please refer to [this 
gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
test job that shows this behaviour.



> In-flight data loss when restoring from savepoint
> -
>
> Key: BEAM-3186
> URL: https://issues.apache.org/jira/browse/BEAM-3186
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Dawid Wysakowicz
>Priority: Major
> Attachments: restore_no_trigger.png, restore_with_trigger.png, 
> restore_with_trigger_b.png
>
>
> *The context:*
> I want to count how many events of given type(A,B, etc) I receive every 
> minute using 1 minute windows and AfterWatermark trigger with allowed 
> lateness 1 min.
> *Data loss case*
> In the case below if there is at least one A element with the event time 
> belonging to the window 14:00-14:01 read from Kinesis stream after job is 
> restored from savepoint the data loss will not be observed for this key and 
> this window.
> !restore_with_trigger.png!
> *Not data loss case*
> However, if no new A element element is read from Kinesis stream than data 
> loss is observable.
> !restore_no_trigger.png!
> *Workaround*
> As a workaround we could configure early firings every X seconds which gives 
> up to X seconds data loss per key on restore.
> *My guess where the issue might be*
> I believe this is a bug in the Beam-Flink integration layer. From my investigation 
> I don't think it's the KinesisReader or that it couldn't advance the 
> watermark. To prove that, after I restored from a savepoint I sent some records 
> for a different key (B) for the same window as shown in the 
> pictures (14:00-14:01), without seeing the trigger fire for the restored window 
> and key A.
> My guess is that after the job is restored, Beam doesn't register a Flink event-time 
> timer for 

[jira] [Updated] (BEAM-3186) In-flight data loss when restoring from savepoint

2018-02-01 Thread Pawel Bartoszek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pawel Bartoszek updated BEAM-3186:
--
Description: 
*The context:*

I want to count how many events of a given type (A, B, etc.) I receive every minute 
using 1-minute windows and an AfterWatermark trigger with allowed lateness of 1 min.

*Data loss case*
In the case below, if at least one A element with an event time 
belonging to the window 14:00-14:01 is read from the Kinesis stream after the job is 
restored from a savepoint, then data loss is not observed for this key and 
this window.
!restore_no_trigger.png!

*Not data loss case*
However, if no new A element is read from the Kinesis stream, then data loss 
is observable.
!restore_with_trigger.png!



*Workaround*
As a workaround we could configure early firings every X seconds, which gives up 
to X seconds of data loss per key on restore.

*My guess where the issue might be*

I believe this is a bug in the Beam-Flink integration layer. From my investigation I 
don't think it's the KinesisReader or that it couldn't advance the 
watermark. To prove that, after I restored from a savepoint I sent some records for a 
different key (B) for the same window as shown in the pictures (14:00-14:01), 
without seeing the trigger fire for the restored window and key A.

My guess is that after the job is restored, Beam doesn't register a Flink event-time 
timer for the restored window unless a new element (key) arrives for the 
restored window.

Please refer to [this 
gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
test job that shows this behaviour.


  was:
*The context:*

I want to count how many events of a given type (A, B, etc.) I receive every minute 
using 1-minute windows and an AfterWatermark trigger with allowed lateness of 1 min.

*Data loss case*
In the case below, if at least one A element with an event time 
belonging to the window 14:00-14:01 is read from the Kinesis stream after the job is 
restored from a savepoint, then data loss is not observed for this key and 
this window.
!restore_with_trigger.png!

*Not data loss case*
However, if no new A element is read from the Kinesis stream, then data loss 
is observable.

!restore_no_trigger.png!

*Workaround*
As a workaround we could configure early firings every X seconds, which gives up 
to X seconds of data loss per key on restore.

*My guess where the issue might be*

I believe this is a bug in the Beam-Flink integration layer. From my investigation I 
don't think it's the KinesisReader or that it couldn't advance the 
watermark. To prove that, after I restored from a savepoint I sent some records for a 
different key (B) for the same window as shown in the pictures (14:00-14:01), 
without seeing the trigger fire for the restored window and key A.

My guess is that after the job is restored, Beam doesn't register a Flink event-time 
timer for the restored window unless a new element (key) arrives for the 
restored window.

Please refer to [this 
gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
test job that shows this behaviour.



> In-flight data loss when restoring from savepoint
> -
>
> Key: BEAM-3186
> URL: https://issues.apache.org/jira/browse/BEAM-3186
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Dawid Wysakowicz
>Priority: Major
> Attachments: restore_no_trigger.png, restore_with_trigger.png, 
> restore_with_trigger_b.png
>
>
> *The context:*
> I want to count how many events of a given type (A, B, etc.) I receive every 
> minute using 1-minute windows and an AfterWatermark trigger with allowed 
> lateness of 1 min.
> *Data loss case*
> In the case below, if at least one A element with an event time 
> belonging to the window 14:00-14:01 is read from the Kinesis stream after the job 
> is restored from a savepoint, then data loss is not observed for this key and 
> this window.
> !restore_no_trigger.png!
> *Not data loss case*
> However, if no new A element is read from the Kinesis stream, then data 
> loss is observable.
> !restore_with_trigger.png!
> *Workaround*
> As a workaround we could configure early firings every X seconds, which gives 
> up to X seconds of data loss per key on restore.
> *My guess where the issue might be*
> I believe this is a bug in the Beam-Flink integration layer. From my investigation 
> I don't think it's the KinesisReader or that it couldn't advance the 
> watermark. To prove that, after I restored from a savepoint I sent some records 
> for a different key (B) for the same window as shown in the 
> pictures (14:00-14:01), without seeing the trigger fire for the restored window 
> and key A.
> My guess is that after the job is restored, Beam doesn't register a Flink event-time 
> timer for 

[jira] [Commented] (BEAM-3590) Can't activate release profile from a module directory

2018-02-01 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-3590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348549#comment-16348549
 ] 

Jean-Baptiste Onofré commented on BEAM-3590:


Yes, I saw that. I'm going to improve this. Thanks!

> Can't activate release profile from a module directory
> --
>
> Key: BEAM-3590
> URL: https://issues.apache.org/jira/browse/BEAM-3590
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Affects Versions: 2.3.0
>Reporter: Ismaël Mejía
>Assignee: Jean-Baptiste Onofré
>Priority: Minor
>
> To run and test a specific module, e.g. JmsIO, a common pattern is to go to its
> directory and do:
> {quote}cd sdks/java/io/jms
> mvn clean verify -Prelease
> {quote}
> This worked before, but it breaks since the recent changes in the release profile.
> The detailed exception:
> {quote}[INFO] --- maven-assembly-plugin:3.1.0:single 
> (source-release-assembly) @ beam-sdks-java-io-jms ---
> [INFO] Reading assembly descriptor: assembly.xml
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time: 12.221 s
> [INFO] Finished at: 2018-01-31T22:03:27+01:00
> [INFO] Final Memory: 71M/1457M
> [INFO] 
> 
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-assembly-plugin:3.1.0:single 
> (source-release-assembly) on project beam-sdks-java-io-jms: Error reading 
> assemblies: Error locating assembly descriptor: assembly.xml
> [ERROR]
> [ERROR] [1] [INFO] Searching for file location: 
> /home/ismael/upstream/beam/sdks/java/io/jms/assembly.xml
> [ERROR]
> [ERROR] [2] [INFO] File: 
> /home/ismael/upstream/beam/sdks/java/io/jms/assembly.xml does not exist.
> [ERROR]
> [ERROR] [3] [INFO] File: 
> /home/ismael/upstream/beam/sdks/java/io/jms/assembly.xml does not exist.
> [ERROR] -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please 
> read the following articles:
> [ERROR] [Help 1] 
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> {quote}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-2806) support View.CreatePCollectionView in FlinkRunner

2018-02-01 Thread Aljoscha Krettek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2806?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aljoscha Krettek reassigned BEAM-2806:
--

Assignee: Grzegorz Kołakowski  (was: Aljoscha Krettek)

> support View.CreatePCollectionView in FlinkRunner
> -
>
> Key: BEAM-2806
> URL: https://issues.apache.org/jira/browse/BEAM-2806
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-flink
>Reporter: Xu Mingmin
>Assignee: Grzegorz Kołakowski
>Priority: Major
>
> Beam version: 2.2.0-SNAPSHOT
> Here's the code
> {code}
> PCollectionView> rowsView = rightRows
> .apply(View.asMultimap());
> {code}
> And exception when running with {{FlinkRunner}}:
> {code}
> Exception in thread "main" java.lang.UnsupportedOperationException: The 
> transform View.CreatePCollectionView is currently not supported.
>   at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform(FlinkStreamingPipelineTranslator.java:113)
>   at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:594)
>   at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:586)
>   at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:586)
>   at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:268)
>   at 
> org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:202)
>   at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:440)
>   at 
> org.apache.beam.runners.flink.FlinkPipelineTranslator.translate(FlinkPipelineTranslator.java:38)
>   at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate(FlinkStreamingPipelineTranslator.java:69)
>   at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:104)
>   at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:113)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:283)
> {code}





[jira] [Assigned] (BEAM-3589) Flink runner breaks with ClassCastException on UnboundedSource

2018-02-01 Thread Aljoscha Krettek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3589?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aljoscha Krettek reassigned BEAM-3589:
--

Assignee: Grzegorz Kołakowski

> Flink runner breaks with ClassCastException on UnboundedSource
> --
>
> Key: BEAM-3589
> URL: https://issues.apache.org/jira/browse/BEAM-3589
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.3.0
>Reporter: Ismaël Mejía
>Assignee: Grzegorz Kołakowski
>Priority: Blocker
>
> When you execute a pipeline that uses an unbounded source and an empty 
> transform, it produces a ClassCastException:
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: java.lang.ClassCastException: 
> org.apache.beam.runners.core.construction.AutoValue_PTransformTranslation_UnknownRawPTransform
>  cannot be cast to org.apache.beam.sdk.io.Read$Unbounded
>     at 
> org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode
>  (FlinkStreamingTransformTranslators.java:256)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform
>  (FlinkStreamingPipelineTranslator.java:139)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform
>  (FlinkStreamingPipelineTranslator.java:118)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:670)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:662)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 
> (TransformHierarchy.java:311)
>     at org.apache.beam.sdk.runners.TransformHierarchy.visit 
> (TransformHierarchy.java:245)
>     at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:458)
>     at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate 
> (FlinkPipelineTranslator.java:38)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate 
> (FlinkStreamingPipelineTranslator.java:70)
>     at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate 
> (FlinkPipelineExecutionEnvironment.java:113)
>     at org.apache.beam.runners.flink.FlinkRunner.run (FlinkRunner.java:110)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:311)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:297)
>     at org.apache.beam.sdk.nexmark.NexmarkLauncher.run 
> (NexmarkLauncher.java:1139)
>     at org.apache.beam.sdk.nexmark.Main.runAll (Main.java:69)
>     at org.apache.beam.sdk.nexmark.Main.main (Main.java:301)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> {quote}
> You can reproduce it quickly by running this command from the nexmark 
> directory:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pflink-runner "-Dexec.args=--runner=FlinkRunner --query=2 --streaming=true 
> --manageResources=false --monitorJobs=true"
> {quote}





[jira] [Assigned] (BEAM-3186) In-flight data loss when restoring from savepoint

2018-02-01 Thread Aljoscha Krettek (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aljoscha Krettek reassigned BEAM-3186:
--

Assignee: Dawid Wysakowicz  (was: Aljoscha Krettek)

> In-flight data loss when restoring from savepoint
> -
>
> Key: BEAM-3186
> URL: https://issues.apache.org/jira/browse/BEAM-3186
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Pawel Bartoszek
>Assignee: Dawid Wysakowicz
>Priority: Major
> Attachments: restore_no_trigger.png, restore_with_trigger.png, 
> restore_with_trigger_b.png
>
>
> *The context:*
> I want to count how many events of a given type (A, B, etc.) I receive every 
> minute using 1-minute windows and an AfterWatermark trigger with allowed 
> lateness of 1 min.
> *Data loss case*
> In the case below, if at least one A element with an event time 
> belonging to the window 14:00-14:01 is read from the Kinesis stream after the job 
> is restored from a savepoint, then data loss is not observed for this key and 
> this window.
> !restore_no_trigger.png!
> *Not data loss case*
> However, if no new A element is read from the Kinesis stream, then data 
> loss is observable.
> !restore_with_trigger.png!
> *Workaround*
> As a workaround we could configure early firings every X seconds, which gives 
> up to X seconds of data loss per key on restore.
> *My guess where the issue might be*
> I believe this is a bug in the Beam-Flink integration layer. From my investigation 
> I don't think it's the KinesisReader or that it couldn't advance the 
> watermark. To prove that, after I restored from a savepoint I sent some records 
> for a different key (B) for the same window as shown in the 
> pictures (14:00-14:01), without seeing the trigger fire for the restored window 
> and key A.
> My guess is that after the job is restored, Beam doesn't register a Flink event-time 
> timer for the restored window unless a new element (key) arrives for the 
> restored window.
> Please refer to [this 
> gist|https://gist.github.com/pbartoszek/7ab88c8b6538039db1b383358d1d1b5a] for 
> test job that shows this behaviour.





[jira] [Commented] (BEAM-3414) AfterProcessingTime trigger issue with Flink Runner

2018-02-01 Thread Aljoscha Krettek (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3414?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348542#comment-16348542
 ] 

Aljoscha Krettek commented on BEAM-3414:


Still working on getting that PR in, yes.

> AfterProcessingTime trigger issue with Flink Runner
> ---
>
> Key: BEAM-3414
> URL: https://issues.apache.org/jira/browse/BEAM-3414
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core, runner-flink
>Affects Versions: 2.2.0
> Environment: idea, ubuntu 16.04, FlinkRunner
>Reporter: huangjianhuang
>Assignee: Aljoscha Krettek
>Priority: Major
>
> In my demo, I read data from Kafka, count globally, and finally output the 
> total count of received data, as follows:
> {code:java}
> FlinkPipelineOptions options = 
> PipelineOptionsFactory.fromArgs(args).withValidation()
> .as(FlinkPipelineOptions.class);
> options.setStreaming(true);
> options.setRunner(FlinkRunner.class);
> Pipeline pipeline = Pipeline.create(options);
> pipeline
> .apply("Read from kafka",
> KafkaIO.<String, String>read()
> //.withTimestampFn(kafkaData -> 
> TimeUtil.timeMillisToInstant(kafkaData.getKey()))
> .withBootstrapServers("localhost:9092")
> .withTopic("recharge")
> .withKeyDeserializer(StringDeserializer.class)
> 
> .withValueDeserializer(StringDeserializer.class)
> .withoutMetadata()
> )
> .apply(Values.create())
> .apply(Window.into(new GlobalWindows())
> .triggering(Repeatedly.forever(
> 
> AfterProcessingTime.pastFirstElementInPane().plusDelayOf(Duration.standardSeconds(5
> .accumulatingFiredPanes()
> )
> .apply(Count.globally())
> .apply("output",
> ParDo.of(new DoFn<Long, Void>() {
> @ProcessElement
> public void process(ProcessContext context) {
> System.out.println("---get at: " + 
> Instant.now() + "--");
> System.out.println(context.element());
> }
> }));
> {code}
> the result should be displayed 5s after I send the first data, but sometimes 
> nothing was displayed after I sent data. The pic shows the outputs I got 
> in a test:
> (cant upload a pic, desc as text)
> {code:java}
> Send 681Msg at: 2018-01-05T06:34:31.436
>   ---get at: 2018-01-05T06:34:36.668Z--
>   681
> Send 681Msg at: 2018-01-05T06:34:47.166
>   ---get at: 2018-01-05T06:34:52.284Z--
>   1362
> Send 681Msg at: 2018-01-05T06:34:55.505
> Send 681Msg at: 2018-01-05T06:35:22.068
>   ---get at: 2018-01-05T06:35:22.112Z--
>   2044
> {code}
> BTW, the code works fine with the direct runner.





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex #3356

2018-02-01 Thread Apache Jenkins Server
See 


--
[...truncated 178.71 KB...]
2018-02-01T12:09:49.038 [INFO] Excluding 
com.google.inject.extensions:guice-servlet:jar:3.0 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding com.google.inject:guice:jar:3.0 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding javax.inject:javax.inject:jar:1 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding aopalliance:aopalliance:jar:1.0 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 
from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.sun.jersey.contribs:jersey-guice:jar:1.9 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding jline:jline:jar:2.11 from the shaded 
jar.
2018-02-01T12:09:49.038 [INFO] Excluding org.apache.ant:ant:jar:1.9.2 from the 
shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.2 
from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding net.engio:mbassador:jar:1.1.9 from the 
shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding net.lingala.zip4j:zip4j:jar:1.3.2 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding commons-codec:commons-codec:jar:1.10 
from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.apache.xbean:xbean-asm5-shaded:jar:4.3 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding org.jctools:jctools-core:jar:1.1 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.apache.beam:beam-model-pipeline:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.apache.beam:beam-sdks-java-core:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Including com.google.guava:guava:jar:20.0 in the 
shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1 from the shaded 
jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-core:jar:2.8.9 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding net.bytebuddy:byte-buddy:jar:1.6.8 
from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding org.apache.avro:avro:jar:1.8.2 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.thoughtworks.paranamer:paranamer:jar:2.7 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding org.tukaani:xz:jar:1.5 from the shaded 
jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.xerial.snappy:snappy-java:jar:1.1.4 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.apache.commons:commons-compress:jar:1.14 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:2.4.0-SNAPSHOT from the 
shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
org.apache.beam:beam-model-job-management:jar:2.4.0-SNAPSHOT from the shaded 
jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.google.protobuf:protobuf-java-util:jar:3.2.0 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding com.google.code.gson:gson:jar:2.7 from 
the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the 
shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding 
com.google.errorprone:error_prone_annotations:jar:2.0.15 from the shaded jar.
2018-02-01T12:09:49.038 [INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from 
the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding 
com.google.instrumentation:instrumentation-api:jar:0.3.0 from the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the 
shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding 
org.apache.beam:beam-runners-core-java:jar:2.4.0-SNAPSHOT from the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding 
org.apache.commons:commons-lang3:jar:3.6 from the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding 
com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding 
com.google.auto.service:auto-service:jar:1.0-rc2 from the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding com.google.auto:auto-common:jar:0.3 
from the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from 
the shaded jar.
2018-02-01T12:09:49.039 [INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 
from the shaded jar.
2018-02-01T12:09:51.047 [INFO] Replacing original artifact with shaded artifact.
2018-02-01T12:09:51.047 [INFO] Replacing 

Build failed in Jenkins: beam_PerformanceTests_Python #861

2018-02-01 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam8 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 82e5e944ba143c23acf377bfd7a850046be68ea7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 82e5e944ba143c23acf377bfd7a850046be68ea7
Commit message: "[BEAM-3249] Do not assume build directory is within build/, 
use the project defined build dir."
 > git rev-list 82e5e944ba143c23acf377bfd7a850046be68ea7 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7621189792462985900.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3528553918196377218.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6919199800811630340.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5374407727404169519.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/75/d1/5abca4ccf61a7ab86c255dd315fb96e566fbf9b5d3a480e72c93e8ec2802/setuptools-38.4.0-py2.py3-none-any.whl#md5=a5c6620a59f19f2d5d32bdca18c7b47e
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4244646903443425979.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7228499570969448497.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 

Jenkins build is back to normal : beam_PerformanceTests_TFRecordIOIT #88

2018-02-01 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-3591) Undefined name: exc_info

2018-02-01 Thread cclauss (JIRA)
cclauss created BEAM-3591:
-

 Summary: Undefined name: exc_info
 Key: BEAM-3591
 URL: https://issues.apache.org/jira/browse/BEAM-3591
 Project: Beam
  Issue Type: Bug
  Components: sdk-py-core
Reporter: cclauss
Assignee: Ahmet Altay


__exc_info__ is an undefined name, which might result in a NameError being raised 
instead of the desired error. The proposed fix is 
https://github.com/apache/beam/pull/4559

flake8 testing of https://github.com/apache/beam

$ __flake8 . --count --select=E901,E999,F821,F822,F823 --show-source 
--statistics__
```
./sdks/python/apache_beam/runners/worker/data_plane.py:185:19: F821 undefined 
name 'exc_info'
    raise exc_info[0], exc_info[1], exc_info[2]
  ^
./sdks/python/apache_beam/runners/worker/data_plane.py:185:32: F821 undefined 
name 'exc_info'
    raise exc_info[0], exc_info[1], exc_info[2]
   ^
./sdks/python/apache_beam/runners/worker/data_plane.py:185:45: F821 undefined 
name 'exc_info'
    raise exc_info[0], exc_info[1], exc_info[2]
    ^
```
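The actual fix is in the PR linked above and is not reproduced here; the snippet below is only a minimal, hypothetical sketch of the underlying pattern on Python 3: capture `sys.exc_info()` inside an `except` block and re-raise later with the original traceback. The function name `capture_and_reraise` and the provoked `KeyError` are illustrative.

```python
import sys


def capture_and_reraise():
    """Capture exc_info in an except block, then re-raise it later.

    Python 2 allowed `raise exc_info[0], exc_info[1], exc_info[2]`; the
    Python 3 spelling is `raise exc_info[1].with_traceback(exc_info[2])`.
    """
    exc_info = None
    try:
        {}["missing"]  # provoke a KeyError to have something to capture
    except KeyError:
        exc_info = sys.exc_info()  # (type, value, traceback) tuple
    # Later, outside the except block, re-raise with the original traceback.
    # Guarding on None avoids the NameError-style failure the issue describes.
    if exc_info is not None:
        raise exc_info[1].with_traceback(exc_info[2])
```

Note that the bug reported here is precisely that `exc_info` could be undefined at the raise site; guarding the re-raise (or using `six.reraise(*exc_info)` for 2/3-compatible code) avoids that.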





[jira] [Commented] (BEAM-3589) Flink runner breaks with ClassCastException on UnboundedSource

2018-02-01 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-3589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16348400#comment-16348400
 ] 

Grzegorz Kołakowski commented on BEAM-3589:
---

I believe the issue is fixed in https://issues.apache.org/jira/browse/BEAM-2806. 
Here is the corresponding pull request: 
https://github.com/apache/beam/pull/4558

> Flink runner breaks with ClassCastException on UnboundedSource
> --
>
> Key: BEAM-3589
> URL: https://issues.apache.org/jira/browse/BEAM-3589
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.3.0
>Reporter: Ismaël Mejía
>Priority: Blocker
>
> When you execute a pipeline that uses an unbounded source and an empty 
> transform, it produces a ClassCastException:
> {quote}[WARNING] 
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> Caused by: java.lang.ClassCastException: 
> org.apache.beam.runners.core.construction.AutoValue_PTransformTranslation_UnknownRawPTransform
>  cannot be cast to org.apache.beam.sdk.io.Read$Unbounded
>     at 
> org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode
>  (FlinkStreamingTransformTranslators.java:256)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform
>  (FlinkStreamingPipelineTranslator.java:139)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform
>  (FlinkStreamingPipelineTranslator.java:118)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:670)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:623)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:647)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
> (TransformHierarchy.java:662)
>     at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 
> (TransformHierarchy.java:311)
>     at org.apache.beam.sdk.runners.TransformHierarchy.visit 
> (TransformHierarchy.java:245)
>     at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:458)
>     at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate 
> (FlinkPipelineTranslator.java:38)
>     at 
> org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate 
> (FlinkStreamingPipelineTranslator.java:70)
>     at 
> org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate 
> (FlinkPipelineExecutionEnvironment.java:113)
>     at org.apache.beam.runners.flink.FlinkRunner.run (FlinkRunner.java:110)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:311)
>     at org.apache.beam.sdk.Pipeline.run (Pipeline.java:297)
>     at org.apache.beam.sdk.nexmark.NexmarkLauncher.run 
> (NexmarkLauncher.java:1139)
>     at org.apache.beam.sdk.nexmark.Main.runAll (Main.java:69)
>     at org.apache.beam.sdk.nexmark.Main.main (Main.java:301)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
>     at java.lang.Thread.run (Thread.java:748)
> {quote}
> You can reproduce it quickly by running this command from the nexmark 
> directory:
> {quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
> -Pflink-runner "-Dexec.args=--runner=FlinkRunner --query=2 --streaming=true 
> --manageResources=false --monitorJobs=true"
> {quote}





[jira] [Created] (BEAM-3590) Can't activate release profile from a module directory

2018-02-01 Thread JIRA
Ismaël Mejía created BEAM-3590:
--

 Summary: Can't activate release profile from a module directory
 Key: BEAM-3590
 URL: https://issues.apache.org/jira/browse/BEAM-3590
 Project: Beam
  Issue Type: Bug
  Components: build-system
Affects Versions: 2.3.0
Reporter: Ismaël Mejía
Assignee: Jean-Baptiste Onofré


To run and test a specific module, e.g. JmsIO a common pattern is to go to the
directory and do:
{quote}cd sdks/java/io/jms
mvn clean verify -Prelease
{quote}
This worked before but has been broken since the recent changes in the release profile.
The detailed exception:
{quote}[INFO] --- maven-assembly-plugin:3.1.0:single (source-release-assembly) 
@ beam-sdks-java-io-jms ---
[INFO] Reading assembly descriptor: assembly.xml
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 12.221 s
[INFO] Finished at: 2018-01-31T22:03:27+01:00
[INFO] Final Memory: 71M/1457M
[INFO] 
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-assembly-plugin:3.1.0:single 
(source-release-assembly) on project beam-sdks-java-io-jms: Error reading 
assemblies: Error locating assembly descriptor: assembly.xml
[ERROR]
[ERROR] [1] [INFO] Searching for file location: 
/home/ismael/upstream/beam/sdks/java/io/jms/assembly.xml
[ERROR]
[ERROR] [2] [INFO] File: 
/home/ismael/upstream/beam/sdks/java/io/jms/assembly.xml does not exist.
[ERROR]
[ERROR] [3] [INFO] File: 
/home/ismael/upstream/beam/sdks/java/io/jms/assembly.xml does not exist.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
{quote}
 





[jira] [Created] (BEAM-3589) Flink runner breaks with ClassCastException on UnboundedSource

2018-02-01 Thread JIRA
Ismaël Mejía created BEAM-3589:
--

 Summary: Flink runner breaks with ClassCastException on 
UnboundedSource
 Key: BEAM-3589
 URL: https://issues.apache.org/jira/browse/BEAM-3589
 Project: Beam
  Issue Type: Bug
  Components: runner-flink
Affects Versions: 2.3.0
Reporter: Ismaël Mejía


When you execute a pipeline that uses an unbounded source and an empty transform, 
it produces a ClassCastException:
{quote}[WARNING] 
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke 
(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke 
(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
    at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.ClassCastException: 
org.apache.beam.runners.core.construction.AutoValue_PTransformTranslation_UnknownRawPTransform
 cannot be cast to org.apache.beam.sdk.io.Read$Unbounded
    at 
org.apache.beam.runners.flink.FlinkStreamingTransformTranslators$ReadSourceTranslator.translateNode
 (FlinkStreamingTransformTranslators.java:256)
    at 
org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.applyStreamingTransform
 (FlinkStreamingPipelineTranslator.java:139)
    at 
org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.visitPrimitiveTransform
 (FlinkStreamingPipelineTranslator.java:118)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
(TransformHierarchy.java:670)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
(TransformHierarchy.java:647)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
(TransformHierarchy.java:623)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
(TransformHierarchy.java:623)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
(TransformHierarchy.java:647)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit 
(TransformHierarchy.java:662)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 
(TransformHierarchy.java:311)
    at org.apache.beam.sdk.runners.TransformHierarchy.visit 
(TransformHierarchy.java:245)
    at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:458)
    at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate 
(FlinkPipelineTranslator.java:38)
    at org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator.translate 
(FlinkStreamingPipelineTranslator.java:70)
    at 
org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate 
(FlinkPipelineExecutionEnvironment.java:113)
    at org.apache.beam.runners.flink.FlinkRunner.run (FlinkRunner.java:110)
    at org.apache.beam.sdk.Pipeline.run (Pipeline.java:311)
    at org.apache.beam.sdk.Pipeline.run (Pipeline.java:297)
    at org.apache.beam.sdk.nexmark.NexmarkLauncher.run 
(NexmarkLauncher.java:1139)
    at org.apache.beam.sdk.nexmark.Main.runAll (Main.java:69)
    at org.apache.beam.sdk.nexmark.Main.main (Main.java:301)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke 
(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke 
(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:294)
    at java.lang.Thread.run (Thread.java:748)
{quote}
You can reproduce it quickly by running this command from the nexmark directory:
{quote}mvn exec:java -Dexec.mainClass=org.apache.beam.sdk.nexmark.Main 
-Pflink-runner "-Dexec.args=--runner=FlinkRunner --query=2 --streaming=true 
--manageResources=false --monitorJobs=true"
{quote}





Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #4856

2018-02-01 Thread Apache Jenkins Server
See