[GitHub] beam pull request #3326: Beam-2383 Round function

2017-06-07 Thread app-tarush
GitHub user app-tarush opened a pull request:

https://github.com/apache/beam/pull/3326

Beam-2383 Round function

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/app-tarush/beam beam-2383-round-func

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3326.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3326


commit 274b869be4b28290b285aac20afc17e8fff91907
Author: tarushapptech 
Date:   2017-06-08T06:46:01Z

Committing changes for round function




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Build failed in Jenkins: beam_PostCommit_Python_Verify #2439

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[robertwb] Add coder info to pubsub io

--
[...truncated 503.34 KB...]
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-36.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.1.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:135:
 UserWarning: Using fallback coder for typehint: List[Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:135:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-36.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.1.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:135:
 UserWarning: Using fallback coder for typehint: Union[Tuple[str, None], 
Tuple[str, int]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:135:
 UserWarning: Using fallback coder for typehint: List[Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:135:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning


Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 8 (on Ubuntu only),beam #59

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[ccy] Remove support for NativeSinks from the Python DirectRunner

[lcwik] Make BytesCoder to be a known type

[robertwb] Add coder info to pubsub io

--
[...truncated 1.26 MB...]
2017-06-08\T\06:36:20.340 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/clapper/grizzled-slf4j_2.10/1.0.2/grizzled-slf4j_2.10-1.0.2.pom
2017-06-08\T\06:36:20.366 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/clapper/grizzled-slf4j_2.10/1.0.2/grizzled-slf4j_2.10-1.0.2.pom
 (2 KB at 71.9 KB/sec)
2017-06-08\T\06:36:20.368 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.3/scala-library-2.10.3.pom
2017-06-08\T\06:36:20.395 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.3/scala-library-2.10.3.pom
 (2 KB at 73.9 KB/sec)
2017-06-08\T\06:36:20.397 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
2017-06-08\T\06:36:20.424 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
 (2 KB at 60.6 KB/sec)
2017-06-08\T\06:36:20.427 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
2017-06-08\T\06:36:20.454 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
 (3 KB at 85.8 KB/sec)
2017-06-08\T\06:36:20.456 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
2017-06-08\T\06:36:20.484 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
 (2 KB at 71.3 KB/sec)
2017-06-08\T\06:36:20.487 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
2017-06-08\T\06:36:20.513 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
 (2 KB at 73.6 KB/sec)
2017-06-08\T\06:36:20.515 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
2017-06-08\T\06:36:20.542 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
 (5 KB at 156.2 KB/sec)
2017-06-08\T\06:36:20.548 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
2017-06-08\T\06:36:20.577 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
 (7 KB at 207.9 KB/sec)
2017-06-08\T\06:36:20.584 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
2017-06-08\T\06:36:20.610 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
 (4 KB at 121.7 KB/sec)
2017-06-08\T\06:36:20.612 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom
2017-06-08\T\06:36:20.642 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom (26 
KB at 835.5 KB/sec)
2017-06-08\T\06:36:20.644 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom
2017-06-08\T\06:36:20.675 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom (14 KB at 
442.7 KB/sec)
2017-06-08\T\06:36:20.698 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
2017-06-08\T\06:36:20.724 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
 (5 KB at 178.1 KB/sec)
2017-06-08\T\06:36:20.725 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
2017-06-08\T\06:36:20.751 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
 (2 KB at 54.1 KB/sec)
2017-06-08\T\06:36:20.756 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
2017-06-08\T\06:36:20.783 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
 (4 KB at 114.6 KB/sec)
2017-06-08\T\06:36:20.789 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.12.0/cur

[GitHub] beam pull request #3325: Use inner module for non-public coders.

2017-06-07 Thread robertwb
GitHub user robertwb opened a pull request:

https://github.com/apache/beam/pull/3325

Use inner module for non-public coders.


---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/robertwb/incubator-beam interval-window

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3325.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3325


commit f15518e13f53890a7bb8e087c7e4e71d81bc340a
Author: Robert Bradshaw 
Date:   2017-06-08T06:35:11Z

Use inner module for non-public coders.




---


Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #107

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[ccy] Remove support for NativeSinks from the Python DirectRunner

[lcwik] Make BytesCoder to be a known type

[robertwb] Add coder info to pubsub io

--
[...truncated 2.55 MB...]
2017-06-08T06:31:59.021 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-slf4j/1.8/gossip-slf4j-1.8.pom
 (2 KB at 160.8 KB/sec)
2017-06-08T06:31:59.025 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
2017-06-08T06:31:59.035 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
 (12 KB at 1125.6 KB/sec)
2017-06-08T06:31:59.038 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
2017-06-08T06:31:59.049 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
 (13 KB at 1166.7 KB/sec)
2017-06-08T06:31:59.053 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
2017-06-08T06:31:59.062 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
 (3 KB at 241.4 KB/sec)
2017-06-08T06:31:59.066 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
2017-06-08T06:31:59.074 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
 (2 KB at 196.7 KB/sec)
2017-06-08T06:31:59.078 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
2017-06-08T06:31:59.088 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
 (6 KB at 525.0 KB/sec)
2017-06-08T06:31:59.091 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
2017-06-08T06:31:59.099 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
 (3 KB at 311.9 KB/sec)
2017-06-08T06:31:59.103 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
2017-06-08T06:31:59.111 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
 (4 KB at 428.3 KB/sec)
2017-06-08T06:31:59.116 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
2017-06-08T06:31:59.122 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
 (2 KB at 167.2 KB/sec)
2017-06-08T06:31:59.127 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
2017-06-08T06:31:59.138 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
 (2 KB at 154.2 KB/sec)
2017-06-08T06:31:59.143 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
2017-06-08T06:31:59.150 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
 (2 KB at 252.0 KB/sec)
2017-06-08T06:31:59.155 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
2017-06-08T06:31:59.163 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
 (3 KB at 331.8 KB/sec)
2017-06-08T06:31:59.168 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
2017-06-08T06:31:59.176 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
 (18 KB at 2202.0 KB/sec)
2017-06-08T06:31:59.180 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom
2017-06-08T06:31:59.188 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom (10 
KB at 1178.5 KB/sec)
2017-06-08T06:31:59.192 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
2017-06-08T06:31:59.199 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
 (5 KB at 636.3 KB/sec)
2017-06-08T06:31:59.203 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-launcher/1.8.4/ant-launcher-1.8.4.pom
2017-06-08T06:31:59.211 [INFO] Downloaded: 
https://repo.ma

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #59

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[ccy] Remove support for NativeSinks from the Python DirectRunner

[lcwik] Make BytesCoder to be a known type

[robertwb] Add coder info to pubsub io

--
[...truncated 1.25 MB...]
2017-06-08\T\06:31:21.007 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
2017-06-08\T\06:31:21.032 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
 (2 KB at 65.4 KB/sec)
2017-06-08\T\06:31:21.033 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
2017-06-08\T\06:31:21.058 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
 (3 KB at 92.7 KB/sec)
2017-06-08\T\06:31:21.059 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
2017-06-08\T\06:31:21.085 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
 (2 KB at 79.8 KB/sec)
2017-06-08\T\06:31:21.086 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
2017-06-08\T\06:31:21.111 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
 (2 KB at 79.5 KB/sec)
2017-06-08\T\06:31:21.112 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
2017-06-08\T\06:31:21.137 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
 (5 KB at 168.7 KB/sec)
2017-06-08\T\06:31:21.142 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
2017-06-08\T\06:31:21.169 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
 (7 KB at 223.3 KB/sec)
2017-06-08\T\06:31:21.173 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
2017-06-08\T\06:31:21.198 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
 (4 KB at 126.6 KB/sec)
2017-06-08\T\06:31:21.200 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom
2017-06-08\T\06:31:21.226 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom (26 
KB at 964.1 KB/sec)
2017-06-08\T\06:31:21.228 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom
2017-06-08\T\06:31:21.253 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom (14 KB at 
549.0 KB/sec)
2017-06-08\T\06:31:21.265 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
2017-06-08\T\06:31:21.291 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
 (5 KB at 178.1 KB/sec)
2017-06-08\T\06:31:21.292 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
2017-06-08\T\06:31:21.318 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
 (2 KB at 54.1 KB/sec)
2017-06-08\T\06:31:21.321 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
2017-06-08\T\06:31:21.347 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
 (4 KB at 123.4 KB/sec)
2017-06-08\T\06:31:21.352 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.12.0/curator-test-2.12.0.pom
2017-06-08\T\06:31:21.376 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.12.0/curator-test-2.12.0.pom
 (5 KB at 168.7 KB/sec)
2017-06-08\T\06:31:21.378 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/apache-curator/2.12.0/apache-curator-2.12.0.pom
2017-06-08\T\06:31:21.405 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/apache-curator/2.12.0/apache-curator-2.12.0.pom
 (32 KB at 1169.3 KB/sec)
2017-06-08\T\06:31:21.408 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/zookeeper/zookeeper/3.4.8/zookeeper-3.4

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #59

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[ccy] Remove support for NativeSinks from the Python DirectRunner

[lcwik] Make BytesCoder to be a known type

[robertwb] Add coder info to pubsub io

--
[...truncated 1.26 MB...]
2017-06-08\T\06:30:11.050 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/clapper/grizzled-slf4j_2.10/1.0.2/grizzled-slf4j_2.10-1.0.2.pom
2017-06-08\T\06:30:11.077 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/clapper/grizzled-slf4j_2.10/1.0.2/grizzled-slf4j_2.10-1.0.2.pom
 (2 KB at 69.3 KB/sec)
2017-06-08\T\06:30:11.078 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.3/scala-library-2.10.3.pom
2017-06-08\T\06:30:11.104 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.3/scala-library-2.10.3.pom
 (2 KB at 76.8 KB/sec)
2017-06-08\T\06:30:11.105 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
2017-06-08\T\06:30:11.131 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
 (2 KB at 62.9 KB/sec)
2017-06-08\T\06:30:11.132 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
2017-06-08\T\06:30:11.157 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
 (3 KB at 92.7 KB/sec)
2017-06-08\T\06:30:11.159 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
2017-06-08\T\06:30:11.184 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
 (2 KB at 79.8 KB/sec)
2017-06-08\T\06:30:11.186 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
2017-06-08\T\06:30:11.211 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
 (2 KB at 76.4 KB/sec)
2017-06-08\T\06:30:11.213 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
2017-06-08\T\06:30:11.238 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
 (5 KB at 168.7 KB/sec)
2017-06-08\T\06:30:11.242 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
2017-06-08\T\06:30:11.274 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
 (7 KB at 188.4 KB/sec)
2017-06-08\T\06:30:11.278 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
2017-06-08\T\06:30:11.304 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
 (4 KB at 121.7 KB/sec)
2017-06-08\T\06:30:11.306 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom
2017-06-08\T\06:30:11.333 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom (26 
KB at 895.2 KB/sec)
2017-06-08\T\06:30:11.334 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom
2017-06-08\T\06:30:11.361 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom (14 KB at 
508.3 KB/sec)
2017-06-08\T\06:30:11.373 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
2017-06-08\T\06:30:11.399 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
 (5 KB at 178.1 KB/sec)
2017-06-08\T\06:30:11.400 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
2017-06-08\T\06:30:11.427 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
 (2 KB at 52.1 KB/sec)
2017-06-08\T\06:30:11.430 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
2017-06-08\T\06:30:11.459 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
 (4 KB at 110.7 KB/sec)
2017-06-08\T\06:30:11.464 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.12.0/cur

[GitHub] beam pull request #3301: Beam-2383 round function

2017-06-07 Thread app-tarush
Github user app-tarush closed the pull request at:

https://github.com/apache/beam/pull/3301


---


[GitHub] beam pull request #3324: Avoid flakiness in data channel for empty streams.

2017-06-07 Thread robertwb
GitHub user robertwb opened a pull request:

https://github.com/apache/beam/pull/3324

Avoid flakiness in data channel for empty streams.

As an empty stream is used as the end-of-stream marker, never send
it as the data itself.
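The rule described above can be sketched as a generator that reserves the empty chunk purely as a terminator (a minimal standalone illustration, not the actual Beam data-plane code; the function name is hypothetical):

```python
def send_stream(chunks):
    """Yield data chunks, never emitting b"" as data.

    The empty chunk is reserved as the end-of-stream marker, so any
    empty payloads in the input are skipped rather than sent.
    """
    for chunk in chunks:
        if chunk:  # skip empty payloads; b"" must only mean end-of-stream
            yield chunk
    yield b""  # explicit end-of-stream marker
```

A receiver can then unambiguously stop at the first empty chunk without ever confusing it with user data.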


---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/robertwb/incubator-beam data-plane

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3324.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3324


commit ce8e87c028eb90c18319126fec1d6f1d54af8ab4
Author: Robert Bradshaw 
Date:   2017-06-08T06:00:43Z

Avoid flakiness in data channel for empty streams.

As an empty stream is used as the end-of-stream marker, never send
it as the data itself.




---


[GitHub] beam pull request #3317: python pubsub io fixes for streaming

2017-06-07 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3317


---


[1/2] beam git commit: Closes #3317

2017-06-07 Thread robertwb
Repository: beam
Updated Branches:
  refs/heads/master 0a0a1bc74 -> 86e1fab69


Closes #3317


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/86e1fab6
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/86e1fab6
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/86e1fab6

Branch: refs/heads/master
Commit: 86e1fab69f296247624d7799f151ff25c3b7243a
Parents: 0a0a1bc b5852d2
Author: Robert Bradshaw 
Authored: Wed Jun 7 22:55:00 2017 -0700
Committer: Robert Bradshaw 
Committed: Wed Jun 7 22:55:00 2017 -0700

--
 sdks/python/apache_beam/io/gcp/pubsub.py| 32 +++-
 sdks/python/apache_beam/io/gcp/pubsub_test.py   | 28 +++--
 .../runners/dataflow/dataflow_runner.py | 23 ++
 3 files changed, 67 insertions(+), 16 deletions(-)
--




[2/2] beam git commit: Add coder info to pubsub io

2017-06-07 Thread robertwb
Add coder info to pubsub io


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/b5852d21
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/b5852d21
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/b5852d21

Branch: refs/heads/master
Commit: b5852d212cab060321c43a5800f8585aa3649aec
Parents: 0a0a1bc
Author: Vikas Kedigehalli 
Authored: Wed Jun 7 16:28:18 2017 -0700
Committer: Robert Bradshaw 
Committed: Wed Jun 7 22:55:00 2017 -0700

--
 sdks/python/apache_beam/io/gcp/pubsub.py| 32 +++-
 sdks/python/apache_beam/io/gcp/pubsub_test.py   | 28 +++--
 .../runners/dataflow/dataflow_runner.py | 23 ++
 3 files changed, 67 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/b5852d21/sdks/python/apache_beam/io/gcp/pubsub.py
--
diff --git a/sdks/python/apache_beam/io/gcp/pubsub.py 
b/sdks/python/apache_beam/io/gcp/pubsub.py
index 1ba8ac0..40326e1 100644
--- a/sdks/python/apache_beam/io/gcp/pubsub.py
+++ b/sdks/python/apache_beam/io/gcp/pubsub.py
@@ -40,13 +40,15 @@ __all__ = ['ReadStringsFromPubSub', 'WriteStringsToPubSub',
 class ReadStringsFromPubSub(PTransform):
   """A ``PTransform`` for reading utf-8 string payloads from Cloud Pub/Sub."""
 
-  def __init__(self, topic, subscription=None, id_label=None):
+  def __init__(self, topic=None, subscription=None, id_label=None):
 """Initializes ``ReadStringsFromPubSub``.
 
 Attributes:
-  topic: Cloud Pub/Sub topic in the form "/topics//".
-  subscription: Optional existing Cloud Pub/Sub subscription to use in the
-form "projects//subscriptions/".
+  topic: Cloud Pub/Sub topic in the form "/topics//". If
+provided then subscription must be None.
+  subscription: Existing Cloud Pub/Sub subscription to use in the
+form "projects//subscriptions/". If provided then
+topic must be None.
   id_label: The attribute on incoming Pub/Sub messages to use as a unique
 record identifier.  When specified, the value of this attribute (which
 can be any string that uniquely identifies the record) will be used for
@@ -55,6 +57,12 @@ class ReadStringsFromPubSub(PTransform):
 case, deduplication of the stream will be strictly best effort.
 """
 super(ReadStringsFromPubSub, self).__init__()
+if topic and subscription:
+  raise ValueError("Only one of topic or subscription should be provided.")
+
+if not (topic or subscription):
+  raise ValueError("Either a topic or subscription must be provided.")
+
 self._source = _PubSubPayloadSource(
 topic,
 subscription=subscription,
@@ -90,9 +98,11 @@ class _PubSubPayloadSource(dataflow_io.NativeSource):
   """Source for the payload of a message as bytes from a Cloud Pub/Sub topic.
 
   Attributes:
-topic: Cloud Pub/Sub topic in the form "/topics//".
-subscription: Optional existing Cloud Pub/Sub subscription to use in the
-  form "projects//subscriptions/".
+topic: Cloud Pub/Sub topic in the form "/topics//". If
+  provided then subscription must be None.
+subscription: Existing Cloud Pub/Sub subscription to use in the
+  form "projects//subscriptions/". If provided then
+  topic must be None.
 id_label: The attribute on incoming Pub/Sub messages to use as a unique
  record identifier.  When specified, the value of this attribute (which can
   be any string that uniquely identifies the record) will be used for
@@ -101,7 +111,10 @@ class _PubSubPayloadSource(dataflow_io.NativeSource):
   case, deduplication of the stream will be strictly best effort.
   """
 
-  def __init__(self, topic, subscription=None, id_label=None):
+  def __init__(self, topic=None, subscription=None, id_label=None):
+# This coder is set explicitly to keep PubsubIO portable
+# across language implementations.
+self.coder = coders.BytesCoder()
 self.topic = topic
 self.subscription = subscription
 self.id_label = id_label
@@ -131,6 +144,9 @@ class _PubSubPayloadSink(dataflow_io.NativeSink):
   """Sink for the payload of a message as bytes to a Cloud Pub/Sub topic."""
 
   def __init__(self, topic):
+# This coder is set explicitly to keep PubsubIO portable
+# across language implementations.
+self.coder = coders.BytesCoder()
 self.topic = topic
 
   @property

http://git-wip-us.apache.org/repos/asf/beam/blob/b5852d21/sdks/python/apache_beam/io/gcp/pubsub_test.py
--
diff --git a/sdks/python/apache_beam/io/gcp/pubsub_test.py b/sdks/python/apache_beam/io/gcp/pubsub_test.py
index 322d08a..cf14e8c 10

[GitHub] beam pull request #3311: Sql rebase

2017-06-07 Thread jbonofre
Github user jbonofre closed the pull request at:

https://github.com/apache/beam/pull/3311


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Build failed in Jenkins: beam_PostCommit_Python_Verify #2438

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[lcwik] Make BytesCoder to be a known type

--
[...truncated 501.02 KB...]
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.1.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.1.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
test_read_from_text_metrics 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:135:
 UserWarning: Using fallback coder for typehint: Union[TaggedOutput, int].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-36.0.1.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.10.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.0.1.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:135:
 UserWarning: Using fallback coder for typehint: List[Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:135:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
:135:
 UserWarning: Using fallback coder for typehint: Union[Tuple[str, None], 
Tuple[str, int]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:135:
 UserWarning: Using fallback coder for typehint: List[Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
:135:
 UserWarning: Using fallback coder for typehint: Union[].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
DEPRECATION: pip install --download has been deprecated and will be removed in 
the future. Pip now has a download command that should be used instead.
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-36.0.1.zip
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting six (from pyhamcrest->-r postcommit_requirements.

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #2304

2017-06-07 Thread Apache Jenkins Server
See 




[2/2] beam git commit: [BEAM-1226] Make BytesCoder to be a known type

2017-06-07 Thread lcwik
[BEAM-1226] Make BytesCoder to be a known type

This closes #3316


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/0a0a1bc7
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/0a0a1bc7
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/0a0a1bc7

Branch: refs/heads/master
Commit: 0a0a1bc7493ad71f1a70b1c9921ba09ecfe4c2e8
Parents: 66460cb d94ac58
Author: Luke Cwik 
Authored: Wed Jun 7 20:08:49 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 20:08:49 2017 -0700

--
 sdks/python/apache_beam/coders/coders.py  | 5 +
 sdks/python/apache_beam/runners/worker/operation_specs.py | 4 
 2 files changed, 9 insertions(+)
--




[GitHub] beam pull request #3316: Make BytesCoder to be a known type

2017-06-07 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3316




[1/2] beam git commit: Make BytesCoder to be a known type

2017-06-07 Thread lcwik
Repository: beam
Updated Branches:
  refs/heads/master 66460cb2d -> 0a0a1bc74


Make BytesCoder to be a known type


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/d94ac58e
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/d94ac58e
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/d94ac58e

Branch: refs/heads/master
Commit: d94ac58ea2d12f55743e8ad27a02bdb83c194da7
Parents: 66460cb
Author: Vikas Kedigehalli 
Authored: Wed Jun 7 16:26:21 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 20:05:40 2017 -0700

--
 sdks/python/apache_beam/coders/coders.py  | 5 +
 sdks/python/apache_beam/runners/worker/operation_specs.py | 4 
 2 files changed, 9 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/d94ac58e/sdks/python/apache_beam/coders/coders.py
--
diff --git a/sdks/python/apache_beam/coders/coders.py b/sdks/python/apache_beam/coders/coders.py
index f40045d..f3e0b43 100644
--- a/sdks/python/apache_beam/coders/coders.py
+++ b/sdks/python/apache_beam/coders/coders.py
@@ -286,6 +286,11 @@ class BytesCoder(FastCoder):
   def is_deterministic(self):
 return True
 
+  def as_cloud_object(self):
+return {
+'@type': 'kind:bytes',
+}
+
   def __eq__(self, other):
 return type(self) == type(other)
 

http://git-wip-us.apache.org/repos/asf/beam/blob/d94ac58e/sdks/python/apache_beam/runners/worker/operation_specs.py
--
diff --git a/sdks/python/apache_beam/runners/worker/operation_specs.py b/sdks/python/apache_beam/runners/worker/operation_specs.py
index db5eb76..b8d19a1 100644
--- a/sdks/python/apache_beam/runners/worker/operation_specs.py
+++ b/sdks/python/apache_beam/runners/worker/operation_specs.py
@@ -339,6 +339,10 @@ def get_coder_from_spec(coder_spec):
 assert len(coder_spec['component_encodings']) == 1
 return coders.coders.LengthPrefixCoder(
 get_coder_from_spec(coder_spec['component_encodings'][0]))
+  elif coder_spec['@type'] == 'kind:bytes':
+assert ('component_encodings' not in coder_spec
+or len(coder_spec['component_encodings']) == 0)
+return coders.BytesCoder()
 
   # We pass coders in the form "$" to make the job
   # description JSON more readable.
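The `kind:bytes` branch above dispatches on the `'@type'` field of the cloud-object spec, pairing with the `as_cloud_object` method added to `BytesCoder`. A self-contained sketch of that dispatch (class and function names are hypothetical stand-ins, not Beam's API):

```python
class BytesCoderSketch(object):
    """Minimal stand-in for a coder whose cloud spec is 'kind:bytes'."""

    def as_cloud_object(self):
        # Matches the {'@type': 'kind:bytes'} spec shape used above.
        return {'@type': 'kind:bytes'}

    def encode(self, value):
        return value  # raw bytes pass through unchanged


def coder_from_spec(coder_spec):
    """Dispatch on the '@type' field, in the spirit of get_coder_from_spec."""
    if coder_spec['@type'] == 'kind:bytes':
        # A bytes coder carries no component coders; note the
        # parenthesization: len(...) == 0, not len(... == 0).
        assert ('component_encodings' not in coder_spec
                or len(coder_spec['component_encodings']) == 0)
        return BytesCoderSketch()
    raise ValueError('Unknown coder spec: %r' % coder_spec)
```

Because the spec is plain JSON-like data, a coder can round-trip through its cloud object: `coder_from_spec(coder.as_cloud_object())` yields an equivalent coder, which is what makes `kind:bytes` a "known type" across SDKs.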



[GitHub] beam pull request #3323: upgrade to 2.1.0-SNAPSHOT

2017-06-07 Thread XuMingmin
GitHub user XuMingmin opened a pull request:

https://github.com/apache/beam/pull/3323

upgrade to 2.1.0-SNAPSHOT

upgrade to version 2.1.0-SNAPSHOT in branch `DSL_SQL`.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/XuMingmin/beam upgrade_to_v2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3323.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3323


commit 03a913a95c99474841a175b727925ba7c1eed4c9
Author: mingmxu 
Date:   2017-06-08T02:27:32Z

upgrade to version 2.1.0-SNAPSHOT






Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #2303

2017-06-07 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #3321: Remove support for NativeSinks from the Python Dire...

2017-06-07 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3321




[2/2] beam git commit: This closes #3321

2017-06-07 Thread altay
This closes #3321


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/66460cb2
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/66460cb2
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/66460cb2

Branch: refs/heads/master
Commit: 66460cb2d3db56261483c39acdea4786fec156fd
Parents: 32f22b7 f2e3088
Author: Ahmet Altay 
Authored: Wed Jun 7 18:45:42 2017 -0700
Committer: Ahmet Altay 
Committed: Wed Jun 7 18:45:42 2017 -0700

--
 .../runners/direct/transform_evaluator.py   | 62 +---
 1 file changed, 1 insertion(+), 61 deletions(-)
--




[1/2] beam git commit: Remove support for NativeSinks from the Python DirectRunner

2017-06-07 Thread altay
Repository: beam
Updated Branches:
  refs/heads/master 32f22b7d9 -> 66460cb2d


Remove support for NativeSinks from the Python DirectRunner


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/f2e30886
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/f2e30886
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/f2e30886

Branch: refs/heads/master
Commit: f2e3088633fef10f19bfd11ff9b508930916a740
Parents: 32f22b7
Author: Charles Chen 
Authored: Wed Jun 7 17:00:57 2017 -0700
Committer: Charles Chen 
Committed: Wed Jun 7 17:01:33 2017 -0700

--
 .../runners/direct/transform_evaluator.py   | 62 +---
 1 file changed, 1 insertion(+), 61 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/f2e30886/sdks/python/apache_beam/runners/direct/transform_evaluator.py
--
diff --git a/sdks/python/apache_beam/runners/direct/transform_evaluator.py b/sdks/python/apache_beam/runners/direct/transform_evaluator.py
index b1cb626..0fec8b8 100644
--- a/sdks/python/apache_beam/runners/direct/transform_evaluator.py
+++ b/sdks/python/apache_beam/runners/direct/transform_evaluator.py
@@ -29,7 +29,6 @@ from apache_beam.runners.common import DoFnRunner
 from apache_beam.runners.common import DoFnState
 from apache_beam.runners.direct.watermark_manager import WatermarkManager
 from apache_beam.runners.direct.transform_result import TransformResult
-from apache_beam.runners.dataflow.native_io.iobase import _NativeWrite  # pylint: disable=protected-access
 from apache_beam.transforms import core
 from apache_beam.transforms.window import GlobalWindows
 from apache_beam.transforms.window import WindowedValue
@@ -54,7 +53,6 @@ class TransformEvaluatorRegistry(object):
 core.Flatten: _FlattenEvaluator,
 core.ParDo: _ParDoEvaluator,
 core._GroupByKeyOnly: _GroupByKeyOnlyEvaluator,
-_NativeWrite: _NativeWriteEvaluator,
 }
 
   def for_application(
@@ -98,8 +96,7 @@ class TransformEvaluatorRegistry(object):
 Returns:
   True if executor should execute applied_ptransform serially.
 """
-return isinstance(applied_ptransform.transform,
-  (core._GroupByKeyOnly, _NativeWrite))
+return isinstance(applied_ptransform.transform, core._GroupByKeyOnly)
 
 
 class _TransformEvaluator(object):
@@ -403,60 +400,3 @@ class _GroupByKeyOnlyEvaluator(_TransformEvaluator):
 
 return TransformResult(
 self._applied_ptransform, bundles, state, None, None, hold)
-
-
-class _NativeWriteEvaluator(_TransformEvaluator):
-  """TransformEvaluator for _NativeWrite transform."""
-
-  def __init__(self, evaluation_context, applied_ptransform,
-   input_committed_bundle, side_inputs, scoped_metrics_container):
-assert not side_inputs
-super(_NativeWriteEvaluator, self).__init__(
-evaluation_context, applied_ptransform, input_committed_bundle,
-side_inputs, scoped_metrics_container)
-
-assert applied_ptransform.transform.sink
-self._sink = applied_ptransform.transform.sink
-
-  @property
-  def _is_final_bundle(self):
-return (self._execution_context.watermarks.input_watermark
-== WatermarkManager.WATERMARK_POS_INF)
-
-  @property
-  def _has_already_produced_output(self):
-return (self._execution_context.watermarks.output_watermark
-== WatermarkManager.WATERMARK_POS_INF)
-
-  def start_bundle(self):
-# state: [values]
-self.state = (self._execution_context.existing_state
-  if self._execution_context.existing_state else [])
-
-  def process_element(self, element):
-self.state.append(element)
-
-  def finish_bundle(self):
-# finish_bundle will append incoming bundles in memory until all the bundles
-# carrying data is processed. This is done to produce only a single output
-# shard (some tests depends on this behavior). It is possible to have
-# incoming empty bundles after the output is produced, these bundles will be
-# ignored and would not generate additional output files.
-# TODO(altay): Do not wait until the last bundle to write in a single shard.
-if self._is_final_bundle:
-  if self._has_already_produced_output:
-# Ignore empty bundles that arrive after the output is produced.
-assert self.state == []
-  else:
-self._sink.pipeline_options = self._evaluation_context.pipeline_options
-with self._sink.writer() as writer:
-  for v in self.state:
-writer.Write(v.value)
-  state = None
-  hold = WatermarkManager.WATERMARK_POS_INF
-else:
-  state = self.state
-  hold = WatermarkManager.WATERMARK_NEG_INF
-
-return TransformResult(
-self._applied_ptransform, [], s

[50/50] beam git commit: This closes #3314

2017-06-07 Thread davor
This closes #3314


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/4c5b7584
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/4c5b7584
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/4c5b7584

Branch: refs/heads/DSL_SQL
Commit: 4c5b7584afac5dff2abf5c83bccd066164db3c5e
Parents: fcc80ce 32f22b7
Author: Davor Bonaci 
Authored: Wed Jun 7 18:33:58 2017 -0700
Committer: Davor Bonaci 
Committed: Wed Jun 7 18:33:58 2017 -0700

--
 .gitignore  |2 +
 .../jenkins/common_job_properties.groovy|   35 +-
 .../job_beam_PerformanceTests_Dataflow.groovy   |3 +
 ..._beam_PostCommit_Java_JDKVersionsTest.groovy |   60 +
 ...job_beam_PostCommit_Java_MavenInstall.groovy |2 +-
 ..._PostCommit_Java_MavenInstall_Windows.groovy |   45 +
 ...tCommit_Java_ValidatesRunner_Dataflow.groovy |2 +-
 .../job_beam_PreCommit_Java_MavenInstall.groovy |5 +-
 .../job_beam_Release_NightlySnapshot.groovy |2 +-
 .../cassandra/LargeITCluster/setup.sh   |   21 +
 .../cassandra/LargeITCluster/start-up.sh|   22 -
 .../cassandra/LargeITCluster/teardown.sh|1 -
 .../cassandra/SmallITCluster/setup.sh   |   22 +
 .../cassandra/SmallITCluster/start-up.sh|   23 -
 .../cassandra/SmallITCluster/teardown.sh|1 -
 .../LargeProductionCluster/setup.sh |   21 +
 .../LargeProductionCluster/start-up.sh  |   22 -
 .../LargeProductionCluster/teardown.sh  |1 -
 .../elasticsearch/SmallITCluster/setup.sh   |   22 +
 .../elasticsearch/SmallITCluster/start-up.sh|   23 -
 .../elasticsearch/SmallITCluster/teardown.sh|1 -
 README.md   |4 +-
 examples/java/README.md |   64 +-
 examples/java/pom.xml   |   53 +-
 .../apache/beam/examples/WindowedWordCount.java |7 +-
 .../beam/examples/common/ExampleUtils.java  |6 +-
 .../examples/common/WriteOneFilePerWindow.java  |   24 +-
 .../org/apache/beam/examples/complete/README.md |2 +-
 .../apache/beam/examples/complete/TfIdf.java|2 +-
 .../org/apache/beam/examples/cookbook/README.md |2 +-
 .../beam/examples/WindowedWordCountIT.java  |   26 +-
 .../beam/examples/complete/TfIdfTest.java   |2 +-
 examples/java8/pom.xml  |  124 +-
 .../beam/examples/complete/game/GameStats.java  |   15 +-
 .../examples/complete/game/HourlyTeamScore.java |   58 +-
 .../examples/complete/game/LeaderBoard.java |   62 +-
 .../beam/examples/complete/game/README.md   |  131 -
 .../beam/examples/complete/game/UserScore.java  |   74 +-
 .../complete/game/utils/WriteToBigQuery.java|   32 +-
 .../complete/game/utils/WriteToText.java|  184 ++
 .../game/utils/WriteWindowedToBigQuery.java |9 +-
 examples/pom.xml|2 +-
 pom.xml |  173 +-
 runners/apex/README.md  |   76 -
 runners/apex/pom.xml|   15 +-
 .../beam/runners/apex/ApexPipelineOptions.java  |5 +
 .../apache/beam/runners/apex/ApexRunner.java|  173 +-
 .../beam/runners/apex/ApexRunnerRegistrar.java  |3 +-
 .../beam/runners/apex/ApexRunnerResult.java |2 -
 .../beam/runners/apex/ApexYarnLauncher.java |2 -
 .../beam/runners/apex/TestApexRunner.java   |   10 +-
 .../translation/ApexPipelineTranslator.java |   28 +-
 .../apex/translation/ParDoTranslator.java   |   67 +-
 .../translation/ReadUnboundedTranslator.java|1 -
 .../apex/translation/TransformTranslator.java   |2 -
 .../apex/translation/TranslationContext.java|  102 +-
 .../operators/ApexGroupByKeyOperator.java   |   14 +-
 .../operators/ApexParDoOperator.java|  130 +-
 .../operators/ApexProcessFnOperator.java|8 +-
 .../ApexReadUnboundedInputOperator.java |2 -
 .../operators/ApexTimerInternals.java   |   23 +-
 .../translation/utils/ApexStateInternals.java   |   24 +-
 .../apex/translation/utils/ApexStreamTuple.java |   16 +-
 .../utils/CoderAdapterStreamCodec.java  |2 -
 .../apex/translation/utils/NoOpStepContext.java |   37 +-
 .../utils/SerializablePipelineOptions.java  |   19 +-
 .../translation/utils/StateInternalsProxy.java  |   11 +-
 .../utils/ValueAndCoderKryoSerializable.java|2 -
 .../apex/translation/utils/ValuesSource.java|2 -
 .../beam/runners/apex/ApexRunnerTest.java   |   49 +-
 .../beam/runners/apex/ApexYarnLauncherTest.java |2 -
 .../apex/examples/UnboundedTextSource.java  |2 -
 .../runners/apex/examples/WordCountTest.java|2 -
 .../translation/ApexGroupByKeyOperatorTest.java |4 +-
 .../FlattenPCollectionTranslatorTest.java   |   13 +-
 .../apex/translation/ParDoT

[21/50] beam git commit: This closes #3305

2017-06-07 Thread davor
This closes #3305


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/fa3922b5
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/fa3922b5
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/fa3922b5

Branch: refs/heads/DSL_SQL
Commit: fa3922b57b86beb163efb892d1f3c699402d684a
Parents: e3139a3 5113950
Author: Ahmet Altay 
Authored: Tue Jun 6 13:57:01 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 13:57:01 2017 -0700

--
 sdks/python/apache_beam/options/pipeline_options.py  | 5 +++--
 .../apache_beam/options/pipeline_options_validator_test.py   | 8 
 sdks/python/apache_beam/runners/dataflow/dataflow_runner.py  | 2 +-
 .../apache_beam/runners/dataflow/internal/apiclient.py   | 2 +-
 4 files changed, 5 insertions(+), 12 deletions(-)
--




[49/50] beam git commit: This closes #3294

2017-06-07 Thread davor
This closes #3294


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/32f22b7d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/32f22b7d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/32f22b7d

Branch: refs/heads/DSL_SQL
Commit: 32f22b7d9cfd5dcd22555272c5a7365fd1323e5f
Parents: caecac3 9c83ffe
Author: Ismaël Mejía 
Authored: Wed Jun 7 23:14:02 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed Jun 7 23:14:02 2017 +0200

--
 .../wrappers/streaming/io/UnboundedSourceWrapper.java   | 5 +
 1 file changed, 5 insertions(+)
--




[25/50] beam git commit: This closes #3304

2017-06-07 Thread davor
This closes #3304


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/6853d8ef
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/6853d8ef
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/6853d8ef

Branch: refs/heads/DSL_SQL
Commit: 6853d8ef6532f0897ad075bda2e47c38a7aa6214
Parents: fdfd775 dcfb31f
Author: Ahmet Altay 
Authored: Tue Jun 6 14:13:40 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 14:13:40 2017 -0700

--
 sdks/python/tox.ini | 2 ++
 1 file changed, 2 insertions(+)
--




[14/50] beam git commit: This closes #3279

2017-06-07 Thread davor
This closes #3279


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/7c608c32
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/7c608c32
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/7c608c32

Branch: refs/heads/DSL_SQL
Commit: 7c608c32a871124e2dcb8533fee2d354229283e9
Parents: a054550 ae3dc5f
Author: Ahmet Altay 
Authored: Tue Jun 6 12:58:31 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 12:58:31 2017 -0700

--
 sdks/python/setup.py | 20 
 1 file changed, 12 insertions(+), 8 deletions(-)
--




[32/50] beam git commit: This closes #3300

2017-06-07 Thread davor
This closes #3300


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1d2000d8
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1d2000d8
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1d2000d8

Branch: refs/heads/DSL_SQL
Commit: 1d2000d8c570ce2a2cdb63b5a208201526c394d8
Parents: 6fed177 171a993
Author: Ahmet Altay 
Authored: Tue Jun 6 16:44:03 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 16:44:03 2017 -0700

--
 .../runners/dataflow/internal/dependency.py   |  7 ++-
 .../runners/dataflow/internal/dependency_test.py  | 14 +-
 2 files changed, 19 insertions(+), 2 deletions(-)
--




[23/50] beam git commit: This closes #3288

2017-06-07 Thread davor
This closes #3288


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/fdfd7751
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/fdfd7751
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/fdfd7751

Branch: refs/heads/DSL_SQL
Commit: fdfd775101f0e24f7cb3dce1894dafb10b39fb2b
Parents: fa3922b 1498684
Author: chamik...@google.com 
Authored: Tue Jun 6 14:05:03 2017 -0700
Committer: chamik...@google.com 
Committed: Tue Jun 6 14:05:03 2017 -0700

--
 sdks/python/apache_beam/io/gcp/bigquery.py  | 177 +++
 sdks/python/apache_beam/io/gcp/bigquery_test.py | 172 ++
 2 files changed, 349 insertions(+)
--




[36/50] beam git commit: [BEAM-245] Add CassandraIO

2017-06-07 Thread davor
[BEAM-245] Add CassandraIO


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/0b0bb3dc
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/0b0bb3dc
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/0b0bb3dc

Branch: refs/heads/DSL_SQL
Commit: 0b0bb3dc8d18b7d78780dbd39705e16a8aae028e
Parents: 3cc4ff6
Author: Jean-Baptiste Onofré 
Authored: Tue Mar 28 16:46:37 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Wed Jun 7 07:40:05 2017 +0200

--
 sdks/java/io/cassandra/pom.xml  | 113 
 .../beam/sdk/io/cassandra/CassandraIO.java  | 510 +++
 .../beam/sdk/io/cassandra/CassandraService.java |  66 +++
 .../sdk/io/cassandra/CassandraServiceImpl.java  | 398 +++
 .../beam/sdk/io/cassandra/package-info.java |  22 +
 .../beam/sdk/io/cassandra/CassandraIOIT.java| 254 +
 .../beam/sdk/io/cassandra/CassandraIOTest.java  | 279 ++
 .../io/cassandra/CassandraServiceImplTest.java  | 138 +
 .../sdk/io/cassandra/CassandraTestDataSet.java  | 153 ++
 .../sdk/io/common/IOTestPipelineOptions.java|  10 +
 sdks/java/io/pom.xml|   1 +
 11 files changed, 1944 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/0b0bb3dc/sdks/java/io/cassandra/pom.xml
--
diff --git a/sdks/java/io/cassandra/pom.xml b/sdks/java/io/cassandra/pom.xml
new file mode 100644
index 000..8249f57
--- /dev/null
+++ b/sdks/java/io/cassandra/pom.xml
@@ -0,0 +1,113 @@
+
+
+http://maven.apache.org/POM/4.0.0"; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+
+  4.0.0
+
+  
+org.apache.beam
+beam-sdks-java-io-parent
+2.1.0-SNAPSHOT
+../pom.xml
+  
+
+  beam-sdks-java-io-cassandra
+  Apache Beam :: SDKs :: Java :: IO :: Cassandra
+  IO to read and write with Apache Cassandra database
+
+  
+3.2.0
+  
+
+  
+
+  org.apache.beam
+  beam-sdks-java-core
+
+
+  org.slf4j
+  slf4j-api
+
+
+  com.google.guava
+  guava
+
+
+
+  com.google.code.findbugs
+  jsr305
+
+
+
+  com.datastax.cassandra
+  cassandra-driver-mapping
+  ${cassandra.driver.version}
+
+
+  com.datastax.cassandra
+  cassandra-driver-core
+  ${cassandra.driver.version}
+
+
+
+
+  com.google.auto.value
+  auto-value
+  provided
+
+
+
+
+  junit
+  junit
+  test
+
+
+  org.hamcrest
+  hamcrest-core
+  test
+
+
+  org.hamcrest
+  hamcrest-all
+  test
+
+
+  org.slf4j
+  slf4j-jdk14
+  test
+
+
+  org.apache.beam
+  beam-runners-direct-java
+  test
+
+
+  org.mockito
+  mockito-all
+  test
+
+
+  org.apache.beam
+  beam-sdks-java-io-common
+  test
+  tests
+
+  
+
+
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/beam/blob/0b0bb3dc/sdks/java/io/cassandra/src/main/java/org/apache/beam/sdk/io/cassandra/CassandraIO.java
--
diff --git a/sdks/java/io/cassandra/src/main/java/org/apache/beam/sdk/io/cassandra/CassandraIO.java b/sdks/java/io/cassandra/src/main/java/org/apache/beam/sdk/io/cassandra/CassandraIO.java
new file mode 100644
index 000..b6f4ef6
--- /dev/null
+++ b/sdks/java/io/cassandra/src/main/java/org/apache/beam/sdk/io/cassandra/CassandraIO.java
@@ -0,0 +1,510 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.cassandra;
+
+import static com.google.common.base.Preconditions.checkArgument;
+import static com.google.common.base.Preconditions.checkState;
+
+import com.google.auto.value.AutoValue;
+import com.google.common.annotations.VisibleForTesting;
+
+import java.util.List;
+
+import javax.annotation.Nullable;
+
+i

[40/50] beam git commit: [BEAM-1231] Add missed "kind:bytes" to CloudObjectKinds/CloudObjectTranslators

2017-06-07 Thread davor
[BEAM-1231] Add missed "kind:bytes" to CloudObjectKinds/CloudObjectTranslators

This closes #3310


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/b2de3db8
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/b2de3db8
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/b2de3db8

Branch: refs/heads/DSL_SQL
Commit: b2de3db899a6e1bfc48dc31d76761b3cf6089c2a
Parents: 78b6e3c d302a0f
Author: Luke Cwik 
Authored: Wed Jun 7 09:01:35 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 09:01:35 2017 -0700

--
 .../org/apache/beam/runners/dataflow/util/CloudObjectKinds.java | 1 +
 .../beam/runners/dataflow/util/CloudObjectTranslators.java  | 5 +++--
 2 files changed, 4 insertions(+), 2 deletions(-)
--




[45/50] beam git commit: This closes #2286

2017-06-07 Thread davor
This closes #2286


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/5f7e73bb
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/5f7e73bb
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/5f7e73bb

Branch: refs/heads/DSL_SQL
Commit: 5f7e73bbacf7096eed44002a54910a560b195801
Parents: b2de3db 8f4fa43
Author: Aljoscha Krettek 
Authored: Wed Jun 7 19:43:19 2017 +0200
Committer: Luke Cwik 
Committed: Wed Jun 7 13:41:20 2017 -0700

--
 .../translation/types/CoderTypeSerializer.java  |  41 ++-
 .../streaming/io/UnboundedSourceWrapper.java|   2 +
 .../flink/streaming/TestCountingSource.java |  48 ++-
 .../streaming/UnboundedSourceWrapperTest.java   | 309 +++
 .../beam/runners/dataflow/DataflowRunner.java   |  24 +-
 5 files changed, 269 insertions(+), 155 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/5f7e73bb/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
--
diff --cc 
runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
index cce6ce7,cce6ce7..ed29330
--- 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
+++ 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
@@@ -428,12 -428,12 +428,15 @@@ public class DataflowRunner extends Pip
  public PTransformReplacement> 
getReplacementTransform(
  AppliedPTransform, PTransform>> transform) {
PTransform> original = transform.getTransform();
++  PCollection output =
++  (PCollection) 
Iterables.getOnlyElement(transform.getOutputs().values());
return PTransformReplacement.of(
transform.getPipeline().begin(),
InstanceBuilder.ofType(replacement)
.withArg(DataflowRunner.class, runner)
.withArg(
(Class>>) 
original.getClass(), original)
++  .withArg((Class>) output.getClass(), 
output)
.build());
  }
  
@@@ -809,11 -809,11 +812,12 @@@
extends PTransform> {
  private final PubsubUnboundedSource transform;
  
--/**
-- * Builds an instance of this class from the overridden transform.
-- */
++/** Builds an instance of this class from the overridden transform. */
++@SuppressWarnings("unused") // used via reflection in 
DataflowRunner#apply()
  public StreamingPubsubIORead(
--DataflowRunner runner, PubsubUnboundedSource transform) {
++DataflowRunner runner,
++PubsubUnboundedSource transform,
++PCollection originalOutput) {
this.transform = transform;
  }
  
@@@ -992,11 -992,11 +996,11 @@@
private static class StreamingUnboundedRead extends PTransform> {
  private final UnboundedSource source;
  
--/**
-- * Builds an instance of this class from the overridden transform.
-- */
++/** Builds an instance of this class from the overridden transform. */
  @SuppressWarnings("unused") // used via reflection in 
DataflowRunner#apply()
--public StreamingUnboundedRead(DataflowRunner runner, Read.Unbounded 
transform) {
++public StreamingUnboundedRead(DataflowRunner runner,
++Read.Unbounded transform,
++PCollection originalOutput) {
this.source = transform.getSource();
  }
  
@@@ -,7 -,7 +1115,9 @@@
  
  /** Builds an instance of this class from the overridden transform. */
  @SuppressWarnings("unused") // used via reflection in 
DataflowRunner#apply()
--public StreamingBoundedRead(DataflowRunner runner, Read.Bounded 
transform) {
++public StreamingBoundedRead(DataflowRunner runner,
++Read.Bounded transform,
++PCollection originalOutput) {
this.source = transform.getSource();
  }
  



[20/50] beam git commit: soft-enable the use of streaming flag

2017-06-07 Thread davor
soft-enable the use of streaming flag


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/51139509
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/51139509
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/51139509

Branch: refs/heads/DSL_SQL
Commit: 51139509b65b5fa04a39c31f584a02f1a29170dc
Parents: e3139a3
Author: Ahmet Altay 
Authored: Tue Jun 6 13:34:09 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 13:56:57 2017 -0700

--
 sdks/python/apache_beam/options/pipeline_options.py  | 5 +++--
 .../apache_beam/options/pipeline_options_validator_test.py   | 8 
 sdks/python/apache_beam/runners/dataflow/dataflow_runner.py  | 2 +-
 .../apache_beam/runners/dataflow/internal/apiclient.py   | 2 +-
 4 files changed, 5 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/51139509/sdks/python/apache_beam/options/pipeline_options.py
--
diff --git a/sdks/python/apache_beam/options/pipeline_options.py 
b/sdks/python/apache_beam/options/pipeline_options.py
index 777926a..daef3a7 100644
--- a/sdks/python/apache_beam/options/pipeline_options.py
+++ b/sdks/python/apache_beam/options/pipeline_options.py
@@ -18,6 +18,7 @@
 """Pipeline options obtained from command line parsing."""
 
 import argparse
+import warnings
 
 from apache_beam.transforms.display import HasDisplayData
 from apache_beam.options.value_provider import StaticValueProvider
@@ -278,12 +279,12 @@ class StandardOptions(PipelineOptions):
 action='store_true',
 help='Whether to enable streaming mode.')
 
-  # TODO(BEAM-1265): Remove this error, once at least one runner supports
+  # TODO(BEAM-1265): Remove this warning, once at least one runner supports
   # streaming pipelines.
   def validate(self, validator):
 errors = []
 if self.view_as(StandardOptions).streaming:
-  errors.append('Streaming pipelines are not supported.')
+  warnings.warn('Streaming pipelines are not supported.')
 return errors
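The "soft-enable" pattern in the hunk above (downgrading a hard validation error to a warning so the flag no longer blocks submission) can be sketched independently of Beam. `validate_streaming` below is an illustrative name, not part of the SDK:

```python
import warnings

def validate_streaming(streaming_enabled):
    # Mirror the change above: emit a warning instead of appending a
    # validation error, so --streaming no longer blocks job submission.
    errors = []
    if streaming_enabled:
        warnings.warn('Streaming pipelines are not supported.')
    return errors

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    assert validate_streaming(True) == []  # no hard error any more
    assert len(caught) == 1
    assert 'not supported' in str(caught[0].message)
```

This is why the corresponding `test_streaming` validator test is deleted below: validation now returns an empty error list even when streaming is requested.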
 
 

http://git-wip-us.apache.org/repos/asf/beam/blob/51139509/sdks/python/apache_beam/options/pipeline_options_validator_test.py
--
diff --git a/sdks/python/apache_beam/options/pipeline_options_validator_test.py 
b/sdks/python/apache_beam/options/pipeline_options_validator_test.py
index 28fcbe3..97834cc 100644
--- a/sdks/python/apache_beam/options/pipeline_options_validator_test.py
+++ b/sdks/python/apache_beam/options/pipeline_options_validator_test.py
@@ -300,14 +300,6 @@ class SetupTest(unittest.TestCase):
 errors = validator.validate()
 self.assertFalse(errors)
 
-  def test_streaming(self):
-pipeline_options = PipelineOptions(['--streaming'])
-runner = MockRunners.TestDataflowRunner()
-validator = PipelineOptionsValidator(pipeline_options, runner)
-errors = validator.validate()
-
-self.assertIn('Streaming pipelines are not supported.', errors)
-
   def test_test_matcher(self):
 def get_validator(matcher):
   options = ['--project=example:example',

http://git-wip-us.apache.org/repos/asf/beam/blob/51139509/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
--
diff --git a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py 
b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
index 3e0e268..62cea33 100644
--- a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
+++ b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
@@ -64,7 +64,7 @@ class DataflowRunner(PipelineRunner):
   # a job submission and is used by the service to establish what features
   # are expected by the workers.
   BATCH_ENVIRONMENT_MAJOR_VERSION = '6'
-  STREAMING_ENVIRONMENT_MAJOR_VERSION = '0'
+  STREAMING_ENVIRONMENT_MAJOR_VERSION = '1'
 
   def __init__(self, cache=None):
 # Cache of CloudWorkflowStep protos generated while the runner

http://git-wip-us.apache.org/repos/asf/beam/blob/51139509/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
--
diff --git a/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py 
b/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
index bfdd5e4..df1a3f2 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
@@ -145,7 +145,7 @@ class Environment(object):
 # Version information.
 self.proto.version = dataflow.Environment.VersionValue()
 if self.standard_options.streaming:
-  job_type = 'PYTHON_STREAMING'
+  job_type = 'FNAPI_STREAMING

[26/50] beam git commit: Increase visibility of some Metrics methods

2017-06-07 Thread davor
Increase visibility of some Metrics methods

Also revise the Javadoc on MetricsContainers.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/39674ca8
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/39674ca8
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/39674ca8

Branch: refs/heads/DSL_SQL
Commit: 39674ca8d0ae1d82bfb5da6a81f26843613d3cd7
Parents: 6853d8e
Author: bchambers 
Authored: Tue Jun 6 15:08:46 2017 -0700
Committer: bchambers 
Committed: Tue Jun 6 15:08:55 2017 -0700

--
 .../apache/beam/runners/core/metrics/CounterCell.java| 10 +++---
 .../org/apache/beam/runners/core/metrics/DirtyState.java |  4 +++-
 .../beam/runners/core/metrics/DistributionCell.java  | 10 +++---
 .../org/apache/beam/runners/core/metrics/GaugeCell.java  | 11 +++
 .../beam/runners/core/metrics/MetricsContainerImpl.java  |  4 +++-
 .../org/apache/beam/sdk/metrics/MetricsContainer.java|  3 ++-
 6 files changed, 29 insertions(+), 13 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/39674ca8/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/CounterCell.java
--
diff --git 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/CounterCell.java
 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/CounterCell.java
index 4378bb9..886d681 100644
--- 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/CounterCell.java
+++ 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/CounterCell.java
@@ -21,8 +21,10 @@ package org.apache.beam.runners.core.metrics;
 import java.util.concurrent.atomic.AtomicLong;
 import org.apache.beam.sdk.annotations.Experimental;
 import org.apache.beam.sdk.annotations.Experimental.Kind;
+import org.apache.beam.sdk.annotations.Internal;
 import org.apache.beam.sdk.metrics.Counter;
 import org.apache.beam.sdk.metrics.MetricName;
+import org.apache.beam.sdk.metrics.MetricsContainer;
 
 /**
  * Tracks the current value (and delta) for a Counter metric for a specific 
context and bundle.
@@ -40,10 +42,12 @@ public class CounterCell implements Counter, 
MetricCell {
   private final MetricName name;
 
   /**
-   * Package-visibility because all {@link CounterCell CounterCells} should be 
created by
-   * {@link MetricsContainerImpl#getCounter(MetricName)}.
+   * Generally, runners should construct instances using the methods in
+   * {@link MetricsContainerImpl}, unless they need to define their own 
version of
+   * {@link MetricsContainer}. These constructors are *only* public so runners 
can instantiate.
*/
-  CounterCell(MetricName name) {
+  @Internal
+  public CounterCell(MetricName name) {
 this.name = name;
   }
 

http://git-wip-us.apache.org/repos/asf/beam/blob/39674ca8/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DirtyState.java
--
diff --git 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DirtyState.java
 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DirtyState.java
index 532fc2a..1976049 100644
--- 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DirtyState.java
+++ 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DirtyState.java
@@ -22,6 +22,7 @@ import java.io.Serializable;
 import java.util.concurrent.atomic.AtomicReference;
 import org.apache.beam.sdk.annotations.Experimental;
 import org.apache.beam.sdk.annotations.Experimental.Kind;
+import org.apache.beam.sdk.annotations.Internal;
 
 /**
  * Atomically tracks the dirty-state of a metric.
@@ -42,7 +43,8 @@ import org.apache.beam.sdk.annotations.Experimental.Kind;
  * completed.
  */
 @Experimental(Kind.METRICS)
-class DirtyState implements Serializable {
+@Internal
+public class DirtyState implements Serializable {
   private enum State {
 /** Indicates that there have been changes to the MetricCell since last 
commit. */
 DIRTY,
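The dirty/committing/clean cycle that `DirtyState` tracks can be modeled as a small state machine. This is a simplified, illustrative sketch, not Beam's implementation (which uses an `AtomicReference` for lock-free transitions):

```python
import threading

class DirtyStateSketch:
    """Tracks whether a metric cell has changes that still need committing."""
    CLEAN, DIRTY, COMMITTING = range(3)

    def __init__(self):
        self._state = self.CLEAN
        self._lock = threading.Lock()

    def after_modification(self):
        # Any update to the cell marks it dirty again.
        with self._lock:
            self._state = self.DIRTY

    def before_commit(self):
        with self._lock:
            if self._state == self.DIRTY:
                self._state = self.COMMITTING

    def after_commit(self):
        # Only transition to CLEAN if no modification raced in
        # while the commit was in flight.
        with self._lock:
            if self._state == self.COMMITTING:
                self._state = self.CLEAN

    def needs_commit(self):
        return self._state != self.CLEAN

s = DirtyStateSketch()
s.after_modification()
s.before_commit()
s.after_modification()   # racing update during the commit
s.after_commit()
assert s.needs_commit()  # still dirty: the racing update must be re-committed
```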

http://git-wip-us.apache.org/repos/asf/beam/blob/39674ca8/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DistributionCell.java
--
diff --git 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DistributionCell.java
 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DistributionCell.java
index 5a5099a..8713ec4 100644
--- 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DistributionCell.java
+++ 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/metrics/DistributionCell.java
@@ -21,8 +21,10 @@ package org.apache.beam.runners.core.metrics;
 imp

[01/50] beam git commit: [BEAM-1347] Add additional logging

2017-06-07 Thread davor
Repository: beam
Updated Branches:
  refs/heads/DSL_SQL fcc80ce84 -> 4c5b7584a


[BEAM-1347] Add additional logging


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/7905def3
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/7905def3
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/7905def3

Branch: refs/heads/DSL_SQL
Commit: 7905def326a81e1830a0cbb3bcce0b304a2f9878
Parents: bf2d300
Author: Luke Cwik 
Authored: Mon Jun 5 15:03:50 2017 -0700
Committer: Luke Cwik 
Committed: Mon Jun 5 15:03:50 2017 -0700

--
 .../org/apache/beam/fn/harness/control/BeamFnControlClient.java | 3 ++-
 .../org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java   | 5 -
 .../apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java  | 3 ++-
 3 files changed, 8 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/7905def3/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/BeamFnControlClient.java
--
diff --git 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/BeamFnControlClient.java
 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/BeamFnControlClient.java
index e40bb2f..1c4d277 100644
--- 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/BeamFnControlClient.java
+++ 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/control/BeamFnControlClient.java
@@ -89,7 +89,7 @@ public class BeamFnControlClient {
   private class InboundObserver implements 
StreamObserver {
 @Override
 public void onNext(BeamFnApi.InstructionRequest value) {
-  LOG.info("InstructionRequest received {}", value);
+  LOG.debug("Received InstructionRequest {}", value);
   Uninterruptibles.putUninterruptibly(bufferedInstructions, value);
 }
 
@@ -155,6 +155,7 @@ public class BeamFnControlClient {
   }
 
   public void sendInstructionResponse(BeamFnApi.InstructionResponse value) {
+LOG.debug("Sending InstructionResponse {}", value);
 outboundObserver.onNext(value);
   }
 

http://git-wip-us.apache.org/repos/asf/beam/blob/7905def3/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
--
diff --git 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
index 4137cd7..8351626 100644
--- 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
+++ 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java
@@ -78,7 +78,7 @@ public class BeamFnDataGrpcClient implements BeamFnDataClient 
{
   KV inputLocation,
   Coder> coder,
   ThrowingConsumer> consumer) {
-LOG.debug("Registering consumer instruction {} for target {}",
+LOG.debug("Registering consumer for instruction {} and target {}",
 inputLocation.getKey(),
 inputLocation.getValue());
 
@@ -106,6 +106,9 @@ public class BeamFnDataGrpcClient implements 
BeamFnDataClient {
   Coder> coder) {
 BeamFnDataGrpcMultiplexer client = getClientFor(apiServiceDescriptor);
 
+LOG.debug("Creating output consumer for instruction {} and target {}",
+outputLocation.getKey(),
+outputLocation.getValue());
 return new BeamFnDataBufferingOutboundObserver<>(
 options, outputLocation, coder, client.getOutboundObserver());
   }

http://git-wip-us.apache.org/repos/asf/beam/blob/7905def3/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java
--
diff --git 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java
 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java
index 15e8c0d..8ee5491 100644
--- 
a/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java
+++ 
b/sdks/java/harness/src/main/java/org/apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java
@@ -104,7 +104,8 @@ public class BeamFnDataGrpcMultiplexer {
   KV.of(data.getInstructionReference(), data.getTarget());
   CompletableFuture> consumer = 
futureForKey(key);
   if (!consumer.isDone()) {
-LOG.debug("Received data for key {} without consumer ready.", key);
+LOG.debug("Received data for key {} without consumer ready. "
++ "Waiting for consumer to be registered.", key);
   }
   consumer.get().accept(data);
   if (data.getData().isEmpty()) {
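The multiplexer pattern in the hunk above — data may arrive before its consumer is registered, so delivery is parked on a future keyed by (instruction reference, target) — can be sketched as follows. Names here are illustrative, not the harness API:

```python
import concurrent.futures

consumers = {}

def future_for_key(key):
    # One future per (instruction, target); created by whichever side
    # touches the key first.
    return consumers.setdefault(key, concurrent.futures.Future())

def register_consumer(key, consumer):
    future_for_key(key).set_result(consumer)

def on_data(key, data, log):
    fut = future_for_key(key)
    if not fut.done():
        # Matches the log line above: consumer not ready yet.
        log.append('waiting for consumer %r' % (key,))
    # In the real multiplexer this result() call blocks until registration.
    fut.result().append(data)

sink, log = [], []
key = ('instruction-1', 'target-A')
fut = future_for_key(key)      # data path touches the key first
assert not fut.done()          # "Received data ... without consumer ready"
register_consumer(key, sink)
on_data(key, 'payload', log)
assert sink == ['payload'] and log == []
```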



[43/50] beam git commit: [BEAM-2408] Fix watermark emission in Flink UnboundedSourceWrapper

2017-06-07 Thread davor
[BEAM-2408] Fix watermark emission in Flink UnboundedSourceWrapper

Before, there was no call to setNextWatermarkTimer() in case the source
had multiple Readers.

This also adds a test for watermark emission to
UnboundedSourceWrapperTest.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/8f4fa439
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/8f4fa439
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/8f4fa439

Branch: refs/heads/DSL_SQL
Commit: 8f4fa4394609504348ada988948f0a1386d54c0e
Parents: c1dc8f5
Author: Aljoscha Krettek 
Authored: Mon Jun 5 15:46:26 2017 +0200
Committer: Aljoscha Krettek 
Committed: Wed Jun 7 19:43:11 2017 +0200

--
 .../streaming/io/UnboundedSourceWrapper.java|   2 +
 .../streaming/UnboundedSourceWrapperTest.java   | 111 ++-
 2 files changed, 112 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/8f4fa439/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
index ec21699..6055a43 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
@@ -283,6 +283,8 @@ public class UnboundedSourceWrapper<
 }
   }
 
+  setNextWatermarkTimer(this.runtimeContext);
+
   // a flag telling us whether any of the localReaders had data
   // if no reader had data, sleep for bit
   boolean hadData = false;
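The bug fixed above can be sketched in miniature: a periodic watermark timer must be (re-)armed on every path through the read loop, not only the single-reader path. Names below are illustrative, not Flink's or Beam's API:

```python
class WatermarkTimerSketch:
    def __init__(self):
        self.armed = False

    def set_next(self):
        self.armed = True

def run_read_loop(readers, timer, fixed=True):
    # Before the fix, the timer was armed only on the len(readers) == 1
    # path; with multiple readers no timer ever fired, so no watermarks
    # were emitted downstream.
    if fixed or len(readers) == 1:
        timer.set_next()
    return list(readers)

t = WatermarkTimerSketch()
run_read_loop(['reader-0', 'reader-1'], t)
assert t.armed  # with the fix, watermarks fire even with multiple readers
```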

http://git-wip-us.apache.org/repos/asf/beam/blob/8f4fa439/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
--
diff --git 
a/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
 
b/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
index 716e71d..bb2be60 100644
--- 
a/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
+++ 
b/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
@@ -21,11 +21,14 @@ import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertTrue;
 import static org.junit.Assert.fail;
 
+import com.google.common.base.Joiner;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collection;
 import java.util.HashSet;
 import java.util.Set;
+import java.util.concurrent.ConcurrentLinkedQueue;
+import java.util.concurrent.atomic.AtomicBoolean;
 import 
org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.options.PipelineOptions;
@@ -45,6 +48,7 @@ import 
org.apache.flink.streaming.runtime.tasks.OperatorStateHandles;
 import org.apache.flink.streaming.util.AbstractStreamOperatorTestHarness;
 import org.apache.flink.util.InstantiationUtil;
 import org.apache.flink.util.OutputTag;
+import org.joda.time.Instant;
 import org.junit.Test;
 import org.junit.experimental.runners.Enclosed;
 import org.junit.runner.RunWith;
@@ -88,7 +92,7 @@ public class UnboundedSourceWrapperTest {
 * If numSplits > numTasks, one source will manage multiple readers.
  */
 @Test
-public void testReaders() throws Exception {
+public void testValueEmission() throws Exception {
   final int numElements = 20;
   final Object checkpointLock = new Object();
   PipelineOptions options = PipelineOptionsFactory.create();
@@ -164,6 +168,111 @@ public class UnboundedSourceWrapperTest {
 }
 
 /**
+ * Creates a {@link UnboundedSourceWrapper} that has one or multiple 
readers per source.
+ * If numSplits > numTasks, one source will manage multiple readers.
+ *
+ * This test verifies that watermarks are correctly forwarded.
+ */
+@Test(timeout = 30_000)
+public void testWatermarkEmission() throws Exception {
+  final int numElements = 500;
+  final Object checkpointLock = new Object();
+  PipelineOptions options = PipelineOptionsFactory.create();
+
+  // this source will emit exactly NUM_ELEMENTS across all parallel 
readers,
+  // afterwards it will stall. We check whether we also receive 

[30/50] beam git commit: This closes #3308

2017-06-07 Thread davor
This closes #3308


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/6fed1779
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/6fed1779
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/6fed1779

Branch: refs/heads/DSL_SQL
Commit: 6fed1779dbd78dddfe69bab46571f74101329504
Parents: be26dd3 efe8e1f
Author: Ahmet Altay 
Authored: Tue Jun 6 16:12:44 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 16:12:44 2017 -0700

--
 sdks/python/apache_beam/runners/worker/sdk_worker.py | 6 --
 1 file changed, 4 insertions(+), 2 deletions(-)
--




[07/50] beam git commit: [BEAM-2276] This closes #3232

2017-06-07 Thread davor
[BEAM-2276] This closes #3232


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/88f78fa2
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/88f78fa2
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/88f78fa2

Branch: refs/heads/DSL_SQL
Commit: 88f78fa2f16df8f507127738953f2bb8ca083d1d
Parents: 6d64c6e e764167
Author: Jean-Baptiste Onofré 
Authored: Tue Jun 6 14:08:26 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Tue Jun 6 14:08:26 2017 +0200

--
 .../construction/PTransformMatchersTest.java|   8 +-
 .../direct/WriteWithShardingFactoryTest.java|  23 +++--
 .../java/org/apache/beam/sdk/io/AvroIO.java |   2 +-
 .../beam/sdk/io/DefaultFilenamePolicy.java  | 103 +--
 .../java/org/apache/beam/sdk/io/TFRecordIO.java |   2 +-
 .../java/org/apache/beam/sdk/io/TextIO.java |   2 +-
 .../java/org/apache/beam/sdk/io/AvroIOTest.java |  11 +-
 .../beam/sdk/io/DefaultFilenamePolicyTest.java  |  26 ++---
 .../java/org/apache/beam/sdk/io/TextIOTest.java |   5 +-
 .../org/apache/beam/sdk/io/xml/XmlSink.java |   2 +-
 10 files changed, 69 insertions(+), 115 deletions(-)
--




[34/50] beam git commit: [BEAM-2405] Override to sink interface in the batch dataflow BQ

2017-06-07 Thread davor
[BEAM-2405] Override to sink interface in the batch dataflow BQ


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/e641997a
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/e641997a
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/e641997a

Branch: refs/heads/DSL_SQL
Commit: e641997affc378ec0337d5ac19d8677cba0d0933
Parents: b6347d0
Author: Sourabh Bajaj 
Authored: Tue Jun 6 19:49:54 2017 -0700
Committer: chamik...@google.com 
Committed: Tue Jun 6 22:17:05 2017 -0700

--
 .../examples/cookbook/bigquery_tornadoes.py   | 11 +--
 sdks/python/apache_beam/io/gcp/bigquery.py|  2 +-
 .../runners/dataflow/dataflow_runner.py   | 18 ++
 3 files changed, 24 insertions(+), 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/e641997a/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py
--
diff --git a/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py 
b/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py
index d3b216e..1ca49c5 100644
--- a/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py
+++ b/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py
@@ -83,12 +83,11 @@ def run(argv=None):
 
 # Write the output using a "Write" transform that has side effects.
 # pylint: disable=expression-not-assigned
-counts | 'write' >> beam.io.Write(
-beam.io.BigQuerySink(
-known_args.output,
-schema='month:INTEGER, tornado_count:INTEGER',
-create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
-write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE))
+counts | 'Write' >> beam.io.WriteToBigQuery(
+known_args.output,
+schema='month:INTEGER, tornado_count:INTEGER',
+create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
+write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE)
 
 # Run the pipeline (all operations are deferred until run() is called).
 

http://git-wip-us.apache.org/repos/asf/beam/blob/e641997a/sdks/python/apache_beam/io/gcp/bigquery.py
--
diff --git a/sdks/python/apache_beam/io/gcp/bigquery.py 
b/sdks/python/apache_beam/io/gcp/bigquery.py
index 9069f73..da8be68 100644
--- a/sdks/python/apache_beam/io/gcp/bigquery.py
+++ b/sdks/python/apache_beam/io/gcp/bigquery.py
@@ -1299,7 +1299,7 @@ class WriteToBigQuery(PTransform):
 create_disposition=self.create_disposition,
 write_disposition=self.write_disposition,
 client=self.test_client)
-return pcoll | 'Write to BigQuery' >> ParDo(bigquery_write_fn)
+return pcoll | 'WriteToBigQuery' >> ParDo(bigquery_write_fn)
 
   def display_data(self):
 res = {}

http://git-wip-us.apache.org/repos/asf/beam/blob/e641997a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
--
diff --git a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py 
b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
index 62cea33..3fc8983 100644
--- a/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
+++ b/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py
@@ -27,6 +27,7 @@ import time
 import traceback
 import urllib
 
+import apache_beam as beam
 from apache_beam import error
 from apache_beam import coders
 from apache_beam import pvalue
@@ -378,6 +379,23 @@ class DataflowRunner(PipelineRunner):
   PropertyNames.ENCODING: step.encoding,
   PropertyNames.OUTPUT_NAME: PropertyNames.OUT}])
 
+  def apply_WriteToBigQuery(self, transform, pcoll):
+standard_options = pcoll.pipeline._options.view_as(StandardOptions)
+if standard_options.streaming:
+  if (transform.write_disposition ==
+  beam.io.BigQueryDisposition.WRITE_TRUNCATE):
+raise RuntimeError('Can not use write truncation mode in streaming')
+  return self.apply_PTransform(transform, pcoll)
+else:
+  return pcoll  | 'WriteToBigQuery' >> beam.io.Write(
+  beam.io.BigQuerySink(
+  transform.table_reference.tableId,
+  transform.table_reference.datasetId,
+  transform.table_reference.projectId,
+  transform.schema,
+  transform.create_disposition,
+  transform.write_disposition))
+
   def apply_GroupByKey(self, transform, pcoll):
 # Infer coder of parent.
 #
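The `apply_WriteToBigQuery` override above routes one logical transform to different implementations per mode: streaming keeps the native `PTransform` path (rejecting truncation), while batch rewrites the transform onto the legacy sink. The dispatch shape can be sketched as (names are ours, not the Dataflow runner's):

```python
class WriteSketch:
    def __init__(self, write_disposition):
        self.write_disposition = write_disposition

def apply_write(transform, streaming):
    # Streaming keeps the native transform path but rejects truncation;
    # batch rewrites the transform onto the legacy sink.
    if streaming:
        if transform.write_disposition == 'WRITE_TRUNCATE':
            raise RuntimeError('Can not use write truncation mode in streaming')
        return 'native write'
    return 'batch sink write'

assert apply_write(WriteSketch('WRITE_APPEND'), streaming=True) == 'native write'
assert apply_write(WriteSketch('WRITE_TRUNCATE'), streaming=False) == 'batch sink write'
```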



[GitHub] beam pull request #3314: sync-up DSL_SQL from Master

2017-06-07 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3314




[17/50] beam git commit: This closes #3282: [BEAM-3271] Improve Splittable ParDo translation

2017-06-07 Thread davor
This closes #3282: [BEAM-3271] Improve Splittable ParDo translation

  Improve Splittable ParDo translation
  Fix RawPTransform translation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/513c952f
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/513c952f
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/513c952f

Branch: refs/heads/DSL_SQL
Commit: 513c952fa65dbddee167160f6efabb634133425d
Parents: 7c608c3 1b00d95
Author: Kenneth Knowles 
Authored: Tue Jun 6 13:13:40 2017 -0700
Committer: Kenneth Knowles 
Committed: Tue Jun 6 13:13:40 2017 -0700

--
 .../construction/PTransformTranslation.java | 57 
 .../core/construction/ParDoTranslation.java | 20 +++
 .../core/construction/SplittableParDo.java  | 18 +--
 .../core/construction/ParDoTranslationTest.java | 35 +++-
 .../core/SplittableParDoViaKeyedWorkItems.java  | 10 +++-
 runners/direct-java/pom.xml |  5 --
 .../beam/runners/direct/DirectGroupByKey.java   |  5 +-
 .../direct/KeyedPValueTrackingVisitor.java  |  2 +-
 .../direct/ParDoMultiOverrideFactory.java   |  3 +-
 .../direct/TestStreamEvaluatorFactory.java  |  3 +-
 .../direct/TransformEvaluatorRegistry.java  |  8 +--
 .../runners/direct/ViewOverrideFactory.java |  3 +-
 .../src/main/proto/beam_runner_api.proto|  3 ++
 13 files changed, 138 insertions(+), 34 deletions(-)
--




[37/50] beam git commit: [BEAM-245] This closes #592

2017-06-07 Thread davor
[BEAM-245] This closes #592


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/c189d5c0
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/c189d5c0
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/c189d5c0

Branch: refs/heads/DSL_SQL
Commit: c189d5c0e4582b446564db9cbf1ae06970f6079d
Parents: 3cc4ff6 0b0bb3d
Author: Jean-Baptiste Onofré 
Authored: Wed Jun 7 08:05:15 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Wed Jun 7 08:05:15 2017 +0200

--
 sdks/java/io/cassandra/pom.xml  | 113 
 .../beam/sdk/io/cassandra/CassandraIO.java  | 510 +++
 .../beam/sdk/io/cassandra/CassandraService.java |  66 +++
 .../sdk/io/cassandra/CassandraServiceImpl.java  | 398 +++
 .../beam/sdk/io/cassandra/package-info.java |  22 +
 .../beam/sdk/io/cassandra/CassandraIOIT.java| 254 +
 .../beam/sdk/io/cassandra/CassandraIOTest.java  | 279 ++
 .../io/cassandra/CassandraServiceImplTest.java  | 138 +
 .../sdk/io/cassandra/CassandraTestDataSet.java  | 153 ++
 .../sdk/io/common/IOTestPipelineOptions.java|  10 +
 sdks/java/io/pom.xml|   1 +
 11 files changed, 1944 insertions(+)
--




[41/50] beam git commit: [BEAM-2407] Fix Flink CoderTypeSerializer ConfigSnapshot

2017-06-07 Thread davor
[BEAM-2407] Fix Flink CoderTypeSerializer ConfigSnapshot

Before, the config snapshot was not deserializable because there was no
default constructor and read()/write() were not implemented.

This also changes the compatibility-check logic to compare the class
name of the Coder to avoid serializing the coder using Java
Serialization.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/62b942a0
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/62b942a0
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/62b942a0

Branch: refs/heads/DSL_SQL
Commit: 62b942a02ec633c172d543946be9cfe0648825ea
Parents: b2de3db
Author: Aljoscha Krettek 
Authored: Mon Jun 5 09:10:59 2017 +0200
Committer: Aljoscha Krettek 
Committed: Wed Jun 7 19:43:11 2017 +0200

--
 .../translation/types/CoderTypeSerializer.java  | 41 +++-
 1 file changed, 32 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/62b942a0/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
index bea562e..ecfd3fb 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/types/CoderTypeSerializer.java
@@ -19,6 +19,7 @@ package org.apache.beam.runners.flink.translation.types;
 
 import java.io.EOFException;
 import java.io.IOException;
+import java.util.Objects;
 import org.apache.beam.runners.flink.translation.wrappers.DataInputViewWrapper;
 import 
org.apache.beam.runners.flink.translation.wrappers.DataOutputViewWrapper;
 import org.apache.beam.sdk.coders.Coder;
@@ -139,24 +140,28 @@ public class CoderTypeSerializer extends 
TypeSerializer {
 
   @Override
   public CompatibilityResult 
ensureCompatibility(TypeSerializerConfigSnapshot configSnapshot) {
-if (configSnapshot instanceof CoderTypeSerializerConfigSnapshot) {
-  if (coder.equals(((CoderTypeSerializerConfigSnapshot) 
configSnapshot).coder)) {
-return CompatibilityResult.compatible();
-  }
+if (snapshotConfiguration().equals(configSnapshot)) {
+  return CompatibilityResult.compatible();
 }
 return CompatibilityResult.requiresMigration();
   }
 
   /**
-   *  TypeSerializerConfigSnapshot of CoderTypeSerializer.
+   *  TypeSerializerConfigSnapshot of CoderTypeSerializer. This uses the class 
name of the
+   *  {@link Coder} to determine compatibility. This is a bit crude but better 
than using
+   *  Java Serialization to (de)serialize the {@link Coder}.
*/
   public static class CoderTypeSerializerConfigSnapshot extends 
TypeSerializerConfigSnapshot {
 
 private static final int VERSION = 1;
-private Coder coder;
+private String coderName;
+
+public CoderTypeSerializerConfigSnapshot() {
+  // empty constructor for satisfying IOReadableWritable which is used for 
deserialization
+}
 
 public CoderTypeSerializerConfigSnapshot(Coder coder) {
-  this.coder = coder;
+  this.coderName = coder.getClass().getCanonicalName();
 }
 
 @Override
@@ -175,13 +180,31 @@ public class CoderTypeSerializer extends 
TypeSerializer {
 
   CoderTypeSerializerConfigSnapshot that = 
(CoderTypeSerializerConfigSnapshot) o;
 
-  return coder != null ? coder.equals(that.coder) : that.coder == null;
+  return coderName != null ? coderName.equals(that.coderName) : 
that.coderName == null;
+}
+
+@Override
+public void write(DataOutputView out) throws IOException {
+  super.write(out);
+  out.writeUTF(coderName);
+}
+
+@Override
+public void read(DataInputView in) throws IOException {
+  super.read(in);
+  this.coderName = in.readUTF();
 }
 
 @Override
 public int hashCode() {
-  return coder.hashCode();
+  return Objects.hash(coderName);
 }
   }
 
+  @Override
+  public String toString() {
+return "CoderTypeSerializer{"
++ "coder=" + coder
++ '}';
+  }
 }
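
The compatibility check in this patch avoids Java-serializing the Coder by recording and comparing only its class name. As a rough illustration of that idea outside Flink (a hedged sketch with made-up coder class names, not part of the archived patch), a config snapshot can store the coder's class name and compare names on restore:

```python
class CoderConfigSnapshot:
    """Records only the coder's class name, mirroring the idea in
    CoderTypeSerializerConfigSnapshot: comparing names avoids having to
    (de)serialize the coder itself to decide compatibility."""

    def __init__(self, coder):
        self.coder_name = type(coder).__qualname__

    def is_compatible_with(self, other):
        # Same class name => assume the serialization formats match;
        # otherwise the caller should treat it as "requires migration".
        return self.coder_name == other.coder_name


class VarIntCoder:  # stand-in coder classes for the example
    pass


class StrUtf8Coder:
    pass


old = CoderConfigSnapshot(VarIntCoder())
new = CoderConfigSnapshot(VarIntCoder())
changed = CoderConfigSnapshot(StrUtf8Coder())

print(old.is_compatible_with(new))      # True: same coder class
print(old.is_compatible_with(changed))  # False: requires migration
```

This is intentionally crude, as the patch's own javadoc notes: two coders of the same class with different configuration would be declared compatible.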



[35/50] beam git commit: This closes #3306

2017-06-07 Thread davor
This closes #3306


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/3cc4ff6d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/3cc4ff6d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/3cc4ff6d

Branch: refs/heads/DSL_SQL
Commit: 3cc4ff6d72cf74e91b0a0d9cdd4277288958c242
Parents: b6347d0 e641997
Author: chamik...@google.com 
Authored: Tue Jun 6 22:17:18 2017 -0700
Committer: chamik...@google.com 
Committed: Tue Jun 6 22:17:18 2017 -0700

--
 .../examples/cookbook/bigquery_tornadoes.py   | 11 +--
 sdks/python/apache_beam/io/gcp/bigquery.py|  2 +-
 .../runners/dataflow/dataflow_runner.py   | 18 ++
 3 files changed, 24 insertions(+), 7 deletions(-)
--




[02/50] beam git commit: [BEAM-1347] Add additional logging

2017-06-07 Thread davor
[BEAM-1347] Add additional logging

This closes #3296


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1cc6dc12
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1cc6dc12
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1cc6dc12

Branch: refs/heads/DSL_SQL
Commit: 1cc6dc12002da5818640438fa37c69ba6ed5eccb
Parents: bf2d300 7905def
Author: Luke Cwik 
Authored: Mon Jun 5 15:04:21 2017 -0700
Committer: Luke Cwik 
Committed: Mon Jun 5 15:04:21 2017 -0700

--
 .../org/apache/beam/fn/harness/control/BeamFnControlClient.java | 3 ++-
 .../org/apache/beam/fn/harness/data/BeamFnDataGrpcClient.java   | 5 -
 .../apache/beam/fn/harness/data/BeamFnDataGrpcMultiplexer.java  | 3 ++-
 3 files changed, 8 insertions(+), 3 deletions(-)
--




[46/50] beam git commit: [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when executing with the Fn API.

2017-06-07 Thread davor
[BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when executing 
with the Fn API.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1cdb80cb
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1cdb80cb
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1cdb80cb

Branch: refs/heads/DSL_SQL
Commit: 1cdb80cb6319c04fa94961c14c038a5e15736d68
Parents: 5f7e73b
Author: Luke Cwik 
Authored: Wed Jun 7 08:53:14 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 13:41:20 2017 -0700

--
 .../beam/runners/dataflow/DataflowRunner.java   | 158 +--
 1 file changed, 145 insertions(+), 13 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/1cdb80cb/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
--
diff --git 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
index ed29330..3e7c8ce 100644
--- 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
+++ 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java
@@ -49,6 +49,7 @@ import java.net.URLClassLoader;
 import java.nio.channels.Channels;
 import java.util.ArrayList;
 import java.util.Arrays;
+import java.util.Collection;
 import java.util.Collections;
 import java.util.HashSet;
 import java.util.List;
@@ -57,6 +58,7 @@ import java.util.Random;
 import java.util.Set;
 import java.util.SortedSet;
 import java.util.TreeSet;
+import org.apache.beam.runners.core.construction.CoderTranslation;
 import org.apache.beam.runners.core.construction.DeduplicatedFlattenFactory;
 import org.apache.beam.runners.core.construction.EmptyFlattenAsCreateFactory;
 import org.apache.beam.runners.core.construction.PTransformMatchers;
@@ -79,10 +81,12 @@ import org.apache.beam.sdk.Pipeline.PipelineVisitor;
 import org.apache.beam.sdk.PipelineResult.State;
 import org.apache.beam.sdk.PipelineRunner;
 import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.coders.ByteArrayCoder;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.Coder.NonDeterministicException;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
+import org.apache.beam.sdk.common.runner.v1.RunnerApi;
 import org.apache.beam.sdk.extensions.gcp.storage.PathValidator;
 import org.apache.beam.sdk.io.BoundedSource;
 import org.apache.beam.sdk.io.FileSystems;
@@ -103,6 +107,7 @@ import org.apache.beam.sdk.runners.TransformHierarchy;
 import org.apache.beam.sdk.runners.TransformHierarchy.Node;
 import org.apache.beam.sdk.transforms.Combine;
 import org.apache.beam.sdk.transforms.Combine.GroupedValues;
+import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.DoFn;
 import org.apache.beam.sdk.transforms.PTransform;
 import org.apache.beam.sdk.transforms.ParDo;
@@ -113,6 +118,7 @@ import org.apache.beam.sdk.transforms.View;
 import org.apache.beam.sdk.transforms.WithKeys;
 import org.apache.beam.sdk.transforms.display.DisplayData;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.util.CoderUtils;
 import org.apache.beam.sdk.util.InstanceBuilder;
 import org.apache.beam.sdk.util.MimeTypes;
 import org.apache.beam.sdk.util.NameUtils;
@@ -312,6 +318,12 @@ public class DataflowRunner extends 
PipelineRunner {
 PTransformMatchers.classEqualTo(PubsubUnboundedSink.class),
 new StreamingPubsubIOWriteOverrideFactory(this)));
   }
+  if (hasExperiment(options, "beam_fn_api")) {
+overridesBuilder.add(
+PTransformOverride.of(
+PTransformMatchers.classEqualTo(Create.Values.class),
+new StreamingFnApiCreateOverrideFactory()));
+  }
   overridesBuilder
   .add(
   // Streaming Bounded Read is implemented in terms of Streaming 
Unbounded Read, and
@@ -428,15 +440,12 @@ public class DataflowRunner extends 
PipelineRunner {
 public PTransformReplacement> 
getReplacementTransform(
 AppliedPTransform, PTransform>> transform) {
   PTransform> original = transform.getTransform();
-  PCollection output =
-  (PCollection) 
Iterables.getOnlyElement(transform.getOutputs().values());
   return PTransformReplacement.of(
   transform.getPipeline().begin(),
   InstanceBuilder.ofType(replacement)
   .withArg(DataflowRunner.class, runner)
   .withArg(
   (Class>>) 
original.getClass()
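
The override registered in this commit replaces Create.Values with an Impulse primitive followed by a DoFn that decodes elements encoded at pipeline-construction time. A minimal Python sketch of that pattern (illustrative only; `pickle` stands in for Beam coders, and the real override is the Java code in this diff):

```python
import pickle


def impulse():
    # Impulse emits a single empty-byte element to seed the pipeline.
    return [b'']


def create_via_impulse(elements):
    """Sketch of the Create-as-Impulse pattern: encode the elements up
    front, then expand them from the impulse element inside a 'DoFn'."""
    encoded = pickle.dumps(list(elements))  # stand-in for Beam coders

    def decode_fn(_seed):
        # The 'DoFn' ignores the seed and emits the decoded elements.
        for element in pickle.loads(encoded):
            yield element

    out = []
    for seed in impulse():
        out.extend(decode_fn(seed))
    return out


print(create_via_impulse([1, 2, 3]))  # [1, 2, 3]
```

Expressing Create this way means the runner only needs to support the Impulse primitive plus ParDo, which is what makes it suitable for Fn API execution.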

[03/50] beam git commit: Add CreatePCollectionView translation

2017-06-07 Thread davor
Add CreatePCollectionView translation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/c3b036a2
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/c3b036a2
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/c3b036a2

Branch: refs/heads/DSL_SQL
Commit: c3b036a243c768546f0273e22fb44eaa2fcfb245
Parents: ae7bc1d
Author: Kenneth Knowles 
Authored: Thu May 25 06:56:23 2017 -0700
Committer: Kenneth Knowles 
Committed: Mon Jun 5 19:48:27 2017 -0700

--
 .../CreatePCollectionViewTranslation.java   | 126 +
 .../construction/PTransformTranslation.java |  10 +-
 .../CreatePCollectionViewTranslationTest.java   | 136 +++
 3 files changed, 270 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/c3b036a2/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/CreatePCollectionViewTranslation.java
--
diff --git 
a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/CreatePCollectionViewTranslation.java
 
b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/CreatePCollectionViewTranslation.java
new file mode 100644
index 000..aa24909
--- /dev/null
+++ 
b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/CreatePCollectionViewTranslation.java
@@ -0,0 +1,126 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.beam.runners.core.construction;
+
+import static com.google.common.base.Preconditions.checkArgument;
+
+import com.google.auto.service.AutoService;
+import com.google.protobuf.Any;
+import com.google.protobuf.ByteString;
+import com.google.protobuf.BytesValue;
+import java.io.IOException;
+import java.util.Collections;
+import java.util.Map;
+import 
org.apache.beam.runners.core.construction.PTransformTranslation.TransformPayloadTranslator;
+import org.apache.beam.sdk.common.runner.v1.RunnerApi;
+import org.apache.beam.sdk.common.runner.v1.RunnerApi.FunctionSpec;
+import org.apache.beam.sdk.runners.AppliedPTransform;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ParDo;
+import org.apache.beam.sdk.transforms.View;
+import org.apache.beam.sdk.transforms.View.CreatePCollectionView;
+import org.apache.beam.sdk.util.SerializableUtils;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.PCollectionView;
+
+/**
+ * Utility methods for translating a {@link View} transforms to and from 
{@link RunnerApi}
+ * representations.
+ *
+ * @deprecated this should generally be done as part of {@link ParDo} 
translation, or moved into a
+ * dedicated runners-core-construction auxiliary class
+ */
+@Deprecated
+public class CreatePCollectionViewTranslation {
+
+  /**
+   * @deprecated Since {@link CreatePCollectionView} is not a part of the Beam 
model, there is no
+   * SDK-agnostic specification. Using this method means your runner is 
tied to Java.
+   */
+  @Deprecated
+  public static  PCollectionView getView(
+  AppliedPTransform<
+  PCollection, PCollectionView,
+  PTransform, PCollectionView>>
+  application)
+  throws IOException {
+
+RunnerApi.PTransform transformProto =
+PTransformTranslation.toProto(
+application,
+Collections.>emptyList(),
+SdkComponents.create());
+
+checkArgument(
+
PTransformTranslation.CREATE_VIEW_TRANSFORM_URN.equals(transformProto.getSpec().getUrn()),
+"Illegal attempt to extract %s from transform %s with name \"%s\" and 
URN \"%s\"",
+PCollectionView.class.getSimpleName(),
+application.getTransform(),
+application.getFullName(),
+transformProto.getSpec().getUrn());
+
+return (PCollectionView)
+SerializableUtils.deserializeFromByteArray(
+transformProto
+.getSpec()
+ 

[04/50] beam git commit: Clarify javadoc on PTransformTranslation

2017-06-07 Thread davor
Clarify javadoc on PTransformTranslation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/ae7bc1d7
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/ae7bc1d7
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/ae7bc1d7

Branch: refs/heads/DSL_SQL
Commit: ae7bc1d781f793d5091b70bab1c788b795866a8f
Parents: 1cc6dc1
Author: Kenneth Knowles 
Authored: Mon Jun 5 19:47:31 2017 -0700
Committer: Kenneth Knowles 
Committed: Mon Jun 5 19:48:27 2017 -0700

--
 .../beam/runners/core/construction/PTransformTranslation.java  | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/ae7bc1d7/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
--
diff --git 
a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
 
b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
index fd3f9f3..fcbe84b 100644
--- 
a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
+++ 
b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
@@ -29,6 +29,7 @@ import java.util.List;
 import java.util.Map;
 import java.util.ServiceLoader;
 import javax.annotation.Nullable;
+import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.common.runner.v1.RunnerApi;
 import org.apache.beam.sdk.common.runner.v1.RunnerApi.FunctionSpec;
 import org.apache.beam.sdk.runners.AppliedPTransform;
@@ -123,7 +124,10 @@ public class PTransformTranslation {
   }
 
   /**
-   * Translates a non-composite {@link AppliedPTransform} into a runner API 
proto.
+   * Translates a composite {@link AppliedPTransform} into a runner API proto 
with no component
+   * transforms.
+   *
+   * This should not be used when translating a {@link Pipeline}.
*
* Does not register the {@code appliedPTransform} within the provided 
{@link SdkComponents}.
*/



[28/50] beam git commit: This closes #3298

2017-06-07 Thread davor
This closes #3298


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/be26dd30
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/be26dd30
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/be26dd30

Branch: refs/heads/DSL_SQL
Commit: be26dd301a0da28451dbebe8d846e0265311d3fe
Parents: 6853d8e 3a9c00a
Author: Davor Bonaci 
Authored: Tue Jun 6 15:44:54 2017 -0700
Committer: Davor Bonaci 
Committed: Tue Jun 6 15:44:54 2017 -0700

--
 .../beam/sdk/io/gcp/spanner/SpannerWriteIT.java | 32 ++--
 1 file changed, 23 insertions(+), 9 deletions(-)
--




[44/50] beam git commit: This closes #2286

2017-06-07 Thread davor
This closes #2286


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/609016d7
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/609016d7
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/609016d7

Branch: refs/heads/DSL_SQL
Commit: 609016d700c84800cf942482fb7cd2ddaa420b00
Parents: b2de3db 8f4fa43
Author: Aljoscha Krettek 
Authored: Wed Jun 7 19:43:19 2017 +0200
Committer: Aljoscha Krettek 
Committed: Wed Jun 7 19:43:19 2017 +0200

--
 .../translation/types/CoderTypeSerializer.java  |  41 ++-
 .../streaming/io/UnboundedSourceWrapper.java|   2 +
 .../flink/streaming/TestCountingSource.java |  48 ++-
 .../streaming/UnboundedSourceWrapperTest.java   | 309 +++
 4 files changed, 254 insertions(+), 146 deletions(-)
--




[08/50] beam git commit: Remove the FnOutputT parameter from DoFnOperator

2017-06-07 Thread davor
Remove the FnOutputT parameter from DoFnOperator


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/e8f26085
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/e8f26085
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/e8f26085

Branch: refs/heads/DSL_SQL
Commit: e8f26085e889f8f618c0961a5458cbc42b432c01
Parents: b0601fd
Author: JingsongLi 
Authored: Tue Jun 6 17:31:09 2017 +0800
Committer: Aljoscha Krettek 
Committed: Tue Jun 6 14:33:36 2017 +0200

--
 .../FlinkStreamingTransformTranslators.java | 10 +-
 .../wrappers/streaming/DoFnOperator.java| 20 ++--
 .../streaming/SplittableDoFnOperator.java   | 12 ++--
 .../wrappers/streaming/WindowDoFnOperator.java  |  2 +-
 .../beam/runners/flink/PipelineOptionsTest.java |  6 +++---
 .../flink/streaming/DoFnOperatorTest.java   | 11 +--
 6 files changed, 30 insertions(+), 31 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/e8f26085/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
index d8c3049..2a7c5d6 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
@@ -332,7 +332,7 @@ class FlinkStreamingTransformTranslators {
   static class ParDoTranslationHelper {
 
 interface DoFnOperatorFactory {
-  DoFnOperator createDoFnOperator(
+  DoFnOperator createDoFnOperator(
   DoFn doFn,
   String stepName,
   List> sideInputs,
@@ -395,7 +395,7 @@ class FlinkStreamingTransformTranslators {
   context.getCoder((PCollection) 
outputs.get(mainOutputTag)));
 
   if (sideInputs.isEmpty()) {
-DoFnOperator doFnOperator =
+DoFnOperator doFnOperator =
 doFnOperatorFactory.createDoFnOperator(
 doFn,
 context.getCurrentTransform().getFullName(),
@@ -416,7 +416,7 @@ class FlinkStreamingTransformTranslators {
 Tuple2>, DataStream> 
transformedSideInputs =
 transformSideInputs(sideInputs, context);
 
-DoFnOperator doFnOperator =
+DoFnOperator doFnOperator =
 doFnOperatorFactory.createDoFnOperator(
 doFn,
 context.getCurrentTransform().getFullName(),
@@ -493,7 +493,7 @@ class FlinkStreamingTransformTranslators {
   context,
   new ParDoTranslationHelper.DoFnOperatorFactory() {
 @Override
-public DoFnOperator createDoFnOperator(
+public DoFnOperator createDoFnOperator(
 DoFn doFn,
 String stepName,
 List> sideInputs,
@@ -547,7 +547,7 @@ class FlinkStreamingTransformTranslators {
 @Override
 public DoFnOperator<
 KeyedWorkItem>,
-OutputT, OutputT> createDoFnOperator(
+OutputT> createDoFnOperator(
 DoFn<
 KeyedWorkItem>,
 OutputT> doFn,

http://git-wip-us.apache.org/repos/asf/beam/blob/e8f26085/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
index 8c27ed9..350f323 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
@@ -94,21 +94,21 @@ import org.joda.time.Instant;
  * Flink operator for executing {@link DoFn DoFns}.
  *
  * @param  the input type of the {@link DoFn}
- * @param  the output type of the {@link DoFn}
+ * @param  the output type of the {@link DoFn}
  * @param  the output type of the operator, this can be different 
from the fn output
  * type when we have additional tagged outputs
  */
-public class DoFnOperator
+public class DoFnOperator
 extends AbstractStreamOperator>
 implements OneInputStreamOperator, 
WindowedValue>,
   TwoInputStreamOperator, RawUnionValue, 
WindowedValue>,
 KeyGroupCheckpointedOperator, Triggerable

[29/50] beam git commit: Fix imports in sdk_worker.

2017-06-07 Thread davor
Fix imports in sdk_worker.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/efe8e1f4
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/efe8e1f4
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/efe8e1f4

Branch: refs/heads/DSL_SQL
Commit: efe8e1f41d78e06bef7ab5c9d72a7c65f553c5a3
Parents: be26dd3
Author: Robert Bradshaw 
Authored: Tue Jun 6 15:14:12 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 16:12:39 2017 -0700

--
 sdks/python/apache_beam/runners/worker/sdk_worker.py | 6 --
 1 file changed, 4 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/efe8e1f4/sdks/python/apache_beam/runners/worker/sdk_worker.py
--
diff --git a/sdks/python/apache_beam/runners/worker/sdk_worker.py 
b/sdks/python/apache_beam/runners/worker/sdk_worker.py
index 596bb90..33c50ad 100644
--- a/sdks/python/apache_beam/runners/worker/sdk_worker.py
+++ b/sdks/python/apache_beam/runners/worker/sdk_worker.py
@@ -35,7 +35,8 @@ from google.protobuf import wrappers_pb2
 from apache_beam.coders import coder_impl
 from apache_beam.coders import WindowedValueCoder
 from apache_beam.internal import pickler
-from apache_beam.runners.dataflow.native_io import iobase
+from apache_beam.io import iobase
+from apache_beam.runners.dataflow.native_io import iobase as native_iobase
 from apache_beam.utils import counters
 from apache_beam.runners.api import beam_fn_api_pb2
 from apache_beam.runners.worker import operation_specs
@@ -126,7 +127,8 @@ class DataInputOperation(RunnerIOOperation):
 # custom sources without forcing intermediate materialization.  This seems very
 # related to the desire to inject key and window preserving [Splittable]DoFns
 # into the view computation.
-class SideInputSource(iobase.NativeSource, iobase.NativeSourceReader):
+class SideInputSource(native_iobase.NativeSource,
+  native_iobase.NativeSourceReader):
   """A 'source' for reading side inputs via state API calls.
   """
 



[18/50] beam git commit: Migrate Python tests to not depend on fixed sharding for file output

2017-06-07 Thread davor
Migrate Python tests to not depend on fixed sharding for file output


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/b5c257d5
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/b5c257d5
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/b5c257d5

Branch: refs/heads/DSL_SQL
Commit: b5c257d5fa2e3445a37a8154bde706392c23c305
Parents: 513c952
Author: Charles Chen 
Authored: Mon Jun 5 16:31:13 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 13:55:13 2017 -0700

--
 .../complete/juliaset/juliaset/juliaset_test.py |  5 +++--
 .../apache_beam/examples/complete/tfidf_test.py |  5 +++--
 .../examples/cookbook/group_with_coder_test.py  |  5 +++--
 .../examples/cookbook/mergecontacts_test.py |  3 ++-
 .../examples/cookbook/multiple_output_pardo_test.py | 11 ++-
 .../examples/wordcount_debugging_test.py|  3 ++-
 .../apache_beam/examples/wordcount_minimal_test.py  |  3 ++-
 sdks/python/apache_beam/examples/wordcount_test.py  |  3 ++-
 sdks/python/apache_beam/testing/util.py | 16 
 9 files changed, 39 insertions(+), 15 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/b5c257d5/sdks/python/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py
--
diff --git 
a/sdks/python/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py 
b/sdks/python/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py
index 17d9cf3..91c75aa 100644
--- 
a/sdks/python/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py
+++ 
b/sdks/python/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py
@@ -25,6 +25,7 @@ import unittest
 
 
 from apache_beam.examples.complete.juliaset.juliaset import juliaset
+from apache_beam.testing.util import open_shards
 
 
 class JuliaSetTest(unittest.TestCase):
@@ -60,8 +61,8 @@ class JuliaSetTest(unittest.TestCase):
 
 # Parse the results from the file, and ensure it was written in the proper
 # format.
-with open(self.test_files['output_coord_file_name'] +
-  '-0-of-1') as result_file:
+with open_shards(self.test_files['output_coord_file_name'] +
+ '-*-of-*') as result_file:
   output_lines = result_file.readlines()
 
   # Should have a line for each x-coordinate.

http://git-wip-us.apache.org/repos/asf/beam/blob/b5c257d5/sdks/python/apache_beam/examples/complete/tfidf_test.py
--
diff --git a/sdks/python/apache_beam/examples/complete/tfidf_test.py 
b/sdks/python/apache_beam/examples/complete/tfidf_test.py
index 322426f..b6f8825 100644
--- a/sdks/python/apache_beam/examples/complete/tfidf_test.py
+++ b/sdks/python/apache_beam/examples/complete/tfidf_test.py
@@ -28,6 +28,7 @@ from apache_beam.examples.complete import tfidf
 from apache_beam.testing.test_pipeline import TestPipeline
 from apache_beam.testing.util import assert_that
 from apache_beam.testing.util import equal_to
+from apache_beam.testing.util import open_shards
 
 
 EXPECTED_RESULTS = set([
@@ -76,8 +77,8 @@ class TfIdfTest(unittest.TestCase):
 '--output', os.path.join(temp_folder, 'result')])
 # Parse result file and compare.
 results = []
-with open(os.path.join(temp_folder,
-   'result-0-of-1')) as result_file:
+with open_shards(os.path.join(
+temp_folder, 'result-*-of-*')) as result_file:
   for line in result_file:
 match = re.search(EXPECTED_LINE_RE, line)
 logging.info('Result line: %s', line)

http://git-wip-us.apache.org/repos/asf/beam/blob/b5c257d5/sdks/python/apache_beam/examples/cookbook/group_with_coder_test.py
--
diff --git a/sdks/python/apache_beam/examples/cookbook/group_with_coder_test.py 
b/sdks/python/apache_beam/examples/cookbook/group_with_coder_test.py
index 268ba8d..fb630ba 100644
--- a/sdks/python/apache_beam/examples/cookbook/group_with_coder_test.py
+++ b/sdks/python/apache_beam/examples/cookbook/group_with_coder_test.py
@@ -22,6 +22,7 @@ import tempfile
 import unittest
 
 from apache_beam.examples.cookbook import group_with_coder
+from apache_beam.testing.util import open_shards
 
 
 # Patch group_with_coder.PlayerCoder.decode(). To test that the PlayerCoder was
@@ -53,7 +54,7 @@ class GroupWithCoderTest(unittest.TestCase):
 '--output=%s.result' % temp_path])
 # Parse result file and compare.
 results = []
-with open(temp_path + '.result-0-of-1') as result_file:
+with open_shards(temp_path + '.result-*-of-*') as result_file:
   for line in result_file:
 name, points = line.split(',')
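
The migration above stops hard-coding `-0-of-1` shard suffixes and instead reads whatever shards exist through an `open_shards` utility. A hedged reimplementation of that idea (not the actual `apache_beam.testing.util` code) can glob the shard pattern and concatenate the matching files:

```python
import glob
import io
import os
import tempfile


def open_shards(pattern):
    """Return a file-like object concatenating all shard files matching
    `pattern` (e.g. 'result-*-of-*'), so tests need not assume how many
    shards a runner produced. Sketch of the utility referenced above."""
    contents = []
    # glob order is not guaranteed, so sort for deterministic output.
    for path in sorted(glob.glob(pattern)):
        with open(path) as shard:
            contents.append(shard.read())
    return io.StringIO(''.join(contents))


# Usage: write two shards, then read them back as one stream.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, 'result-00000-of-00002'), 'w') as f:
    f.write('a\n')
with open(os.path.join(tmp, 'result-00001-of-00002'), 'w') as f:
    f.write('b\n')

lines = open_shards(os.path.join(tmp, 'result-*-of-*')).readlines()
print(lines)  # ['a\n', 'b\n']
```

Decoupling tests from the shard count lets runners choose their own output sharding without breaking assertions.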
   

[09/50] beam git commit: [BEAM-1498] Use Flink-native side outputs

2017-06-07 Thread davor
[BEAM-1498] Use Flink-native side outputs


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/b0601fd4
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/b0601fd4
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/b0601fd4

Branch: refs/heads/DSL_SQL
Commit: b0601fd43e0929e8b925dbe566e564460f91d9fc
Parents: 88f78fa
Author: JingsongLi 
Authored: Sun Jun 4 21:56:10 2017 +0800
Committer: Aljoscha Krettek 
Committed: Tue Jun 6 14:33:36 2017 +0200

--
 .../FlinkStreamingTransformTranslators.java | 145 ++-
 .../wrappers/streaming/DoFnOperator.java|  40 +++--
 .../wrappers/streaming/WindowDoFnOperator.java  |   4 +-
 .../beam/runners/flink/PipelineOptionsTest.java |   5 +-
 .../flink/streaming/DoFnOperatorTest.java   |  65 +
 5 files changed, 112 insertions(+), 147 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/b0601fd4/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
index 00e9934..d8c3049 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/FlinkStreamingTransformTranslators.java
@@ -18,9 +18,6 @@
 
 package org.apache.beam.runners.flink;
 
-import static com.google.common.base.Preconditions.checkArgument;
-
-import com.google.common.collect.Lists;
 import com.google.common.collect.Maps;
 import java.nio.ByteBuffer;
 import java.util.ArrayList;
@@ -29,7 +26,6 @@ import java.util.Collections;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
-import java.util.Map.Entry;
 import org.apache.beam.runners.core.KeyedWorkItem;
 import org.apache.beam.runners.core.SplittableParDoViaKeyedWorkItems;
 import org.apache.beam.runners.core.SystemReduceFn;
@@ -84,16 +80,15 @@ import org.apache.flink.api.java.functions.KeySelector;
 import org.apache.flink.api.java.tuple.Tuple2;
 import org.apache.flink.api.java.typeutils.GenericTypeInfo;
 import org.apache.flink.api.java.typeutils.ResultTypeQueryable;
-import org.apache.flink.streaming.api.collector.selector.OutputSelector;
 import org.apache.flink.streaming.api.datastream.DataStream;
 import org.apache.flink.streaming.api.datastream.DataStreamSource;
 import org.apache.flink.streaming.api.datastream.KeyedStream;
 import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
-import org.apache.flink.streaming.api.datastream.SplitStream;
 import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
 import org.apache.flink.streaming.api.operators.TwoInputStreamOperator;
 import org.apache.flink.streaming.api.transformations.TwoInputTransformation;
 import org.apache.flink.util.Collector;
+import org.apache.flink.util.OutputTag;
 
 /**
  * This class contains all the mappings between Beam and Flink
@@ -337,7 +332,7 @@ class FlinkStreamingTransformTranslators {
   static class ParDoTranslationHelper {
 
 interface DoFnOperatorFactory {
-  DoFnOperator createDoFnOperator(
+  DoFnOperator createDoFnOperator(
   DoFn doFn,
   String stepName,
   List> sideInputs,
@@ -345,7 +340,7 @@ class FlinkStreamingTransformTranslators {
   List> additionalOutputTags,
   FlinkStreamingTranslationContext context,
   WindowingStrategy windowingStrategy,
-  Map, Integer> tagsToLabels,
+  Map, OutputTag>> tagsToLabels,
   Coder> inputCoder,
   Coder keyCoder,
   Map> transformedSideInputs);
@@ -354,7 +349,6 @@ class FlinkStreamingTransformTranslators {
 static  void translateParDo(
 String transformName,
 DoFn doFn,
-String stepName,
 PCollection input,
 List> sideInputs,
 Map, PValue> outputs,
@@ -366,10 +360,15 @@ class FlinkStreamingTransformTranslators {
  // we assume that the transformation does not change the windowing strategy.
   WindowingStrategy windowingStrategy = input.getWindowingStrategy();
 
-  Map, Integer> tagsToLabels =
-  transformTupleTagsToLabels(mainOutputTag, outputs);
+  Map, OutputTag>> tagsToOutputTags = 
Maps.newHashMap();
+  for (Map.Entry, PValue> entry : outputs.entrySet()) {
+if (!tagsToOutputTags.containsKey(entry.getKey())) {
+  tagsToOutputTags.put(entry.getKey(), new 
OutputTag<>(entry.getKey().getId(),
+  (TypeInformation) context.getTypeInfo((PCollection) 
entry.getValue(;
+}
+  

[12/50] beam git commit: This closes #3289

2017-06-07 Thread davor
This closes #3289


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/a0545508
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/a0545508
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/a0545508

Branch: refs/heads/DSL_SQL
Commit: a05455088de338b46582a55df974865561dc70e7
Parents: aebd3a4 dbab052
Author: Pei He 
Authored: Tue Jun 6 23:19:25 2017 +0800
Committer: Pei He 
Committed: Tue Jun 6 23:19:25 2017 +0800

--
 runners/flink/pom.xml   |   1 -
 .../streaming/state/FlinkStateInternals.java| 205 ++-
 2 files changed, 202 insertions(+), 4 deletions(-)
--




[11/50] beam git commit: Flink runner: support MapState in FlinkStateInternals.

2017-06-07 Thread davor
Flink runner: support MapState in FlinkStateInternals.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/dbab052c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/dbab052c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/dbab052c

Branch: refs/heads/DSL_SQL
Commit: dbab052c4456ff51dd4ce44979c77a508acc17e9
Parents: aebd3a4
Author: 波特 
Authored: Thu May 18 12:23:20 2017 +0800
Committer: Pei He 
Committed: Tue Jun 6 23:18:33 2017 +0800

--
 runners/flink/pom.xml   |   1 -
 .../streaming/state/FlinkStateInternals.java| 205 ++-
 2 files changed, 202 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/dbab052c/runners/flink/pom.xml
--
diff --git a/runners/flink/pom.xml b/runners/flink/pom.xml
index 92f95a0..c4c6b55 100644
--- a/runners/flink/pom.xml
+++ b/runners/flink/pom.xml
@@ -92,7 +92,6 @@
 org.apache.beam.sdk.testing.FlattenWithHeterogeneousCoders,
 org.apache.beam.sdk.testing.LargeKeys$Above100MB,
 org.apache.beam.sdk.testing.UsesSetState,
-org.apache.beam.sdk.testing.UsesMapState,
 org.apache.beam.sdk.testing.UsesCommittedMetrics,
 org.apache.beam.sdk.testing.UsesTestStream,
 org.apache.beam.sdk.testing.UsesSplittableParDo

http://git-wip-us.apache.org/repos/asf/beam/blob/dbab052c/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
--
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
index b73abe9..f0d3278 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
@@ -26,6 +26,7 @@ import org.apache.beam.runners.core.StateInternals;
 import org.apache.beam.runners.core.StateNamespace;
 import org.apache.beam.runners.core.StateTag;
 import org.apache.beam.runners.flink.translation.types.CoderTypeInformation;
+import org.apache.beam.runners.flink.translation.types.CoderTypeSerializer;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.CoderException;
 import org.apache.beam.sdk.coders.InstantCoder;
@@ -46,6 +47,7 @@ import 
org.apache.beam.sdk.transforms.windowing.TimestampCombiner;
 import org.apache.beam.sdk.util.CoderUtils;
 import org.apache.beam.sdk.util.CombineContextFactory;
 import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.state.MapStateDescriptor;
 import org.apache.flink.api.common.state.ValueStateDescriptor;
 import org.apache.flink.api.common.typeutils.base.StringSerializer;
 import org.apache.flink.runtime.state.KeyedStateBackend;
@@ -132,11 +134,11 @@ public class FlinkStateInternals implements StateInternals {
 
   @Override
   public  MapState bindMap(
-  StateTag> spec,
+  StateTag> address,
   Coder mapKeyCoder,
   Coder mapValueCoder) {
-throw new UnsupportedOperationException(
-String.format("%s is not supported", MapState.class.getSimpleName()));
+return new FlinkMapState<>(
+flinkStateBackend, address, namespace, mapKeyCoder, 
mapValueCoder);
   }
 
   @Override
@@ -1029,4 +1031,201 @@ public class FlinkStateInternals implements StateInternals {
   return result;
 }
   }
+
+  private static class FlinkMapState implements MapState {
+
+private final StateNamespace namespace;
+private final StateTag> address;
+private final MapStateDescriptor flinkStateDescriptor;
+private final KeyedStateBackend flinkStateBackend;
+
+FlinkMapState(
+KeyedStateBackend flinkStateBackend,
+StateTag> address,
+StateNamespace namespace,
+Coder mapKeyCoder, Coder mapValueCoder) {
+  this.namespace = namespace;
+  this.address = address;
+  this.flinkStateBackend = flinkStateBackend;
+  this.flinkStateDescriptor = new MapStateDescriptor<>(address.getId(),
+  new CoderTypeSerializer<>(mapKeyCoder), new 
CoderTypeSerializer<>(mapValueCoder));
+}
+
+@Override
+public ReadableState get(final KeyT input) {
+  return new ReadableState() {
+@Override
+public Val

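For readers unfamiliar with the `MapState` surface this commit implements, here is a toy dict-backed analogue (illustrative only — not Beam or Flink API): `get` returns a deferred `ReadableState` handle rather than the value itself, mirroring how `FlinkMapState.get` defers the lookup to the Flink state backend.

```python
class ReadableState:
    # Deferred read, loosely mirroring Beam's ReadableState.read().
    def __init__(self, supplier):
        self._supplier = supplier

    def read(self):
        return self._supplier()


class DictMapState:
    # Toy MapState backed by a plain dict; class and method names are
    # chosen for illustration, not taken from the Beam SDK.
    def __init__(self):
        self._entries = {}

    def put(self, key, value):
        self._entries[key] = value

    def get(self, key):
        # Return a deferred handle instead of the value itself.
        return ReadableState(lambda: self._entries.get(key))

    def remove(self, key):
        self._entries.pop(key, None)
```

In the real implementation each operation round-trips through coders and the keyed state backend, but the get/put/remove shape is the same.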
[06/50] beam git commit: [BEAM-2276] Cleanups on the windowed DefaultFilenamePolicy

2017-06-07 Thread davor
[BEAM-2276] Cleanups on the windowed DefaultFilenamePolicy


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/e764167f
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/e764167f
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/e764167f

Branch: refs/heads/DSL_SQL
Commit: e764167f40e603ac00ac80758cac0108bcc49769
Parents: 6d64c6e
Author: Reuven Lax 
Authored: Thu May 25 23:42:17 2017 -0700
Committer: Jean-Baptiste Onofré 
Committed: Tue Jun 6 11:08:35 2017 +0200

--
 .../construction/PTransformMatchersTest.java|   8 +-
 .../direct/WriteWithShardingFactoryTest.java|  23 +++--
 .../java/org/apache/beam/sdk/io/AvroIO.java |   2 +-
 .../beam/sdk/io/DefaultFilenamePolicy.java  | 103 +--
 .../java/org/apache/beam/sdk/io/TFRecordIO.java |   2 +-
 .../java/org/apache/beam/sdk/io/TextIO.java |   2 +-
 .../java/org/apache/beam/sdk/io/AvroIOTest.java |  11 +-
 .../beam/sdk/io/DefaultFilenamePolicyTest.java  |  26 ++---
 .../java/org/apache/beam/sdk/io/TextIOTest.java |   5 +-
 .../org/apache/beam/sdk/io/xml/XmlSink.java |   2 +-
 10 files changed, 69 insertions(+), 115 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/e764167f/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
--
diff --git a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
index cfea62f..2497598 100644
--- a/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
+++ b/runners/core-construction-java/src/test/java/org/apache/beam/runners/core/construction/PTransformMatchersTest.java
@@ -504,8 +504,12 @@ public class PTransformMatchersTest implements Serializable {
   @Test
   public void writeWithRunnerDeterminedSharding() {
 ResourceId outputDirectory = LocalResources.fromString("/foo/bar", true /* isDirectory */);
-FilenamePolicy policy = DefaultFilenamePolicy.constructUsingStandardParameters(
-StaticValueProvider.of(outputDirectory), DefaultFilenamePolicy.DEFAULT_SHARD_TEMPLATE, "");
+FilenamePolicy policy =
+DefaultFilenamePolicy.constructUsingStandardParameters(
+StaticValueProvider.of(outputDirectory),
+DefaultFilenamePolicy.DEFAULT_UNWINDOWED_SHARD_TEMPLATE,
+"",
+false);
 WriteFiles write =
 WriteFiles.to(
 new 
FileBasedSink(StaticValueProvider.of(outputDirectory), policy) {

http://git-wip-us.apache.org/repos/asf/beam/blob/e764167f/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
--
diff --git a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
index 5c4fea1..a88d95e 100644
--- a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
+++ b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/WriteWithShardingFactoryTest.java
@@ -129,15 +129,20 @@ public class WriteWithShardingFactoryTest {
   @Test
   public void withNoShardingSpecifiedReturnsNewTransform() {
 ResourceId outputDirectory = LocalResources.fromString("/foo", true /* 
isDirectory */);
-FilenamePolicy policy = 
DefaultFilenamePolicy.constructUsingStandardParameters(
-StaticValueProvider.of(outputDirectory), 
DefaultFilenamePolicy.DEFAULT_SHARD_TEMPLATE, "");
-WriteFiles original = WriteFiles.to(
-new FileBasedSink(StaticValueProvider.of(outputDirectory), 
policy) {
-  @Override
-  public WriteOperation createWriteOperation() {
-throw new IllegalArgumentException("Should not be used");
-  }
-});
+FilenamePolicy policy =
+DefaultFilenamePolicy.constructUsingStandardParameters(
+StaticValueProvider.of(outputDirectory),
+DefaultFilenamePolicy.DEFAULT_UNWINDOWED_SHARD_TEMPLATE,
+"",
+false);
+WriteFiles original =
+WriteFiles.to(
+new FileBasedSink(StaticValueProvider.of(outputDirectory), 
policy) {
+  @Override
+  public WriteOperation createWriteOperation() {
+throw new IllegalArgumentException("Should not be used");
+  }
+});
 @SuppressWarnings("unchecked")
 PCollection objs = (PCollection) 
p.a

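The calls above now pass `DEFAULT_UNWINDOWED_SHARD_TEMPLATE` plus a boolean distinguishing windowed from unwindowed writes. The shard-template mechanics behind names like `-SSSSS-of-NNNNN` can be sketched as follows (a simplified assumption-based sketch: the real `DefaultFilenamePolicy` also substitutes window and pane information, and its template syntax may differ):

```python
import re


def expand_shard_template(template, shard_number, shard_count):
    # Replace runs of 'S' with the zero-padded shard number and runs
    # of 'N' with the zero-padded total shard count.
    out = re.sub('S+', lambda m: str(shard_number).zfill(len(m.group(0))), template)
    out = re.sub('N+', lambda m: str(shard_count).zfill(len(m.group(0))), out)
    return out
```

For example, `expand_shard_template('-SSSSS-of-NNNNN', 3, 10)` yields `-00003-of-00010`.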
[16/50] beam git commit: Improve Splittable ParDo translation

2017-06-07 Thread davor
Improve Splittable ParDo translation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1b00d95a
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1b00d95a
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1b00d95a

Branch: refs/heads/DSL_SQL
Commit: 1b00d95a1105d2611b985dc463da0884a6646354
Parents: 840492d
Author: Kenneth Knowles 
Authored: Thu May 25 06:29:16 2017 -0700
Committer: Kenneth Knowles 
Committed: Tue Jun 6 13:13:12 2017 -0700

--
 .../core/construction/ParDoTranslation.java | 20 +++
 .../core/construction/SplittableParDo.java  | 18 --
 .../core/construction/ParDoTranslationTest.java | 35 +++-
 .../core/SplittableParDoViaKeyedWorkItems.java  | 10 +-
 .../direct/KeyedPValueTrackingVisitor.java  |  2 +-
 .../src/main/proto/beam_runner_api.proto|  3 ++
 6 files changed, 82 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/1b00d95a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ParDoTranslation.java
--
diff --git a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ParDoTranslation.java b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ParDoTranslation.java
index fe66179..34e0d86 100644
--- a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ParDoTranslation.java
+++ b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ParDoTranslation.java
@@ -144,6 +144,7 @@ public class ParDoTranslation {
 
 ParDoPayload.Builder builder = ParDoPayload.newBuilder();
 builder.setDoFn(toProto(parDo.getFn(), parDo.getMainOutputTag()));
+builder.setSplittable(signature.processElement().isSplittable());
 for (PCollectionView sideInput : parDo.getSideInputs()) {
   builder.putSideInputs(sideInput.getTagInternal().getId(), 
toProto(sideInput));
 }
@@ -496,6 +497,25 @@ public class ParDoTranslation {
 .build();
   }
 
+  private static  ParDoPayload getParDoPayload(AppliedPTransform transform)
+  throws IOException {
+return PTransformTranslation.toProto(
+transform, Collections.>emptyList(), SdkComponents.create())
+.getSpec()
+.getParameter()
+.unpack(ParDoPayload.class);
+  }
+
+  public static boolean usesStateOrTimers(AppliedPTransform transform) throws IOException {
+ParDoPayload payload = getParDoPayload(transform);
+return payload.getStateSpecsCount() > 0 || payload.getTimerSpecsCount() > 0;
+  }
+
+  public static boolean isSplittable(AppliedPTransform transform) throws IOException {
+ParDoPayload payload = getParDoPayload(transform);
+return payload.getSplittable();
+  }
+
   private static ViewFn viewFnFromProto(SdkFunctionSpec viewFn)
   throws InvalidProtocolBufferException {
 FunctionSpec spec = viewFn.getSpec();

http://git-wip-us.apache.org/repos/asf/beam/blob/1b00d95a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SplittableParDo.java
--
diff --git a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SplittableParDo.java b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SplittableParDo.java
index dfca7d2..665e39d 100644
--- a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SplittableParDo.java
+++ b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/SplittableParDo.java
@@ -22,6 +22,7 @@ import static 
com.google.common.base.Preconditions.checkNotNull;
 
 import java.util.List;
 import java.util.UUID;
+import 
org.apache.beam.runners.core.construction.PTransformTranslation.RawPTransform;
 import org.apache.beam.sdk.annotations.Experimental;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.transforms.DoFn;
@@ -67,6 +68,12 @@ public class SplittableParDo
   public static final String SPLITTABLE_PROCESS_URN =
   "urn:beam:runners_core:transforms:splittable_process:v1";
 
+  public static final String SPLITTABLE_PROCESS_KEYED_ELEMENTS_URN =
+  "urn:beam:runners_core:transforms:splittable_process_keyed_elements:v1";
+
+  public static final String SPLITTABLE_GBKIKWI_URN =
+  "urn:beam:runners_core:transforms:splittable_gbkikwi:v1";
+
   /**
* Creates the transform for the given original multi-output {@link ParDo}.
*
@@ -133,11 +140,11 @@ public class SplittableParDo
 
   /**
* Runner-specific primitive {@link PTransform} that invokes the {@link 
DoF

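The new `usesStateOrTimers` and `isSplittable` helpers inspect the serialized `ParDoPayload` rather than the Java `DoFn` object itself. The decision logic, sketched over a dict-shaped payload (illustrative only; the real code unpacks a `ParDoPayload` protobuf, and the field names below are assumptions):

```python
def uses_state_or_timers(pardo_payload):
    # True when the ParDo declared any state cells or timers.
    return bool(pardo_payload.get('state_specs')) or bool(pardo_payload.get('timer_specs'))


def is_splittable(pardo_payload):
    # Mirrors the new 'splittable' bit added to beam_runner_api.proto.
    return bool(pardo_payload.get('splittable'))
```

Keying these runner decisions off the portable payload is what lets a runner translate a pipeline it received over the Runner API without having the user's `DoFn` class on its classpath.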
[13/50] beam git commit: Do not fail when gen_protos cannot be imported

2017-06-07 Thread davor
Do not fail when gen_protos cannot be imported


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/ae3dc5f3
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/ae3dc5f3
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/ae3dc5f3

Branch: refs/heads/DSL_SQL
Commit: ae3dc5f313ad55f3f86805b9f220bd1cdf1c902b
Parents: a054550
Author: Ahmet Altay 
Authored: Thu Jun 1 11:29:42 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 12:58:26 2017 -0700

--
 sdks/python/setup.py | 20 
 1 file changed, 12 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/ae3dc5f3/sdks/python/setup.py
--
diff --git a/sdks/python/setup.py b/sdks/python/setup.py
index 596c8c5..051043b 100644
--- a/sdks/python/setup.py
+++ b/sdks/python/setup.py
@@ -125,14 +125,18 @@ GCP_REQUIREMENTS = [
 
 # We must generate protos after setup_requires are installed.
 def generate_protos_first(original_cmd):
-  # See https://issues.apache.org/jira/browse/BEAM-2366
-  # pylint: disable=wrong-import-position
-  import gen_protos
-  class cmd(original_cmd, object):
-def run(self):
-  gen_protos.generate_proto_files()
-  super(cmd, self).run()
-  return cmd
+  try:
+# See https://issues.apache.org/jira/browse/BEAM-2366
+# pylint: disable=wrong-import-position
+import gen_protos
+class cmd(original_cmd, object):
+  def run(self):
+gen_protos.generate_proto_files()
+super(cmd, self).run()
+return cmd
+  except ImportError:
+warnings.warn("Could not import gen_protos, skipping proto generation.")
+return original_cmd
 
 
 setuptools.setup(



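The pattern in the diff above, generalized: subclass a setuptools command so a pre-step runs before it, but degrade gracefully to the unwrapped class when the helper module is missing. In this sketch a hypothetical `pre_step_module` with a `run_pre_step()` function stands in for Beam's `gen_protos`:

```python
import warnings


def with_pre_step(original_cmd, module_name='pre_step_module'):
    # Wrap original_cmd so the optional module's work runs first;
    # fall back to the plain command if the module cannot be imported.
    try:
        module = __import__(module_name)
    except ImportError:
        warnings.warn('Could not import %s, skipping pre-step.' % module_name)
        return original_cmd

    class Wrapped(original_cmd, object):
        def run(self):
            module.run_pre_step()
            super(Wrapped, self).run()

    return Wrapped
```

Falling back (instead of failing `setup.py` outright) matters for sdist installs, where the generator script is not shipped alongside the package.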
[19/50] beam git commit: This closes #3299

2017-06-07 Thread davor
This closes #3299


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/e3139a38
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/e3139a38
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/e3139a38

Branch: refs/heads/DSL_SQL
Commit: e3139a38ff533c21fbc70f85eafcb1d68b52a4b0
Parents: 513c952 b5c257d
Author: Ahmet Altay 
Authored: Tue Jun 6 13:55:25 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 13:55:25 2017 -0700

--
 .../complete/juliaset/juliaset/juliaset_test.py |  5 +++--
 .../apache_beam/examples/complete/tfidf_test.py |  5 +++--
 .../examples/cookbook/group_with_coder_test.py  |  5 +++--
 .../examples/cookbook/mergecontacts_test.py |  3 ++-
 .../examples/cookbook/multiple_output_pardo_test.py | 11 ++-
 .../examples/wordcount_debugging_test.py|  3 ++-
 .../apache_beam/examples/wordcount_minimal_test.py  |  3 ++-
 sdks/python/apache_beam/examples/wordcount_test.py  |  3 ++-
 sdks/python/apache_beam/testing/util.py | 16 
 9 files changed, 39 insertions(+), 15 deletions(-)
--




[22/50] beam git commit: [BEAM-2405] Write to BQ using the streaming API

2017-06-07 Thread davor
[BEAM-2405] Write to BQ using the streaming API


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1498684d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1498684d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1498684d

Branch: refs/heads/DSL_SQL
Commit: 1498684dfea31594a236edd7fde5d299e4b0aa1e
Parents: fa3922b
Author: Sourabh Bajaj 
Authored: Fri Jun 2 20:32:48 2017 -0700
Committer: chamik...@google.com 
Committed: Tue Jun 6 14:04:33 2017 -0700

--
 sdks/python/apache_beam/io/gcp/bigquery.py  | 177 +++
 sdks/python/apache_beam/io/gcp/bigquery_test.py | 172 ++
 2 files changed, 349 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/1498684d/sdks/python/apache_beam/io/gcp/bigquery.py
--
diff --git a/sdks/python/apache_beam/io/gcp/bigquery.py b/sdks/python/apache_beam/io/gcp/bigquery.py
index 201c798..9069f73 100644
--- a/sdks/python/apache_beam/io/gcp/bigquery.py
+++ b/sdks/python/apache_beam/io/gcp/bigquery.py
@@ -115,6 +115,9 @@ from apache_beam.internal.gcp import auth
 from apache_beam.internal.gcp.json_value import from_json_value
 from apache_beam.internal.gcp.json_value import to_json_value
 from apache_beam.runners.dataflow.native_io import iobase as dataflow_io
+from apache_beam.transforms import DoFn
+from apache_beam.transforms import ParDo
+from apache_beam.transforms import PTransform
 from apache_beam.transforms.display import DisplayDataItem
 from apache_beam.utils import retry
 from apache_beam.options.pipeline_options import GoogleCloudOptions
@@ -134,6 +137,7 @@ __all__ = [
 'BigQueryDisposition',
 'BigQuerySource',
 'BigQuerySink',
+'WriteToBigQuery',
 ]
 
 JSON_COMPLIANCE_ERROR = 'NAN, INF and -INF values are not JSON compliant.'
@@ -813,6 +817,7 @@ class BigQueryWrapper(object):
 request = bigquery.BigqueryTablesInsertRequest(
 projectId=project_id, datasetId=dataset_id, table=table)
 response = self.client.tables.Insert(request)
+logging.debug("Created the table with id %s", table_id)
 # The response is a bigquery.Table instance.
 return response
 
@@ -1134,3 +1139,175 @@ class BigQueryWrapper(object):
   else:
 result[field.name] = self._convert_cell_value_to_dict(value, field)
 return result
+
+
+class BigQueryWriteFn(DoFn):
+  """A ``DoFn`` that streams writes to BigQuery once the table is created.
+  """
+
+  def __init__(self, table_id, dataset_id, project_id, batch_size, schema,
+   create_disposition, write_disposition, client):
+"""Initialize a WriteToBigQuery transform.
+
+Args:
+  table_id: The ID of the table. The ID must contain only letters
+(a-z, A-Z), numbers (0-9), or underscores (_). If dataset argument is
+None then the table argument must contain the entire table reference
+specified as: 'DATASET.TABLE' or 'PROJECT:DATASET.TABLE'.
+  dataset_id: The ID of the dataset containing this table or null if the
+table reference is specified entirely by the table argument.
+  project_id: The ID of the project containing this table or null if the
+table reference is specified entirely by the table argument.
+  batch_size: Number of rows to be written to BQ per streaming API insert.
+  schema: The schema to be used if the BigQuery table to write has to be
+created. This can be either specified as a 'bigquery.TableSchema' 
object
+or a single string  of the form 
'field1:type1,field2:type2,field3:type3'
+that defines a comma separated list of fields. Here 'type' should
+specify the BigQuery type of the field. Single string based schemas do
+not support nested fields, repeated fields, or specifying a BigQuery
+mode for fields (mode will always be set to 'NULLABLE').
+  create_disposition: A string describing what happens if the table does 
not
+exist. Possible values are:
+- BigQueryDisposition.CREATE_IF_NEEDED: create if does not exist.
+- BigQueryDisposition.CREATE_NEVER: fail the write if does not exist.
+  write_disposition: A string describing what happens if the table has
+already some data. Possible values are:
+-  BigQueryDisposition.WRITE_TRUNCATE: delete existing rows.
+-  BigQueryDisposition.WRITE_APPEND: add to existing rows.
+-  BigQueryDisposition.WRITE_EMPTY: fail the write if table not empty.
+For streaming pipelines WriteTruncate can not be used.
+  test_client: Override the default bigquery client used for testing.
+"""
+self.table_id = table_id
+self.dataset_id = dataset_id
+self.project_id = project_id
+self.schema = schema
+self.client 

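`BigQueryWriteFn` buffers incoming rows and issues a streaming insert once `batch_size` rows have accumulated, flushing any remainder when the bundle finishes. The buffering logic in isolation, with a hypothetical `flush_fn` standing in for the BigQuery streaming-insert call (this is an illustrative sketch, not the Beam class):

```python
class BatchingWriter:
    # Collect rows and hand them to flush_fn in groups of batch_size.
    def __init__(self, batch_size, flush_fn):
        self.batch_size = batch_size
        self.flush_fn = flush_fn
        self._rows = []

    def process(self, row):
        self._rows.append(row)
        if len(self._rows) >= self.batch_size:
            self._flush()

    def finish(self):
        # Flush the partial batch at end of bundle.
        if self._rows:
            self._flush()

    def _flush(self):
        self.flush_fn(list(self._rows))
        self._rows = []
```

Batching amortizes the per-request overhead of the streaming API while keeping each insert below the service's request-size limits.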
[24/50] beam git commit: Whitelist find for tox environments

2017-06-07 Thread davor
Whitelist find for tox environments


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/dcfb31ff
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/dcfb31ff
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/dcfb31ff

Branch: refs/heads/DSL_SQL
Commit: dcfb31ffe46a037e69967c8f6c054562d4d9b3b9
Parents: fdfd775
Author: Sourabh Bajaj 
Authored: Tue Jun 6 11:58:53 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 14:13:36 2017 -0700

--
 sdks/python/tox.ini | 2 ++
 1 file changed, 2 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/dcfb31ff/sdks/python/tox.ini
--
diff --git a/sdks/python/tox.ini b/sdks/python/tox.ini
index 2166f6a..eff91fe 100644
--- a/sdks/python/tox.ini
+++ b/sdks/python/tox.ini
@@ -29,6 +29,7 @@ select = E3
 deps =
   nose==1.3.7
   grpcio-tools==1.3.5
+whitelist_externals=find
 commands =
   python --version
   # Clean up all previous python generated files.
@@ -73,6 +74,7 @@ passenv = TRAVIS*
 # autocomplete_test depends on nose when invoked directly.
 deps =
   nose==1.3.7
+whitelist_externals=find
 commands =
   pip install -e .[test,gcp]
   python --version



[47/50] beam git commit: [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when executing with the Fn API.

2017-06-07 Thread davor
[BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when executing 
with the Fn API.

This closes #3312


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/caecac3b
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/caecac3b
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/caecac3b

Branch: refs/heads/DSL_SQL
Commit: caecac3b4acb5bfa6e36143d3868b2d80ab119da
Parents: 609016d 1cdb80c
Author: Luke Cwik 
Authored: Wed Jun 7 13:43:38 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 13:43:38 2017 -0700

--
 .../beam/runners/dataflow/DataflowRunner.java   | 154 ++-
 1 file changed, 146 insertions(+), 8 deletions(-)
--




[33/50] beam git commit: This closes #3307: Increase visibility of some Metrics methods

2017-06-07 Thread davor
This closes #3307: Increase visibility of some Metrics methods


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/b6347d02
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/b6347d02
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/b6347d02

Branch: refs/heads/DSL_SQL
Commit: b6347d02ce3086a01e8dd59f80103e45f6af1b5c
Parents: 1d2000d 39674ca
Author: Kenneth Knowles 
Authored: Tue Jun 6 19:43:53 2017 -0700
Committer: Kenneth Knowles 
Committed: Tue Jun 6 19:43:53 2017 -0700

--
 .../apache/beam/runners/core/metrics/CounterCell.java| 10 +++---
 .../org/apache/beam/runners/core/metrics/DirtyState.java |  4 +++-
 .../beam/runners/core/metrics/DistributionCell.java  | 10 +++---
 .../org/apache/beam/runners/core/metrics/GaugeCell.java  | 11 +++
 .../beam/runners/core/metrics/MetricsContainerImpl.java  |  4 +++-
 .../org/apache/beam/sdk/metrics/MetricsContainer.java|  3 ++-
 6 files changed, 29 insertions(+), 13 deletions(-)
--




[48/50] beam git commit: Shutdown Flink Streaming Pipeline when reaching +Inf watermark

2017-06-07 Thread davor
Shutdown Flink Streaming Pipeline when reaching +Inf watermark


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/9c83ffe0
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/9c83ffe0
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/9c83ffe0

Branch: refs/heads/DSL_SQL
Commit: 9c83ffe0cdc6636d2187bf9439a73a3b45756d50
Parents: caecac3
Author: Aljoscha Krettek 
Authored: Mon Jun 5 12:19:00 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed Jun 7 23:13:52 2017 +0200

--
 .../wrappers/streaming/io/UnboundedSourceWrapper.java   | 5 +
 1 file changed, 5 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/9c83ffe0/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
--
diff --git a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
index 6055a43..e75072a 100644
--- a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
+++ b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
@@ -31,6 +31,7 @@ import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.SerializableCoder;
 import org.apache.beam.sdk.io.UnboundedSource;
 import org.apache.beam.sdk.options.PipelineOptions;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
 import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
 import org.apache.beam.sdk.transforms.windowing.PaneInfo;
 import org.apache.beam.sdk.util.WindowedValue;
@@ -436,6 +437,10 @@ public class UnboundedSourceWrapper<
   }
 }
 context.emitWatermark(new Watermark(watermarkMillis));
+
+if (watermarkMillis >= BoundedWindow.TIMESTAMP_MAX_VALUE.getMillis()) {
+  this.isRunning = false;
+}
   }
   setNextWatermarkTimer(this.runtimeContext);
 }



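The fix above stops the source loop once the emitted watermark reaches `BoundedWindow.TIMESTAMP_MAX_VALUE`, so a streaming pipeline over a finite source can actually terminate. The control flow in miniature (Python sketch; `MAX_TIMESTAMP_MILLIS` is a stand-in, not Beam's actual constant):

```python
MAX_TIMESTAMP_MILLIS = 10 ** 15  # stand-in for BoundedWindow.TIMESTAMP_MAX_VALUE


class SourceLoop:
    def __init__(self):
        self.is_running = True
        self.emitted = []

    def emit_watermark(self, watermark_millis):
        self.emitted.append(watermark_millis)
        # Shut the source down once the watermark passes end of time,
        # mirroring the new isRunning = false branch in the diff above.
        if watermark_millis >= MAX_TIMESTAMP_MILLIS:
            self.is_running = False
```

A watermark at the maximum timestamp means no further elements can arrive, so keeping the operator alive past that point only blocks job completion.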
[42/50] beam git commit: [BEAM-1779] Port UnboundedSourceWrapperTest to use Flink operator test harness

2017-06-07 Thread davor
[BEAM-1779] Port UnboundedSourceWrapperTest to use Flink operator test harness


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/c1dc8f53
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/c1dc8f53
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/c1dc8f53

Branch: refs/heads/DSL_SQL
Commit: c1dc8f53c5438b575a7e84e9f680616ead49d61e
Parents: 62b942a
Author: Aljoscha Krettek 
Authored: Wed Mar 22 11:43:30 2017 +0100
Committer: Aljoscha Krettek 
Committed: Wed Jun 7 19:43:11 2017 +0200

--
 .../flink/streaming/TestCountingSource.java |  48 +++--
 .../streaming/UnboundedSourceWrapperTest.java   | 198 +++
 2 files changed, 110 insertions(+), 136 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/c1dc8f53/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/TestCountingSource.java
--
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/TestCountingSource.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/TestCountingSource.java
index 3a08088..edf548a 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/TestCountingSource.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/TestCountingSource.java
@@ -133,18 +133,8 @@ public class TestCountingSource
   public Coder getCheckpointMarkCoder() {
 return DelegateCoder.of(
 VarIntCoder.of(),
-new DelegateCoder.CodingFunction() {
-  @Override
-  public Integer apply(CounterMark input) {
-return input.current;
-  }
-},
-new DelegateCoder.CodingFunction() {
-  @Override
-  public CounterMark apply(Integer input) {
-return new CounterMark(input);
-  }
-});
+new FromCounterMark(),
+new ToCounterMark());
   }
 
   @Override
@@ -251,4 +241,38 @@ public class TestCountingSource
   public Coder> getDefaultOutputCoder() {
 return KvCoder.of(VarIntCoder.of(), VarIntCoder.of());
   }
+
+  private class FromCounterMark implements DelegateCoder.CodingFunction {
+@Override
+public Integer apply(CounterMark input) {
+  return input.current;
+}
+
+@Override
+public int hashCode() {
+  return FromCounterMark.class.hashCode();
+}
+
+@Override
+public boolean equals(Object obj) {
+  return obj instanceof FromCounterMark;
+}
+  }
+
+  private class ToCounterMark implements DelegateCoder.CodingFunction {
+@Override
+public CounterMark apply(Integer input) {
+  return new CounterMark(input);
+}
+
+@Override
+public int hashCode() {
+  return ToCounterMark.class.hashCode();
+}
+
+@Override
+public boolean equals(Object obj) {
+  return obj instanceof ToCounterMark;
+}
+  }
 }
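The patch above swaps anonymous `DelegateCoder.CodingFunction` instances for named inner classes with class-based `equals`/`hashCode`. A self-contained sketch (hypothetical names, with `java.util.function.Function` standing in for `CodingFunction`) of why that matters for coder equality:

```java
import java.util.function.Function;

public class CoderEqualitySketch {
    // Named class with class-based equals/hashCode, mirroring the
    // FromCounterMark/ToCounterMark classes introduced by the patch.
    static class FromMark implements Function<Integer, Integer> {
        @Override public Integer apply(Integer input) { return input; }
        @Override public int hashCode() { return FromMark.class.hashCode(); }
        @Override public boolean equals(Object obj) { return obj instanceof FromMark; }
    }

    public static void main(String[] args) {
        // Two separately created lambdas (like two anonymous classes) are
        // never equal, so a coder built from them twice would not compare
        // equal either; the named class restores structural equality.
        Function<Integer, Integer> f1 = x -> x;
        Function<Integer, Integer> f2 = x -> x;
        System.out.println(f1.equals(f2));                          // false
        System.out.println(new FromMark().equals(new FromMark()));  // true
    }
}
```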

http://git-wip-us.apache.org/repos/asf/beam/blob/c1dc8f53/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
--
diff --git a/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java b/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
index e3875bc..716e71d 100644
--- a/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
+++ b/runners/flink/src/test/java/org/apache/beam/runners/flink/streaming/UnboundedSourceWrapperTest.java
@@ -20,36 +20,20 @@ package org.apache.beam.runners.flink.streaming;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertTrue;
 import static org.junit.Assert.fail;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
 
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collection;
-import java.util.Collections;
 import java.util.HashSet;
-import java.util.List;
 import java.util.Set;
 import org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapper;
 import org.apache.beam.sdk.coders.Coder;
-import org.apache.beam.sdk.io.UnboundedSource;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.ValueWithRecordId;
-import org.apache.flink.api.common.ExecutionConfig;
-import org.apache.flink.api.common.accumulators.Accumulator;
-import org.apache.flink.api.common.state.ListState;
-import org.apache.flink.api.common.state.ListStateDescriptor;
-import org.apache.flink.api.common.state.OperatorStat

[39/50] beam git commit: [BEAM-1231] Add missed "kind:bytes" to CloudObjectKinds/CloudObjectTranslators

2017-06-07 Thread davor
[BEAM-1231] Add missed "kind:bytes" to CloudObjectKinds/CloudObjectTranslators


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/d302a0f3
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/d302a0f3
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/d302a0f3

Branch: refs/heads/DSL_SQL
Commit: d302a0f3bcb89d1a2367efb90925ab5abe70f7dd
Parents: 78b6e3c
Author: Luke Cwik 
Authored: Wed Jun 7 07:48:09 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 07:53:42 2017 -0700

--
 .../org/apache/beam/runners/dataflow/util/CloudObjectKinds.java | 1 +
 .../beam/runners/dataflow/util/CloudObjectTranslators.java  | 5 +++--
 2 files changed, 4 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/d302a0f3/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectKinds.java
--
diff --git a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectKinds.java b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectKinds.java
index 403ade2..58bc446 100644
--- a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectKinds.java
+++ b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectKinds.java
@@ -28,4 +28,5 @@ class CloudObjectKinds {
   static final String KIND_PAIR = "kind:pair";
   static final String KIND_STREAM = "kind:stream";
   static final String KIND_WINDOWED_VALUE = "kind:windowed_value";
+  static final String KIND_BYTES = "kind:bytes";
 }

http://git-wip-us.apache.org/repos/asf/beam/blob/d302a0f3/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectTranslators.java
--
diff --git a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectTranslators.java b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectTranslators.java
index 012a669..ad2363d 100644
--- a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectTranslators.java
+++ b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/CloudObjectTranslators.java
@@ -279,7 +279,8 @@ class CloudObjectTranslators {
   @Override
   public CloudObject toCloudObject(ByteArrayCoder target) {
 return addComponents(
-CloudObject.forClass(target.getClass()), Collections.>emptyList());
+CloudObject.forClassName(CloudObjectKinds.KIND_BYTES),
+Collections.>emptyList());
   }
 
   @Override
@@ -294,7 +295,7 @@ class CloudObjectTranslators {
 
   @Override
   public String cloudObjectClassName() {
-return CloudObject.forClass(ByteArrayCoder.class).getClassName();
+return CloudObjectKinds.KIND_BYTES;
   }
 
 };



[27/50] beam git commit: Generate a random table name. Assume Spanner database exists.

2017-06-07 Thread davor
Generate a random table name.
Assume Spanner database exists.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/3a9c00ac
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/3a9c00ac
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/3a9c00ac

Branch: refs/heads/DSL_SQL
Commit: 3a9c00ac7303823490d97f2f0adb5469700687ac
Parents: 6853d8e
Author: Mairbek Khadikov 
Authored: Mon Jun 5 12:29:02 2017 -0700
Committer: Davor Bonaci 
Committed: Tue Jun 6 15:44:44 2017 -0700

--
 .../beam/sdk/io/gcp/spanner/SpannerWriteIT.java | 32 ++--
 1 file changed, 23 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/3a9c00ac/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerWriteIT.java
--
diff --git a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerWriteIT.java b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerWriteIT.java
index 064c65e..8df224b 100644
--- a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerWriteIT.java
+++ b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/spanner/SpannerWriteIT.java
@@ -33,6 +33,7 @@ import com.google.cloud.spanner.SpannerOptions;
 import com.google.cloud.spanner.Statement;
 import com.google.spanner.admin.database.v1.CreateDatabaseMetadata;
 import java.util.Collections;
+
 import org.apache.beam.sdk.io.GenerateSequence;
 import org.apache.beam.sdk.options.Default;
 import org.apache.beam.sdk.options.Description;
@@ -52,6 +53,9 @@ import org.junit.runners.JUnit4;
 /** End-to-end test of Cloud Spanner Sink. */
 @RunWith(JUnit4.class)
 public class SpannerWriteIT {
+
+  private static final int MAX_DB_NAME_LENGTH = 30;
+
   @Rule public final transient TestPipeline p = TestPipeline.create();
 
   /** Pipeline options for this test. */
@@ -66,10 +70,10 @@ public class SpannerWriteIT {
 String getInstanceId();
 void setInstanceId(String value);
 
-@Description("Database ID to write to in Spanner")
+@Description("Database ID prefix to write to in Spanner")
 @Default.String("beam-testdb")
-String getDatabaseId();
-void setDatabaseId(String value);
+String getDatabaseIdPrefix();
+void setDatabaseIdPrefix(String value);
 
 @Description("Table name")
 @Default.String("users")
@@ -80,6 +84,7 @@ public class SpannerWriteIT {
   private Spanner spanner;
   private DatabaseAdminClient databaseAdminClient;
   private SpannerTestPipelineOptions options;
+  private String databaseName;
 
   @Before
   public void setUp() throws Exception {
@@ -88,15 +93,17 @@ public class SpannerWriteIT {
 
 spanner = SpannerOptions.newBuilder().setProjectId(options.getProjectId()).build().getService();
 
+databaseName = generateDatabaseName();
+
 databaseAdminClient = spanner.getDatabaseAdminClient();
 
 // Delete database if exists.
-databaseAdminClient.dropDatabase(options.getInstanceId(), options.getDatabaseId());
+databaseAdminClient.dropDatabase(options.getInstanceId(), databaseName);
 
 Operation op =
 databaseAdminClient.createDatabase(
 options.getInstanceId(),
-options.getDatabaseId(),
+databaseName,
 Collections.singleton(
 "CREATE TABLE "
 + options.getTable()
@@ -107,6 +114,13 @@ public class SpannerWriteIT {
 op.waitFor();
   }
 
+  private String generateDatabaseName() {
+String random = RandomStringUtils
+.randomAlphanumeric(MAX_DB_NAME_LENGTH - 1 - options.getDatabaseIdPrefix().length())
+.toLowerCase();
+return options.getDatabaseIdPrefix() + "-" + random;
+  }
+
   @Test
   public void testWrite() throws Exception {
 p.apply(GenerateSequence.from(0).to(100))
@@ -115,13 +129,13 @@ public class SpannerWriteIT {
 SpannerIO.write()
 .withProjectId(options.getProjectId())
 .withInstanceId(options.getInstanceId())
-.withDatabaseId(options.getDatabaseId()));
+.withDatabaseId(databaseName));
 
 p.run();
 DatabaseClient databaseClient =
 spanner.getDatabaseClient(
 DatabaseId.of(
-options.getProjectId(), options.getInstanceId(), options.getDatabaseId()));
+options.getProjectId(), options.getInstanceId(), databaseName));
 
 ResultSet resultSet =
 databaseClient
@@ -134,7 +148,7 @@ public class SpannerWriteIT {
 
   @After
   public void tearDown() throws Exception {
-databaseAdminClient.dropDatabase(options.getInstanceId(), options.getDatabaseId())

[10/50] beam git commit: This closes #3290

2017-06-07 Thread davor
This closes #3290


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/aebd3a4c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/aebd3a4c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/aebd3a4c

Branch: refs/heads/DSL_SQL
Commit: aebd3a4c5d416edd9f829e9ac1aecb230ac6
Parents: 88f78fa e8f2608
Author: Aljoscha Krettek 
Authored: Tue Jun 6 14:33:46 2017 +0200
Committer: Aljoscha Krettek 
Committed: Tue Jun 6 14:33:46 2017 +0200

--
 .../FlinkStreamingTransformTranslators.java | 145 ++-
 .../wrappers/streaming/DoFnOperator.java|  60 +---
 .../streaming/SplittableDoFnOperator.java   |  12 +-
 .../wrappers/streaming/WindowDoFnOperator.java  |   4 +-
 .../beam/runners/flink/PipelineOptionsTest.java |   9 +-
 .../flink/streaming/DoFnOperatorTest.java   |  70 +
 6 files changed, 132 insertions(+), 168 deletions(-)
--




[15/50] beam git commit: Fix RawPTransform translation

2017-06-07 Thread davor
Fix RawPTransform translation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/840492d9
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/840492d9
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/840492d9

Branch: refs/heads/DSL_SQL
Commit: 840492d9d8fb3b08cfe70a525655759fc1a31fdf
Parents: 7c608c3
Author: Kenneth Knowles 
Authored: Fri May 26 14:18:03 2017 -0700
Committer: Kenneth Knowles 
Committed: Tue Jun 6 13:10:33 2017 -0700

--
 .../construction/PTransformTranslation.java | 57 
 runners/direct-java/pom.xml |  5 --
 .../beam/runners/direct/DirectGroupByKey.java   |  5 +-
 .../direct/ParDoMultiOverrideFactory.java   |  3 +-
 .../direct/TestStreamEvaluatorFactory.java  |  3 +-
 .../direct/TransformEvaluatorRegistry.java  |  8 +--
 .../runners/direct/ViewOverrideFactory.java |  3 +-
 7 files changed, 56 insertions(+), 28 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/840492d9/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
--
diff --git a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
index 7c5c593..32ecf43 100644
--- a/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
+++ b/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
@@ -22,7 +22,6 @@ import static com.google.common.base.Preconditions.checkArgument;
 
 import com.google.common.collect.ImmutableMap;
 import com.google.protobuf.Any;
-import com.google.protobuf.Message;
 import java.io.IOException;
 import java.util.Collections;
 import java.util.List;
@@ -115,7 +114,20 @@ public class PTransformTranslation {
 // TODO: Display Data
 
 PTransform transform = appliedPTransform.getTransform();
-if (KNOWN_PAYLOAD_TRANSLATORS.containsKey(transform.getClass())) {
+// A RawPTransform directly vends its payload. Because it will generally be
+// a subclass, we cannot do dictionary lookup in KNOWN_PAYLOAD_TRANSLATORS.
+if (transform instanceof RawPTransform) {
+  RawPTransform rawPTransform = (RawPTransform) transform;
+
+  if (rawPTransform.getUrn() != null) {
+FunctionSpec.Builder payload = FunctionSpec.newBuilder().setUrn(rawPTransform.getUrn());
+@Nullable Any parameter = rawPTransform.getPayload();
+if (parameter != null) {
+  payload.setParameter(parameter);
+}
+transformBuilder.setSpec(payload);
+  }
+} else if (KNOWN_PAYLOAD_TRANSLATORS.containsKey(transform.getClass())) {
   FunctionSpec payload =
   KNOWN_PAYLOAD_TRANSLATORS
   .get(transform.getClass())
@@ -145,6 +157,25 @@ public class PTransformTranslation {
   }
 
   /**
+   * Returns the URN for the transform if it is known, otherwise {@code null}.
+   */
+  @Nullable
+  public static String urnForTransformOrNull(PTransform transform) {
+
+// A RawPTransform directly vends its URN. Because it will generally be
+// a subclass, we cannot do dictionary lookup in KNOWN_PAYLOAD_TRANSLATORS.
+if (transform instanceof RawPTransform) {
+  return ((RawPTransform) transform).getUrn();
+}
+
+TransformPayloadTranslator translator = KNOWN_PAYLOAD_TRANSLATORS.get(transform.getClass());
+if (translator == null) {
+  return null;
+}
+return translator.getUrn(transform);
+  }
+
+  /**
* Returns the URN for the transform if it is known, otherwise throws.
*/
   public static String urnForTransform(PTransform transform) {
@@ -176,13 +207,14 @@ public class PTransformTranslation {
* fully expanded in the pipeline proto.
*/
   public abstract static class RawPTransform<
-  InputT extends PInput, OutputT extends POutput, PayloadT extends Message>
+  InputT extends PInput, OutputT extends POutput>
   extends PTransform {
 
+@Nullable
 public abstract String getUrn();
 
 @Nullable
-PayloadT getPayload() {
+public Any getPayload() {
   return null;
 }
   }
@@ -190,24 +222,29 @@ public class PTransformTranslation {
   /**
* A translator that uses the explicit URN and payload from a {@link RawPTransform}.
*/
-  public static class RawPTransformTranslator
-  implements TransformPayloadTranslator> {
+  public static class RawPTransformTranslator
+  implements TransformPayloadTranslator> {
 @Override
-public String getUrn(RawPTransform transform) {
+   

[31/50] beam git commit: Fix the staging directory path in copying from GCS

2017-06-07 Thread davor
Fix the staging directory path in copying from GCS


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/171a9930
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/171a9930
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/171a9930

Branch: refs/heads/DSL_SQL
Commit: 171a993044d97c42f027e1ec44436a3b8af7c32f
Parents: 6fed177
Author: Sourabh Bajaj 
Authored: Tue Jun 6 12:55:42 2017 -0700
Committer: Ahmet Altay 
Committed: Tue Jun 6 16:43:58 2017 -0700

--
 .../runners/dataflow/internal/dependency.py   |  7 ++-
 .../runners/dataflow/internal/dependency_test.py  | 14 +-
 2 files changed, 19 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/171a9930/sdks/python/apache_beam/runners/dataflow/internal/dependency.py
--
diff --git a/sdks/python/apache_beam/runners/dataflow/internal/dependency.py b/sdks/python/apache_beam/runners/dataflow/internal/dependency.py
index 3a0ff46..e656600 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/dependency.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/dependency.py
@@ -181,7 +181,12 @@ def _stage_extra_packages(extra_packages, staging_location, temp_dir,
   staging_temp_dir = tempfile.mkdtemp(dir=temp_dir)
 logging.info('Downloading extra package: %s locally before staging',
  package)
-_dependency_file_copy(package, staging_temp_dir)
+if os.path.isfile(staging_temp_dir):
+  local_file_path = staging_temp_dir
+else:
+  _, last_component = FileSystems.split(package)
+  local_file_path = FileSystems.join(staging_temp_dir, last_component)
+_dependency_file_copy(package, local_file_path)
   else:
 raise RuntimeError(
 'The file %s cannot be found. It was specified in the '

http://git-wip-us.apache.org/repos/asf/beam/blob/171a9930/sdks/python/apache_beam/runners/dataflow/internal/dependency_test.py
--
diff --git a/sdks/python/apache_beam/runners/dataflow/internal/dependency_test.py b/sdks/python/apache_beam/runners/dataflow/internal/dependency_test.py
index 5eac7d6..e555b69 100644
--- a/sdks/python/apache_beam/runners/dataflow/internal/dependency_test.py
+++ b/sdks/python/apache_beam/runners/dataflow/internal/dependency_test.py
@@ -31,6 +31,16 @@ from apache_beam.options.pipeline_options import PipelineOptions
 from apache_beam.options.pipeline_options import SetupOptions
 
 
+# Protect against environments where GCS library is not available.
+# pylint: disable=wrong-import-order, wrong-import-position
+try:
+  from apitools.base.py.exceptions import HttpError
+except ImportError:
+  HttpError = None
+# pylint: enable=wrong-import-order, wrong-import-position
+
+
+@unittest.skipIf(HttpError is None, 'GCP dependencies are not installed')
 class SetupTest(unittest.TestCase):
 
   def update_options(self, options):
@@ -369,7 +379,9 @@ class SetupTest(unittest.TestCase):
   if from_path.startswith('gs://'):
 gcs_copied_files.append(from_path)
 _, from_name = os.path.split(from_path)
-self.create_temp_file(os.path.join(to_path, from_name), 'nothing')
+if os.path.isdir(to_path):
+  to_path = os.path.join(to_path, from_name)
+self.create_temp_file(to_path, 'nothing')
 logging.info('Fake copied GCS file: %s to %s', from_path, to_path)
   elif to_path.startswith('gs://'):
 logging.info('Faking file_copy(%s, %s)', from_path, to_path)
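The fix computes a concrete destination file path before copying: if the target is an existing file it is used as-is, otherwise the source's last path component is joined onto the staging directory. The patch itself is Python (`dependency.py`); the sketch below renders the same logic in Java with illustrative names:

```java
import java.io.File;

public class StagingPathSketch {
    // If stagingTempDir is an existing file, use it directly; otherwise
    // treat it as a directory and append the source's last path component,
    // so the copy targets a file path rather than the directory itself.
    static String localCopyDestination(String packagePath, File stagingTempDir) {
        if (stagingTempDir.isFile()) {
            return stagingTempDir.getPath();
        }
        String lastComponent = packagePath.substring(packagePath.lastIndexOf('/') + 1);
        return new File(stagingTempDir, lastComponent).getPath();
    }

    public static void main(String[] args) {
        File dir = new File(System.getProperty("java.io.tmpdir"));
        String dest = localCopyDestination("gs://bucket/pkgs/foo.tar.gz", dir);
        System.out.println(dest.endsWith("foo.tar.gz"));  // true
    }
}
```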



[38/50] beam git commit: Sort IO by alphanumeric order

2017-06-07 Thread davor
Sort IO by alphanumeric order


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/78b6e3cd
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/78b6e3cd
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/78b6e3cd

Branch: refs/heads/DSL_SQL
Commit: 78b6e3cddc5feb882e184a4bfce3af739a33
Parents: c189d5c
Author: Jean-Baptiste Onofré 
Authored: Wed Jun 7 14:18:55 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Wed Jun 7 14:18:55 2017 +0200

--
 sdks/java/io/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/78b6e3cd/sdks/java/io/pom.xml
--
diff --git a/sdks/java/io/pom.xml b/sdks/java/io/pom.xml
index 94fc6a7..44f3baa 100644
--- a/sdks/java/io/pom.xml
+++ b/sdks/java/io/pom.xml
@@ -64,6 +64,7 @@
   
 
   
+cassandra
 common
 elasticsearch
 google-cloud-platform
@@ -78,7 +79,6 @@
 mongodb
 mqtt
 xml
-cassandra
   
 
   



[05/50] beam git commit: This closes #3281: [BEAM-3271] Add CreatePCollectionView translation

2017-06-07 Thread davor
This closes #3281: [BEAM-3271] Add CreatePCollectionView translation

  Add CreatePCollectionView translation
  Clarify javadoc on PTransformTranslation


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/6d64c6ec
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/6d64c6ec
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/6d64c6ec

Branch: refs/heads/DSL_SQL
Commit: 6d64c6ec1be2134b577161f2709ada4e10bfaeb0
Parents: 1cc6dc1 c3b036a
Author: Kenneth Knowles 
Authored: Mon Jun 5 19:49:33 2017 -0700
Committer: Kenneth Knowles 
Committed: Mon Jun 5 19:49:33 2017 -0700

--
 .../CreatePCollectionViewTranslation.java   | 126 +
 .../construction/PTransformTranslation.java |  16 ++-
 .../CreatePCollectionViewTranslationTest.java   | 136 +++
 3 files changed, 275 insertions(+), 3 deletions(-)
--




Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #58

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when

[iemejia] Shutdown Flink Streaming Pipeline when reaching +Inf watermark

--
[...truncated 1.25 MB...]
2017-06-08\T\01:07:39.579 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.3/scala-library-2.10.3.pom
 (2 KB at 73.9 KB/sec)
2017-06-08\T\01:07:39.581 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
2017-06-08\T\01:07:39.609 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/github/scopt/scopt_2.10/3.5.0/scopt_2.10-3.5.0.pom
 (2 KB at 58.4 KB/sec)
2017-06-08\T\01:07:39.611 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
2017-06-08\T\01:07:39.639 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.pom
 (3 KB at 82.8 KB/sec)
2017-06-08\T\01:07:39.642 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
2017-06-08\T\01:07:39.667 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.pom
 (2 KB at 79.8 KB/sec)
2017-06-08\T\01:07:39.669 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
2017-06-08\T\01:07:39.698 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/twitter/chill-java/0.7.4/chill-java-0.7.4.pom
 (2 KB at 68.5 KB/sec)
2017-06-08\T\01:07:39.700 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
2017-06-08\T\01:07:39.727 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-optimizer_2.10/1.3.0/flink-optimizer_2.10-1.3.0.pom
 (5 KB at 156.2 KB/sec)
2017-06-08\T\01:07:39.731 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
2017-06-08\T\01:07:39.759 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
 (7 KB at 215.4 KB/sec)
2017-06-08\T\01:07:39.765 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
2017-06-08\T\01:07:39.792 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
 (4 KB at 117.2 KB/sec)
2017-06-08\T\01:07:39.794 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom
2017-06-08\T\01:07:39.824 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom (26 
KB at 835.5 KB/sec)
2017-06-08\T\01:07:39.825 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom
2017-06-08\T\01:07:39.853 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom (14 KB at 
490.2 KB/sec)
2017-06-08\T\01:07:39.866 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
2017-06-08\T\01:07:39.893 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
 (5 KB at 171.5 KB/sec)
2017-06-08\T\01:07:39.894 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
2017-06-08\T\01:07:39.921 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
 (2 KB at 52.1 KB/sec)
2017-06-08\T\01:07:39.924 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
2017-06-08\T\01:07:39.952 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
 (4 KB at 114.6 KB/sec)
2017-06-08\T\01:07:39.958 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.12.0/curator-test-2.12.0.pom
2017-06-08\T\01:07:39.985 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.12.0/curator-test-2.12.0.pom
 (5 KB at 150.0 KB/sec)
2017-06-08\T\01:07:39.987 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/apache-curator/2.12.0/apache-curator-2.12.0.pom
2017-06-08\T\01:07:40.017 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/apache-curator/2.12.0/apache-curator-2.12

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 8 (on Ubuntu only),beam #58

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when

[iemejia] Shutdown Flink Streaming Pipeline when reaching +Inf watermark

--
[...truncated 1.26 MB...]

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #58

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when

[iemejia] Shutdown Flink Streaming Pipeline when reaching +Inf watermark

--
[...truncated 1.26 MB...]
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
2017-06-08\T\01:02:17.236 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.3.0/flink-streaming-java_2.10-1.3.0.pom
 (7 KB at 194.5 KB/sec)
2017-06-08\T\01:02:17.243 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
2017-06-08\T\01:02:17.274 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.pom
 (4 KB at 102.1 KB/sec)
2017-06-08\T\01:02:17.276 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom
2017-06-08\T\01:02:17.308 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/sling/10/sling-10.pom (26 
KB at 783.3 KB/sec)
2017-06-08\T\01:02:17.310 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom
2017-06-08\T\01:02:17.338 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/apache/8/apache-8.pom (14 KB at 
490.2 KB/sec)
2017-06-08\T\01:02:17.351 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
2017-06-08\T\01:02:17.380 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.3.0/flink-test-utils_2.10-1.3.0.pom
 (5 KB at 159.7 KB/sec)
2017-06-08\T\01:02:17.382 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
2017-06-08\T\01:02:17.413 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-parent/1.3.0/flink-test-utils-parent-1.3.0.pom
 (2 KB at 45.4 KB/sec)
2017-06-08\T\01:02:17.417 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3.0/flink-test-utils-junit-1.3.0.pom
2017-06-08\T\01:02:17.447 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.3

[GitHub] beam pull request #3322: Refine Python DirectRunner watermark advancement be...

2017-06-07 Thread charlesccychen
GitHub user charlesccychen opened a pull request:

https://github.com/apache/beam/pull/3322

Refine Python DirectRunner watermark advancement behavior

This change helps prepare for streaming pipeline execution.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/charlesccychen/beam streaming-watermarks

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3322.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3322


commit 591612d4271d03bd5ac8250cc8ac018b24cdd1a1
Author: Charles Chen 
Date:   2017-06-08T00:46:36Z

Refine Python DirectRunner watermark advancement behavior

This change helps prepare for streaming pipeline execution.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #57

2017-06-07 Thread Apache Jenkins Server
See 


--
[...truncated 1.44 MB...]
  [javadoc]   location: class BeamFnDataReadRunner
  [javadoc]   where OutputT is a type-variable:
  [javadoc] OutputT extends Object declared in class BeamFnDataReadRunner
  [javadoc] 
:62:
 error: cannot find symbol
  [javadoc]   private CompletableFuture readFuture;
  [javadoc]   ^
  [javadoc]   symbol:   class CompletableFuture
  [javadoc]   location: class BeamFnDataReadRunner
  [javadoc]   where OutputT is a type-variable:
  [javadoc] OutputT extends Object declared in class BeamFnDataReadRunner
  [javadoc] 
:66:
 error: cannot find symbol
  [javadoc]   Supplier processBundleInstructionIdSupplier,
  [javadoc]   ^
  [javadoc]   symbol:   class Supplier
  [javadoc]   location: class BeamFnDataReadRunner
  [javadoc]   where OutputT is a type-variable:
  [javadoc] OutputT extends Object declared in class BeamFnDataReadRunner
  [javadoc] 
:51:
 warning - Tag @link: reference not found: DoFn.FinishBundle @FinishBundle
  [javadoc] 
:45:
 warning - Tag @link: reference not found: DoFn.OnTimer @OnTimer
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessElement @ProcessElement
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessContext
  [javadoc] 
:33:
 warning - Tag @link: reference not found: DoFn.StartBundle @StartBundle
  [javadoc] 
:33:
 warning - Tag @link: reference not found: DoFn.StartBundle @StartBundle
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessElement @ProcessElement
  [javadoc] 
:39:
 warning - Tag @link: reference not found: DoFn.ProcessContext
  [javadoc] 
:45:
 warning - Tag @link: reference not found: DoFn.OnTimer @OnTimer
  [javadoc] 
:51:
 warning - Tag @link: reference not found: DoFn.FinishBundle @FinishBundle
  [javadoc] 
:34:
 warning - Tag @link: reference not found: Source.Reader
  [javadoc] 
:292:
 warning - Tag @link: reference not found: UnboundedSource.CheckpointMark
  [javadoc] 


Jenkins build is back to normal : beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 8 (on Ubuntu only),beam #57

2017-06-07 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #106

2017-06-07 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when

[iemejia] Shutdown Flink Streaming Pipeline when reaching +Inf watermark

--
[...truncated 2.55 MB...]
2017-06-08T00:32:48.306 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-slf4j/1.8/gossip-slf4j-1.8.pom
2017-06-08T00:32:48.315 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-slf4j/1.8/gossip-slf4j-1.8.pom
 (2 KB at 160.8 KB/sec)
2017-06-08T00:32:48.319 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
2017-06-08T00:32:48.328 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
 (12 KB at 1250.7 KB/sec)
2017-06-08T00:32:48.332 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
2017-06-08T00:32:48.342 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
 (13 KB at 1283.4 KB/sec)
2017-06-08T00:32:48.346 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
2017-06-08T00:32:48.354 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
 (3 KB at 271.6 KB/sec)
2017-06-08T00:32:48.358 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
2017-06-08T00:32:48.366 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
 (2 KB at 196.7 KB/sec)
2017-06-08T00:32:48.371 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
2017-06-08T00:32:48.381 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
 (6 KB at 525.0 KB/sec)
2017-06-08T00:32:48.384 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
2017-06-08T00:32:48.393 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
 (3 KB at 277.2 KB/sec)
2017-06-08T00:32:48.397 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
2017-06-08T00:32:48.404 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
 (4 KB at 489.5 KB/sec)
2017-06-08T00:32:48.409 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
2017-06-08T00:32:48.417 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
 (2 KB at 125.4 KB/sec)
2017-06-08T00:32:48.421 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
2017-06-08T00:32:48.433 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
 (2 KB at 141.4 KB/sec)
2017-06-08T00:32:48.438 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
2017-06-08T00:32:48.445 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
 (2 KB at 252.0 KB/sec)
2017-06-08T00:32:48.450 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
2017-06-08T00:32:48.458 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
 (3 KB at 331.8 KB/sec)
2017-06-08T00:32:48.463 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
2017-06-08T00:32:48.474 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
 (18 KB at 1601.5 KB/sec)
2017-06-08T00:32:48.479 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom
2017-06-08T00:32:48.490 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom (10 
KB at 857.1 KB/sec)
2017-06-08T00:32:48.494 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
2017-06-08T00:32:48.502 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
 (5 KB at 556.8 KB/sec)
2017-06-08T00:32:48.507 [INFO] Downloading: 
https://repo.m

Jenkins build is back to normal : beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #57

2017-06-07 Thread Apache Jenkins Server
See 




[GitHub] beam pull request #3321: Remove support for NativeSinks from the Python Dire...

2017-06-07 Thread charlesccychen
GitHub user charlesccychen opened a pull request:

https://github.com/apache/beam/pull/3321

Remove support for NativeSinks from the Python DirectRunner



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/charlesccychen/beam remove-nativewrite-dr

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3321.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3321


commit f2e3088633fef10f19bfd11ff9b508930916a740
Author: Charles Chen 
Date:   2017-06-08T00:00:57Z

Remove support for NativeSinks from the Python DirectRunner






[GitHub] beam pull request #3320: Fix Flink compile error occurs in some JDK versions

2017-06-07 Thread markflyhigh
GitHub user markflyhigh opened a pull request:

https://github.com/apache/beam/pull/3320

Fix Flink compile error occurs in some JDK versions

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

A compile error occurred in `runners/flink` when using JDK 1.7, OpenJDK 7, and 
OpenJDK 8. Jenkins build 
[here](https://builds.apache.org/job/beam_PostCommit_Java_JDK_Versions_Test/56/).

I will run PostCommit_Java_JDK_Versions_Test to verify the fix.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markflyhigh/incubator-beam fix-compile-error

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3320.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3320


commit 946de711a2a2f88b14f5943c550393ea6670a3f7
Author: Mark Liu 
Date:   2017-06-07T23:27:34Z

Fix compile error occurs in some JDKs






[GitHub] beam pull request #3319: [BEAM-1542] SpannerIO: Introduced a MutationGroup.

2017-06-07 Thread mairbek
GitHub user mairbek opened a pull request:

https://github.com/apache/beam/pull/3319

[BEAM-1542] SpannerIO: Introduced a MutationGroup.

Allows grouping mutations into a logical bundle that is submitted in 
the same transaction.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mairbek/beam mutation-group

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3319.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3319


commit c6b6c181664eb3ddc4ed29e56b380c46551885ae
Author: Mairbek Khadikov 
Date:   2017-06-07T23:27:01Z

SpannerIO: Introduced a MutationGroup.

Allows grouping mutations into a logical bundle that is submitted in 
the same transaction.




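
The grouping idea described in this PR can be sketched as a small data structure. This is a minimal illustration only, with hypothetical names; the actual SpannerIO Java API may differ:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass(frozen=True)
class Mutation:
    # Simplified stand-in for a Cloud Spanner mutation: one row write.
    table: str
    columns: Tuple[str, ...]
    values: Tuple[object, ...]

class MutationGroup:
    """Groups mutations that must be submitted in the same transaction.

    The first mutation is the primary one; the rest ride along with it,
    so related rows are never split across transactions.
    """
    def __init__(self, primary: Mutation, *attached: Mutation):
        self.mutations: List[Mutation] = [primary, *attached]

    def __iter__(self):
        return iter(self.mutations)

# One logical bundle: a parent row plus a child row, written atomically.
group = MutationGroup(
    Mutation("albums", ("id", "title"), (1, "Go")),
    Mutation("tracks", ("album_id", "track"), (1, "Intro")),
)
print(len(list(group)))  # 2
```

A sink that receives such a group would flush all of its mutations in one transaction rather than batching them independently.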


[jira] [Commented] (BEAM-1542) Need Source/Sink for Spanner

2017-06-07 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16041901#comment-16041901
 ] 

ASF GitHub Bot commented on BEAM-1542:
--

GitHub user mairbek opened a pull request:

https://github.com/apache/beam/pull/3319

[BEAM-1542] SpannerIO: Introduced a MutationGroup.

Allows grouping mutations into a logical bundle that is submitted in 
the same transaction.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mairbek/beam mutation-group

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3319.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3319


commit c6b6c181664eb3ddc4ed29e56b380c46551885ae
Author: Mairbek Khadikov 
Date:   2017-06-07T23:27:01Z

SpannerIO: Introduced a MutationGroup.

Allows grouping mutations into a logical bundle that is submitted in 
the same transaction.




> Need Source/Sink for Spanner
> 
>
> Key: BEAM-1542
> URL: https://issues.apache.org/jira/browse/BEAM-1542
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-gcp
>Reporter: Guy Molinari
>Assignee: Mairbek Khadikov
>
> Is there a source/sink for Spanner in the works?   If not I would gladly give 
> this a shot.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3318: Choose GroupAlsoByWindows implementation based on s...

2017-06-07 Thread charlesccychen
GitHub user charlesccychen opened a pull request:

https://github.com/apache/beam/pull/3318

Choose GroupAlsoByWindows implementation based on streaming flag

This change depends on https://github.com/apache/beam/pull/3315.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/charlesccychen/beam streaming-gabw-refactor

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3318.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3318


commit 113a7d9591a91d6066f2e4f2b11095d755e55193
Author: Charles Chen 
Date:   2017-06-07T23:08:43Z

Move Runner API protos to portability/runners/api

This fixes a circular import issue between transforms/ and runners/

commit 1ce756980b7c98cd2146b9261a4d6ce374e5e913
Author: Charles Chen 
Date:   2017-06-07T23:09:10Z

Choose GroupAlsoByWindows implementation based on streaming flag




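
Choosing a GroupAlsoByWindows implementation from a streaming flag can be sketched as a simple factory. This is an illustrative simulation, not the DirectRunner's actual code; the function names and the `(key, window, value)` element shape are assumptions:

```python
def group_also_by_windows_batch(elements):
    # Batch path: all input is available up front, so group in one pass.
    grouped = {}
    for key, window, value in elements:
        grouped.setdefault((key, window), []).append(value)
    return grouped

def group_also_by_windows_streaming(elements):
    # Streaming stand-in: same result here, but a real implementation
    # would buffer per-window state and fire on watermark triggers.
    grouped = {}
    for key, window, value in elements:
        grouped.setdefault((key, window), []).append(value)
    return grouped

def make_gabw(streaming: bool):
    # Pick the implementation once, based on the pipeline's streaming flag.
    return group_also_by_windows_streaming if streaming else group_also_by_windows_batch

elems = [("k", 0, 1), ("k", 0, 2), ("k", 1, 3)]
result = make_gabw(streaming=False)(elems)
print(result[("k", 0)])  # [1, 2]
```

The point of routing through a factory is that the rest of the pipeline construction stays identical; only the grouping strategy changes with the flag.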


[GitHub] beam pull request #3317: python pubsub io fixes for streaming

2017-06-07 Thread vikkyrk
GitHub user vikkyrk opened a pull request:

https://github.com/apache/beam/pull/3317

python pubsub io fixes for streaming

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

- include coder info in pubsub source and sink
- use `GlobalWindowCoder` as a placeholder instead of the default `PickleCoder`
- fix pubsub source to accept either a topic or a subscription.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam pubsub

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3317.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3317


commit 58285834ca29ad9be8189aa526fac748cea7c079
Author: Vikas Kedigehalli 
Date:   2017-06-07T23:28:18Z

Add coder info to pubsub io




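
The "either a topic or a subscription" fix amounts to an exactly-one-of validation at construction time. A minimal sketch, with hypothetical class and parameter names rather than the real SDK API:

```python
class PubSubSource:
    """Sketch of a source that reads from either a topic or a subscription.

    Exactly one of the two must be given; carrying the coder alongside
    lets the runner decode payloads consistently.
    """
    def __init__(self, topic=None, subscription=None, coder=None):
        # True when both are set or both are missing -> invalid either way.
        if (topic is None) == (subscription is None):
            raise ValueError("Provide exactly one of topic or subscription.")
        self.topic = topic
        self.subscription = subscription
        self.coder = coder

src = PubSubSource(subscription="projects/p/subscriptions/s")
print(src.subscription)  # projects/p/subscriptions/s

try:
    PubSubSource(topic="t", subscription="s")
except ValueError:
    print("rejected")  # both set: rejected
```

Collapsing the two invalid cases (both set, neither set) into one equality check keeps the constructor's contract easy to test.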


[GitHub] beam pull request #3316: Make BytesCoder to be a known type

2017-06-07 Thread vikkyrk
GitHub user vikkyrk opened a pull request:

https://github.com/apache/beam/pull/3316

Make BytesCoder to be a known type

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam streaming

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3316.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3316


commit ec2fb60df664092f7907fc308c0b8b6caabcb8a0
Author: Vikas Kedigehalli 
Date:   2017-06-07T23:26:21Z

Make BytesCoder to be a known type






[GitHub] beam pull request #3315: Move Runner API protos to portability/runners/api

2017-06-07 Thread charlesccychen
GitHub user charlesccychen opened a pull request:

https://github.com/apache/beam/pull/3315

Move Runner API protos to portability/runners/api

This fixes a circular import issue between transforms/ and runners/

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/charlesccychen/beam fix-circular-api

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3315.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3315


commit 113a7d9591a91d6066f2e4f2b11095d755e55193
Author: Charles Chen 
Date:   2017-06-07T23:08:43Z

Move Runner API protos to portability/runners/api

This fixes a circular import issue between transforms/ and runners/






[jira] [Commented] (BEAM-2421) Migrate Dataflow to use impulse primitive as the only root primitive

2017-06-07 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16041677#comment-16041677
 ] 

Luke Cwik commented on BEAM-2421:
-

This is still in the prototyping phase, so there is no doc or discussion beyond what 
is occurring here on this JIRA.

The byte array is one of the well-known types across all SDKs. It's also very 
useful to make it such that all IOs output/consume byte arrays, and any 
formatting/conversion functions happen within a ParDo/SplittableParDo, allowing 
a language to interpret that data in any way it chooses without needing a 
well-known coder for the type that is shared across languages.

> Migrate Dataflow to use impulse primitive as the only root primitive
> 
>
> Key: BEAM-2421
> URL: https://issues.apache.org/jira/browse/BEAM-2421
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-fn-api, beam-model-runner-api, runner-dataflow
>Reporter: Luke Cwik
>
> The impulse source emits a single byte array element within the global window.
> This is in preparation for using SDF as the replacement for different bounded 
> and unbounded source implementations.
> Before:
> Read(Source) -> ParDo -> ...
> Impulse -> SDF(Source) -> ParDo -> ...



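
The `Impulse -> SDF(Source) -> ParDo` shape from this issue can be simulated in plain Python: the impulse contributes a single empty byte-array element, and a source-like function expands it into records. The helper names and the sample `data` are illustrative only, not Beam APIs:

```python
def impulse():
    # The impulse primitive: a single empty byte-array element in the
    # global window. Everything downstream derives from this one element.
    return [b""]

def sdf_read(element, data=("a", "b", "c")):
    # Stand-in for SDF(Source): expand the impulse element into the
    # records a source would have produced. `data` is illustrative only.
    for record in data:
        yield record

def pardo(records, fn):
    # Stand-in for the downstream ParDo: apply fn element-wise.
    for r in records:
        yield fn(r)

# Impulse -> SDF(Source) -> ParDo, replacing Read(Source) -> ParDo.
output = list(pardo((r for e in impulse() for r in sdf_read(e)), str.upper))
print(output)  # ['A', 'B', 'C']
```

This also shows why a byte array suffices as the root element: the impulse carries no information itself; all source-specific work happens in the downstream transform.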


[jira] [Commented] (BEAM-2421) Migrate Dataflow to use impulse primitive as the only root primitive

2017-06-07 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16041641#comment-16041641
 ] 

Kenneth Knowles commented on BEAM-2421:
---

Is this in the SDF doc, or does it have its own (tiny) doc & discussion? I tried 
to find where we came to consensus on impulse+SDF.

One obvious question is: why a byte array?

> Migrate Dataflow to use impulse primitive as the only root primitive
> 
>
> Key: BEAM-2421
> URL: https://issues.apache.org/jira/browse/BEAM-2421
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-fn-api, beam-model-runner-api, runner-dataflow
>Reporter: Luke Cwik
>
> The impulse source emits a single byte array element within the global window.
> This is in preparation for using SDF as the replacement for different bounded 
> and unbounded source implementations.
> Before:
> Read(Source) -> ParDo -> ...
> Impulse -> SDF(Source) -> ParDo -> ...





[GitHub] beam pull request #3294: Shutdown Flink Streaming Pipeline when reaching +In...

2017-06-07 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3294




[2/2] beam git commit: This closes #3294

2017-06-07 Thread iemejia
This closes #3294


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/32f22b7d
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/32f22b7d
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/32f22b7d

Branch: refs/heads/master
Commit: 32f22b7d9cfd5dcd22555272c5a7365fd1323e5f
Parents: caecac3 9c83ffe
Author: Ismaël Mejía 
Authored: Wed Jun 7 23:14:02 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed Jun 7 23:14:02 2017 +0200

--
 .../wrappers/streaming/io/UnboundedSourceWrapper.java   | 5 +
 1 file changed, 5 insertions(+)
--




[1/2] beam git commit: Shutdown Flink Streaming Pipeline when reaching +Inf watermark

2017-06-07 Thread iemejia
Repository: beam
Updated Branches:
  refs/heads/master caecac3b4 -> 32f22b7d9


Shutdown Flink Streaming Pipeline when reaching +Inf watermark


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/9c83ffe0
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/9c83ffe0
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/9c83ffe0

Branch: refs/heads/master
Commit: 9c83ffe0cdc6636d2187bf9439a73a3b45756d50
Parents: caecac3
Author: Aljoscha Krettek 
Authored: Mon Jun 5 12:19:00 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed Jun 7 23:13:52 2017 +0200

--
 .../wrappers/streaming/io/UnboundedSourceWrapper.java   | 5 +
 1 file changed, 5 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/9c83ffe0/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
index 6055a43..e75072a 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java
@@ -31,6 +31,7 @@ import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.SerializableCoder;
 import org.apache.beam.sdk.io.UnboundedSource;
 import org.apache.beam.sdk.options.PipelineOptions;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
 import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
 import org.apache.beam.sdk.transforms.windowing.PaneInfo;
 import org.apache.beam.sdk.util.WindowedValue;
@@ -436,6 +437,10 @@ public class UnboundedSourceWrapper<
   }
 }
 context.emitWatermark(new Watermark(watermarkMillis));
+
+if (watermarkMillis >= BoundedWindow.TIMESTAMP_MAX_VALUE.getMillis()) {
+  this.isRunning = false;
+}
   }
   setNextWatermarkTimer(this.runtimeContext);
 }
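
The shutdown condition in the diff above can be modeled as a simple loop: emit watermarks until the final (+Inf) watermark is at or past the maximum timestamp, then flip the running flag. The constant below is a stand-in for `BoundedWindow.TIMESTAMP_MAX_VALUE.getMillis()`, not its actual value:

```python
TIMESTAMP_MAX_MILLIS = 10**15  # stand-in for BoundedWindow.TIMESTAMP_MAX_VALUE millis

def run_source(watermarks):
    """Emit watermarks until the +Inf watermark is reached, then stop.

    Mirrors the patch: the final watermark is still emitted, and only
    afterwards does the source mark itself as no longer running so the
    streaming pipeline can shut down.
    """
    emitted = []
    is_running = True
    it = iter(watermarks)
    while is_running:
        wm = next(it)
        emitted.append(wm)  # emitWatermark(...)
        if wm >= TIMESTAMP_MAX_MILLIS:
            is_running = False  # terminate once the max watermark is seen
    return emitted

out = run_source([100, 200, TIMESTAMP_MAX_MILLIS])
print(len(out))  # 3: the final watermark is emitted before shutdown
```

The ordering matters: emitting before checking guarantees downstream operators observe the +Inf watermark and can fire any remaining timers before the source stops.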



[jira] [Closed] (BEAM-2394) Postcommit_Java_JDK_Version_Test is broken since SpannerWriteIT failed

2017-06-07 Thread Mark Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Liu closed BEAM-2394.
--

> Postcommit_Java_JDK_Version_Test is broken since SpannerWriteIT failed
> --
>
> Key: BEAM-2394
> URL: https://issues.apache.org/jira/browse/BEAM-2394
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp, testing
>Reporter: Mark Liu
>Assignee: Mairbek Khadikov
> Fix For: Not applicable
>
>
> SpannerWriteIT.testWrite failed in Postcommit_Java_JDK_Version_Test because 
> the test database wasn't set up successfully.
> Error logs:
> {code}
> org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT
> 2017-05-31\T\12:21:30.032 [ERROR] 
> testWrite(org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT)  Time elapsed: 
> 283.011 s  <<< ERROR!
> java.lang.RuntimeException: 
> (b2cfd106d806288f): com.google.cloud.spanner.SpannerException: NOT_FOUND: 
> io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
> projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
> resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
> resource_name: 
> "projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
> description: "Database does not exist."
>   at 
> com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
>   at 
> com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:71)
>   at 
> com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:58)
>   at 
> com.google.cloud.spanner.SessionPool$Waiter.take(SessionPool.java:376)
>   at 
> com.google.cloud.spanner.SessionPool$Waiter.access$2800(SessionPool.java:362)
>   at 
> com.google.cloud.spanner.SessionPool.getReadSession(SessionPool.java:697)
>   at 
> com.google.cloud.spanner.DatabaseClientImpl.writeAtLeastOnce(DatabaseClientImpl.java:37)
>   at 
> org.apache.beam.sdk.io.gcp.spanner.SpannerIO$SpannerWriteFn.flushBatch(SpannerIO.java:322)
>   at 
> org.apache.beam.sdk.io.gcp.spanner.SpannerIO$SpannerWriteFn.finishBundle(SpannerIO.java:281)
> ...
> {code}
> Jenkins link:
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Java_JDK_Versions_Test/26/jdk=OpenJDK%207%20(on%20Ubuntu%20only),label=beam/
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Java_JDK_Versions_Test/26/jdk=OpenJDK%208%20(on%20Ubuntu%20only),label=beam/
> Note: the root directory of JDK version test contains space, which is the 
> main difference with Postcommit_Java_MavenInstall. It can be like: 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_JDK_Versions_Test/jdk/OpenJDK
>  7 (on Ubuntu only)/label/beam/..."
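The note about spaces in the JDK-version workspace is a classic shell-quoting pitfall: a path like `.../jdk/OpenJDK 7 (on Ubuntu only)/label/beam` splits into several words (and the parentheses can trip the shell) unless every use is quoted. A minimal sketch under an assumed `/tmp` path chosen only to mirror the Jenkins layout:

```shell
# Hypothetical workspace path modeled on the Jenkins layout quoted above;
# it contains spaces and parentheses, so every expansion must be quoted.
WORKSPACE="/tmp/jdk/OpenJDK 7 (on Ubuntu only)/label/beam"
mkdir -p "$WORKSPACE"   # quoted: creates one directory tree, not several
cd "$WORKSPACE"
pwd                     # the full path survives, spaces intact
```

Unquoted, `mkdir -p $WORKSPACE` would instead try to create `/tmp/jdk/OpenJDK`, `7`, and so on as separate directories, which is exactly the kind of breakage a build step can hit when the workspace root contains spaces.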



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Resolved] (BEAM-2394) Postcommit_Java_JDK_Version_Test is broken since SpannerWriteIT failed

2017-06-07 Thread Mark Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Liu resolved BEAM-2394.

   Resolution: Fixed
Fix Version/s: Not applicable



[jira] [Commented] (BEAM-2421) Migrate Dataflow to use impulse primitive as the only root primitive

2017-06-07 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16041588#comment-16041588
 ] 

ASF GitHub Bot commented on BEAM-2421:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3312


> Migrate Dataflow to use impulse primitive as the only root primitive
> 
>
> Key: BEAM-2421
> URL: https://issues.apache.org/jira/browse/BEAM-2421
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-fn-api, beam-model-runner-api, runner-dataflow
>Reporter: Luke Cwik
>
> The impulse source emits a single byte array element within the global window.
> This is in preparation for using SDF as the replacement for different bounded 
> and unbounded source implementations.
> Before:
> Read(Source) -> ParDo -> ...
> After:
> Impulse -> SDF(Source) -> ParDo -> ...
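The pipeline shapes above can be sketched without any Beam dependency: the impulse is a single empty byte[] element, and a DoFn standing in for "SDF(Source)" expands that one element into the source's records. This is a minimal illustrative model, not the Beam API; the class, method names, and fixed record list are all assumptions made for the sketch:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class ImpulseSketch {
    // Impulse: exactly one empty byte[] element (conceptually in the global window).
    public static List<byte[]> impulse() {
        return Collections.singletonList(new byte[0]);
    }

    // Stand-in for "SDF(Source)": expands the impulse element into the
    // source's records. The fixed list is purely illustrative.
    public static List<String> readSource(byte[] impulseElement) {
        return Arrays.asList("record-1", "record-2", "record-3");
    }

    public static void main(String[] args) {
        List<String> out = new ArrayList<>();
        for (byte[] e : impulse()) {
            out.addAll(readSource(e)); // Impulse -> SDF(Source)
        }
        System.out.println(out); // prints [record-1, record-2, record-3]
    }
}
```

The design point is that once every root transform is an Impulse, the runner only needs one root primitive, and all source-specific reading logic moves into (splittable) DoFns downstream of it.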





[GitHub] beam pull request #3312: [BEAM-2421] Swap to use an Impulse primitive + DoFn...

2017-06-07 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3312




[3/3] beam git commit: [BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when executing with the Fn API.

2017-06-07 Thread lcwik
[BEAM-2421] Swap to use an Impulse primitive + DoFn for Create when executing 
with the Fn API.

This closes #3312


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/caecac3b
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/caecac3b
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/caecac3b

Branch: refs/heads/master
Commit: caecac3b4acb5bfa6e36143d3868b2d80ab119da
Parents: 609016d 1cdb80c
Author: Luke Cwik 
Authored: Wed Jun 7 13:43:38 2017 -0700
Committer: Luke Cwik 
Committed: Wed Jun 7 13:43:38 2017 -0700

--
 .../beam/runners/dataflow/DataflowRunner.java   | 154 ++-
 1 file changed, 146 insertions(+), 8 deletions(-)
--



