[jira] [Commented] (BEAM-2390) allow user to use .setTimePartitioning in BigQueryIO.write

2017-05-31 Thread Eric Johston (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2390?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16032409#comment-16032409
 ] 

Eric Johston commented on BEAM-2390:


My initial commit had a couple of errors:

a) TimePartitioning is not serializable, and
b) CreateTables#possibleCreate fails when the TableDestination includes a 
partition decorator ($).

I've addressed these by storing the TimePartitioning as JSON, similar to how 
the schemas are propagated. I've also modified table creation so that Beam 
only looks at the part of the table reference before the $. It seems to be 
working now (I'm running this from my own fork in production).
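
For illustration, a minimal standalone sketch of the two ideas (not the actual 
patch): carry the TimePartitioning as a JSON string so the transform stays 
serializable, and strip the partition decorator before table creation. Using 
the google-api-client JacksonFactory for the JSON round trip is an assumption; 
class and method names are illustrative.

{code}
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.bigquery.model.TimePartitioning;
import java.io.IOException;

public class TimePartitioningJsonSketch {

  private static final JsonFactory JSON_FACTORY = JacksonFactory.getDefaultInstance();

  // Carry the (non-serializable) TimePartitioning around as a JSON string.
  static String toJson(TimePartitioning partitioning) throws IOException {
    return JSON_FACTORY.toString(partitioning);
  }

  // Rebuild it on the worker from its JSON form.
  static TimePartitioning fromJson(String json) throws IOException {
    return JSON_FACTORY.fromString(json, TimePartitioning.class);
  }

  // Drop a partition decorator ("events$20170531" -> "events") before
  // tables.insert, since tables are created without the decorator.
  static String stripPartitionDecorator(String tableId) {
    int dollar = tableId.indexOf('$');
    return dollar < 0 ? tableId : tableId.substring(0, dollar);
  }

  public static void main(String[] args) throws IOException {
    String json = toJson(new TimePartitioning().setType("DAY"));
    TimePartitioning restored = fromJson(json);
    System.out.println(json + " -> " + restored.getType());         // {"type":"DAY"} -> DAY
    System.out.println(stripPartitionDecorator("events$20170531")); // events
  }
}
{code}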

> allow user to use .setTimePartitioning in BigQueryIO.write
> --
>
> Key: BEAM-2390
> URL: https://issues.apache.org/jira/browse/BEAM-2390
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-runner-api
>Affects Versions: 2.0.0
>Reporter: Eric Johston
>Assignee: Kenneth Knowles
>  Labels: easyfix, features, newbie
> Fix For: 2.0.0
>
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> Currently, when writing to a table with the BigQueryIO sink, there is no way 
> to create a new table that is date-partitioned. This would be very useful, 
> since currently the only way to do this is by manually creating the table 
> ahead of time. We should be able to leverage the automatic table creation 
> functionality for date-partitioned tables.
> The best way to do this would be to have a withTimePartitioning method in the 
> BigQueryIO class.
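
As a hedged sketch of how the proposed method might look from a pipeline: only 
the method name withTimePartitioning comes from this issue, everything else 
follows the existing BigQueryIO.Write API, and the project/dataset/table names 
are placeholders. It would not compile until such a method exists.

{code}
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.api.services.bigquery.model.TimePartitioning;
import java.util.Collections;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class PartitionedTableWriteSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    TableSchema schema = new TableSchema().setFields(Collections.singletonList(
        new TableFieldSchema().setName("user").setType("STRING")));

    p.apply(Create.of(new TableRow().set("user", "alice"))
            .withCoder(TableRowJsonCoder.of()))
     .apply(BigQueryIO.writeTableRows()
         .to("my-project:my_dataset.events")
         .withSchema(schema)
         // Proposed addition from this issue: have Beam create the table
         // as date-partitioned instead of requiring it to exist up front.
         .withTimePartitioning(new TimePartitioning().setType("DAY"))
         .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED));

    p.run();
  }
}
{code}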



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-2371) Make Java DirectRunner demonstrate language-agnostic Runner API translation wrappers

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16032406#comment-16032406
 ] 

ASF GitHub Bot commented on BEAM-2371:
--

GitHub user kennknowles opened a pull request:

https://github.com/apache/beam/pull/3274

[BEAM-2371] Write files translation

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

The Java DirectRunner needs `WriteFiles` to be well-defined with a payload 
in order to detect (and provide) runner-specified sharding.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/kennknowles/beam WriteFiles-translation

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3274.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3274


commit 2ee7d69ea6f130186b38ae21abc2ab06aee2c5a7
Author: Kenneth Knowles 
Date:   2017-05-26T23:06:23Z

Add WriteFilesPayload to Runner API well-known transforms

commit 0093cf5e53cbe6f3e6d61f91ea44d0d1b0f0f901
Author: Kenneth Knowles 
Date:   2017-05-26T23:06:57Z

Make WriteFiles config publicly readable

commit c4f28a539ef7acba8cd561a13f3dc32c507cc124
Author: Kenneth Knowles 
Date:   2017-05-30T21:43:25Z

Add WriteFiles translation




> Make Java DirectRunner demonstrate language-agnostic Runner API translation 
> wrappers
> 
>
> Key: BEAM-2371
> URL: https://issues.apache.org/jira/browse/BEAM-2371
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-direct
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>
> This will complete the PoC for runners-core-construction-java and the Runner 
> API and show other runners the easy path to executing non-Java pipelines, 
> modulo Fn API.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3274: [BEAM-2371] Write files translation

2017-05-31 Thread kennknowles
GitHub user kennknowles opened a pull request:

https://github.com/apache/beam/pull/3274

[BEAM-2371] Write files translation

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

The Java DirectRunner needs `WriteFiles` to be well-defined with a payload 
in order to detect (and provide) runner-specified sharding.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/kennknowles/beam WriteFiles-translation

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3274.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3274


commit 2ee7d69ea6f130186b38ae21abc2ab06aee2c5a7
Author: Kenneth Knowles 
Date:   2017-05-26T23:06:23Z

Add WriteFilesPayload to Runner API well-known transforms

commit 0093cf5e53cbe6f3e6d61f91ea44d0d1b0f0f901
Author: Kenneth Knowles 
Date:   2017-05-26T23:06:57Z

Make WriteFiles config publicly readable

commit c4f28a539ef7acba8cd561a13f3dc32c507cc124
Author: Kenneth Knowles 
Date:   2017-05-30T21:43:25Z

Add WriteFiles translation




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #3977

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1585) Ability to add new file systems to beamFS in the python sdk

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16032281#comment-16032281
 ] 

ASF GitHub Bot commented on BEAM-1585:
--

GitHub user sb2nov opened a pull request:

https://github.com/apache/beam/pull/3273

[BEAM-1585] Add ability to attach a bootstrap script to the pipeline

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

R: @aaltay PTAL

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sb2nov/beam BEAM-bootstrap

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3273.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3273


commit 2e0fc969a7f9bc89972f484bc4878e673777eea5
Author: Sourabh Bajaj 
Date:   2017-06-01T00:54:17Z

[BEAM-1585] Add ability to attach a bootstrap script to the pipeline




> Ability to add new file systems to beamFS in the python sdk
> ---
>
> Key: BEAM-1585
> URL: https://issues.apache.org/jira/browse/BEAM-1585
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-py
>Reporter: Sourabh Bajaj
>Assignee: Sourabh Bajaj
> Fix For: Not applicable
>
>
> BEAM-1441 implements the new BeamFileSystem in the Python SDK, but it 
> currently lacks the ability to add user-implemented file systems.
> A user-implemented file system needs to run in the worker, so it should be 
> packaged correctly with the pipeline code. 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3273: [BEAM-1585] Add ability to attach a bootstrap scrip...

2017-05-31 Thread sb2nov
GitHub user sb2nov opened a pull request:

https://github.com/apache/beam/pull/3273

[BEAM-1585] Add ability to attach a bootstrap script to the pipeline

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---

R: @aaltay PTAL

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sb2nov/beam BEAM-bootstrap

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3273.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3273


commit 2e0fc969a7f9bc89972f484bc4878e673777eea5
Author: Sourabh Bajaj 
Date:   2017-06-01T00:54:17Z

[BEAM-1585] Add ability to attach a bootstrap script to the pipeline




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build is back to normal : beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 8 (on Ubuntu only),beam #28

2017-05-31 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #28

2017-05-31 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #78

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] Flink*DoFnFunction: fix check for single-output dofns

[dhalperi] Add RawUnion code to FlinkDoFnFunction

[jbonofre] [BEAM-2246] Use CLIENT_ACK instead of AUTO_ACK in JmsIO

--
[...truncated 2.53 MB...]
2017-06-01T00:32:52.397 [INFO] 
2017-06-01T00:32:52.397 [INFO] --- groovy-maven-plugin:2.0:execute 
(find-supported-python-for-clean) @ beam-sdks-python ---
2017-06-01T00:32:52.403 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-slf4j/1.8/gossip-slf4j-1.8.pom
2017-06-01T00:32:52.415 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-slf4j/1.8/gossip-slf4j-1.8.pom
 (2 KB at 120.6 KB/sec)
2017-06-01T00:32:52.418 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
2017-06-01T00:32:52.431 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip/1.8/gossip-1.8.pom
 (12 KB at 865.8 KB/sec)
2017-06-01T00:32:52.436 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
2017-06-01T00:32:52.446 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/forge/forge-parent/9/forge-parent-9.pom
 (13 KB at 1283.4 KB/sec)
2017-06-01T00:32:52.452 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
2017-06-01T00:32:52.464 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-core/1.8/gossip-core-1.8.pom
 (3 KB at 181.1 KB/sec)
2017-06-01T00:32:52.468 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
2017-06-01T00:32:52.477 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/sonatype/gossip/gossip-bootstrap/1.8/gossip-bootstrap-1.8.pom
 (2 KB at 174.8 KB/sec)
2017-06-01T00:32:52.481 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
2017-06-01T00:32:52.491 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava/14.0.1/guava-14.0.1.pom
 (6 KB at 525.0 KB/sec)
2017-06-01T00:32:52.495 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
2017-06-01T00:32:52.506 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
 (3 KB at 226.8 KB/sec)
2017-06-01T00:32:52.510 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
2017-06-01T00:32:52.522 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-classworlds/2.4.2/plexus-classworlds-2.4.2.pom
 (4 KB at 285.6 KB/sec)
2017-06-01T00:32:52.526 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
2017-06-01T00:32:52.536 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interpolation/1.16/plexus-interpolation-1.16.pom
 (2 KB at 100.3 KB/sec)
2017-06-01T00:32:52.540 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
2017-06-01T00:32:52.550 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/eclipse/aether/aether-api/0.9.0.M2/aether-api-0.9.0.M2.pom
 (2 KB at 169.6 KB/sec)
2017-06-01T00:32:52.557 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
2017-06-01T00:32:52.566 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-api/2.0/gmaven-adapter-api-2.0.pom
 (2 KB at 196.0 KB/sec)
2017-06-01T00:32:52.570 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
2017-06-01T00:32:52.582 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/gmaven/gmaven-adapter-impl/2.0/gmaven-adapter-impl-2.0.pom
 (3 KB at 221.2 KB/sec)
2017-06-01T00:32:52.586 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
2017-06-01T00:32:52.598 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/groovy/groovy-all/2.1.5/groovy-all-2.1.5.pom
 (18 KB at 1468.0 KB/sec)
2017-06-01T00:32:52.603 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom
2017-06-01T00:32:52.615 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant/1.8.4/ant-1.8.4.pom (10 
KB at 785.6 KB/sec)
2017-06-01T00:32:52.620 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/ant/ant-parent/1.8.4/ant-parent-1.8.4.pom
2017-06-01T00:32:52.632 

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #28

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] Flink*DoFnFunction: fix check for single-output dofns

[dhalperi] Add RawUnion code to FlinkDoFnFunction

[jbonofre] [BEAM-2246] Use CLIENT_ACK instead of AUTO_ACK in JmsIO

--
[...truncated 870.66 KB...]
2017-06-01\T\00:11:46.080 [INFO] Excluding 
com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 from the shaded 
jar.
2017-06-01\T\00:11:46.080 [INFO] Including com.google.guava:guava:jar:20.0 in 
the shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding org.apache.avro:avro:jar:1.8.2 from 
the shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding 
org.codehaus.jackson:jackson-core-asl:jar:1.9.13 from the shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding 
org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13 from the shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding 
com.thoughtworks.paranamer:paranamer:jar:2.7 from the shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding org.tukaani:xz:jar:1.5 from the 
shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding 
com.google.errorprone:error_prone_annotations:jar:2.0.15 from the shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding joda-time:joda-time:jar:2.4 from the 
shaded jar.
2017-06-01\T\00:11:46.080 [INFO] Excluding 
com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-core:jar:2.8.8 from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-annotations:jar:2.8.8 from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding 
com.fasterxml.jackson.core:jackson-databind:jar:2.8.8 from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.14 from 
the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding 
com.google.auto.service:auto-service:jar:1.0-rc2 from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding com.google.auto:auto-common:jar:0.3 
from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding 
com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from the shaded jar.
2017-06-01\T\00:11:46.081 [INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from 
the shaded jar.
2017-06-01\T\00:11:52.938 [INFO] Replacing original artifact with shaded 
artifact.
2017-06-01\T\00:11:52.938 [INFO] Replacing 

 with 

2017-06-01\T\00:11:52.938 [INFO] Replacing original test artifact with shaded 
test artifact.
2017-06-01\T\00:11:52.938 [INFO] Replacing 

 with 

2017-06-01\T\00:11:52.939 [INFO] Dependency-reduced POM written at: 

2017-06-01\T\00:11:53.105 [INFO] 
2017-06-01\T\00:11:53.106 [INFO] --- 
maven-failsafe-plugin:2.20:integration-test (integration-test) @ 
beam-runners-google-cloud-dataflow-java ---
2017-06-01\T\00:11:53.255 [INFO] Failsafe report directory: 

2017-06-01\T\00:11:53.261 [INFO] parallel='all', perCoreThreadCount=true, 
threadCount=4, useUnlimitedThreads=false, threadCountSuites=0, 
threadCountClasses=0, threadCountMethods=0, parallelOptimized=true
2017-06-01\T\00:11:53.282 [INFO] 
2017-06-01\T\00:11:53.282 [INFO] 
---
2017-06-01\T\00:11:53.282 [INFO]  T E S T S
2017-06-01\T\00:11:53.282 [INFO] 
---
2017-06-01\T\00:17:12.664 [INFO] Running 
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT
2017-06-01\T\00:17:12.694 [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 
0, Time elapsed: 9.188 s - in 
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT
2017-06-01\T\00:17:12.695 [INFO] Running 
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT

[jira] [Commented] (BEAM-1938) Side Inputs should be part of the expanded inputs

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16032153#comment-16032153
 ] 

ASF GitHub Bot commented on BEAM-1938:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/3272

[BEAM-1938] Remove the requirement to visit PCollectionViews in Dataflow

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam reduce_pvalues_in_dataflow

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3272.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3272


commit f106c405daf90544ef1a9b54dbeac5a7142f0700
Author: Thomas Groh 
Date:   2017-05-31T22:42:03Z

Remove the requirement to visit PCollectionViews in Dataflow




> Side Inputs should be part of the expanded inputs
> -
>
> Key: BEAM-1938
> URL: https://issues.apache.org/jira/browse/BEAM-1938
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>
> Required for the Java SDK to construct the runner API graphs without 
> inspecting arbitrary transforms.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3272: [BEAM-1938] Remove the requirement to visit PCollec...

2017-05-31 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/3272

[BEAM-1938] Remove the requirement to visit PCollectionViews in Dataflow

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam reduce_pvalues_in_dataflow

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3272.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3272


commit f106c405daf90544ef1a9b54dbeac5a7142f0700
Author: Thomas Groh 
Date:   2017-05-31T22:42:03Z

Remove the requirement to visit PCollectionViews in Dataflow




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #3255

2017-05-31 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3254

2017-05-31 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostCommit_Java_MavenInstall #3975

2017-05-31 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Flink #2988

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Resolved] (BEAM-2246) JmsIO should use CLIENT_ACK instead of AUTO_ACK

2017-05-31 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-2246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré resolved BEAM-2246.

   Resolution: Fixed
Fix Version/s: 2.1.0

> JmsIO should use CLIENT_ACK instead of AUTO_ACK
> ---
>
> Key: BEAM-2246
> URL: https://issues.apache.org/jira/browse/BEAM-2246
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Affects Versions: 0.6.0
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
> Fix For: 2.1.0
>
>
> As {{JmsIO}} manages message acknowledgement in its checkpoint, it should not 
> use {{AUTO_ACKNOWLEDGE}} but {{CLIENT_ACKNOWLEDGE}}.
> With {{AUTO_ACK}}, the ack is sent to the JMS broker as soon as the message 
> is consumed. Basically, this means the checkpoint is useless: if something 
> goes wrong in the pipeline, the message has already been consumed from the 
> broker, so there is no way to redeliver it.
> Instead, {{JmsIO}} should use {{CLIENT_ACK}} and explicitly send the ack 
> (commit) in the checkpoint (the code is already in place). 
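
For context, a plain-JMS sketch of the acknowledgement-mode distinction 
described above. This is illustrative only, not the JmsIO code; the broker URL 
and queue name are placeholders.

{code}
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class ClientAckSketch {
  public static void main(String[] args) throws Exception {
    ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
    Connection connection = factory.createConnection();
    connection.start();

    // CLIENT_ACKNOWLEDGE: the broker keeps the message until we explicitly ack
    // it, so a failed pipeline can still have it redelivered.
    Session session = connection.createSession(false, Session.CLIENT_ACKNOWLEDGE);
    Queue queue = session.createQueue("TEST.QUEUE");
    MessageConsumer consumer = session.createConsumer(queue);

    Message message = consumer.receive(1000);
    if (message != null) {
      // ... process the message / record it in the checkpoint ...
      // Only when the checkpoint is finalized do we ack, i.e. "commit" the
      // consumption back to the broker.
      message.acknowledge();
    }

    consumer.close();
    session.close();
    connection.close();
  }
}
{code}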



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-2246) JmsIO should use CLIENT_ACK instead of AUTO_ACK

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2246?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031803#comment-16031803
 ] 

ASF GitHub Bot commented on BEAM-2246:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3073


> JmsIO should use CLIENT_ACK instead of AUTO_ACK
> ---
>
> Key: BEAM-2246
> URL: https://issues.apache.org/jira/browse/BEAM-2246
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Affects Versions: 0.6.0
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
>
> As {{JmsIO}} manages message acknowledgement in its checkpoint, it should not 
> use {{AUTO_ACKNOWLEDGE}} but {{CLIENT_ACKNOWLEDGE}}.
> With {{AUTO_ACK}}, the ack is sent to the JMS broker as soon as the message 
> is consumed. Basically, this means the checkpoint is useless: if something 
> goes wrong in the pipeline, the message has already been consumed from the 
> broker, so there is no way to redeliver it.
> Instead, {{JmsIO}} should use {{CLIENT_ACK}} and explicitly send the ack 
> (commit) in the checkpoint (the code is already in place). 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[1/2] beam git commit: [BEAM-2246] Use CLIENT_ACK instead of AUTO_ACK in JmsIO

2017-05-31 Thread jbonofre
Repository: beam
Updated Branches:
  refs/heads/master 4884d4867 -> 2df9dbd24


[BEAM-2246] Use CLIENT_ACK instead of AUTO_ACK in JmsIO


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/a158fc17
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/a158fc17
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/a158fc17

Branch: refs/heads/master
Commit: a158fc178e1297f04f4f18975383ec1dc69bc0d8
Parents: 4884d48
Author: Jean-Baptiste Onofré 
Authored: Wed May 10 07:39:56 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Wed May 31 21:24:22 2017 +0200

--
 .../java/org/apache/beam/sdk/io/jms/JmsIO.java  |  5 +-
 .../org/apache/beam/sdk/io/jms/JmsIOTest.java   | 78 
 2 files changed, 81 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/a158fc17/sdks/java/io/jms/src/main/java/org/apache/beam/sdk/io/jms/JmsIO.java
--
diff --git 
a/sdks/java/io/jms/src/main/java/org/apache/beam/sdk/io/jms/JmsIO.java 
b/sdks/java/io/jms/src/main/java/org/apache/beam/sdk/io/jms/JmsIO.java
index b8355ad..c5e5150 100644
--- a/sdks/java/io/jms/src/main/java/org/apache/beam/sdk/io/jms/JmsIO.java
+++ b/sdks/java/io/jms/src/main/java/org/apache/beam/sdk/io/jms/JmsIO.java
@@ -379,7 +379,8 @@ public class JmsIO {
 
   }
 
-  private static class UnboundedJmsReader extends UnboundedReader {
+  @VisibleForTesting
+  static class UnboundedJmsReader extends UnboundedReader {
 
 private UnboundedJmsSource source;
 private JmsCheckpointMark checkpointMark;
@@ -421,7 +422,7 @@ public class JmsIO {
   }
 
   try {
-this.session = this.connection.createSession(false, 
Session.AUTO_ACKNOWLEDGE);
+this.session = this.connection.createSession(false, 
Session.CLIENT_ACKNOWLEDGE);
   } catch (Exception e) {
 throw new IOException("Error creating JMS session", e);
   }

http://git-wip-us.apache.org/repos/asf/beam/blob/a158fc17/sdks/java/io/jms/src/test/java/org/apache/beam/sdk/io/jms/JmsIOTest.java
--
diff --git 
a/sdks/java/io/jms/src/test/java/org/apache/beam/sdk/io/jms/JmsIOTest.java 
b/sdks/java/io/jms/src/test/java/org/apache/beam/sdk/io/jms/JmsIOTest.java
index 7edda1a..43c050e 100644
--- a/sdks/java/io/jms/src/test/java/org/apache/beam/sdk/io/jms/JmsIOTest.java
+++ b/sdks/java/io/jms/src/test/java/org/apache/beam/sdk/io/jms/JmsIOTest.java
@@ -23,10 +23,12 @@ import static org.hamcrest.Matchers.instanceOf;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNull;
 import static org.junit.Assert.assertThat;
+import static org.junit.Assert.assertTrue;
 import static org.junit.Assert.fail;
 
 import java.io.IOException;
 import java.util.ArrayList;
+import java.util.Enumeration;
 import java.util.List;
 import javax.jms.Connection;
 import javax.jms.ConnectionFactory;
@@ -34,6 +36,7 @@ import javax.jms.JMSException;
 import javax.jms.Message;
 import javax.jms.MessageConsumer;
 import javax.jms.MessageProducer;
+import javax.jms.QueueBrowser;
 import javax.jms.Session;
 import javax.jms.TextMessage;
 import org.apache.activemq.ActiveMQConnectionFactory;
@@ -71,6 +74,7 @@ public class JmsIOTest {
 
   private BrokerService broker;
   private ConnectionFactory connectionFactory;
+  private ConnectionFactory connectionFactoryWithoutPrefetch;
 
   @Rule
   public final transient TestPipeline pipeline = TestPipeline.create();
@@ -98,6 +102,8 @@ public class JmsIOTest {
 
 // create JMS connection factory
 connectionFactory = new ActiveMQConnectionFactory(BROKER_URL);
+connectionFactoryWithoutPrefetch =
+new ActiveMQConnectionFactory(BROKER_URL + 
"?jms.prefetchPolicy.all=0");
   }
 
   @After
@@ -236,4 +242,76 @@ public class JmsIOTest {
 assertEquals(1, splits.size());
   }
 
+  @Test
+  public void testCheckpointMark() throws Exception {
+// we are using no prefetch here
+// prefetch is an ActiveMQ feature: to make efficient use of network 
resources the broker
+// utilizes a 'push' model to dispatch messages to consumers. However, in 
the case of our
+// test, it means that we can have some latency between the 
receiveNoWait() method used by
+// the consumer and the prefetch buffer populated by the broker. Using a 
prefetch to 0 means
+// that the consumer will poll for message, which is exactly what we want 
for the test.
+Connection connection = 
connectionFactoryWithoutPrefetch.createConnection(USERNAME, PASSWORD);
+connection.start();
+Session session = connection.createSession(false, 
Session.AUTO_ACKNOWLEDGE);
+MessageProducer producer = 

[GitHub] beam pull request #3073: [BEAM-2246] Use CLIENT_ACK instead of AUTO_ACK in J...

2017-05-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3073


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[2/2] beam git commit: [BEAM-2246] This closes #3073

2017-05-31 Thread jbonofre
[BEAM-2246] This closes #3073


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/2df9dbd2
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/2df9dbd2
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/2df9dbd2

Branch: refs/heads/master
Commit: 2df9dbd24f6fd3bcfb14316859f4b38c38d95eca
Parents: 4884d48 a158fc1
Author: Jean-Baptiste Onofré 
Authored: Wed May 31 21:27:34 2017 +0200
Committer: Jean-Baptiste Onofré 
Committed: Wed May 31 21:27:34 2017 +0200

--
 .../java/org/apache/beam/sdk/io/jms/JmsIO.java  |  5 +-
 .../org/apache/beam/sdk/io/jms/JmsIOTest.java   | 78 
 2 files changed, 81 insertions(+), 2 deletions(-)
--




[jira] [Resolved] (BEAM-1544) Move Java cross-JDK tests from Travis to Jenkins

2017-05-31 Thread Mark Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Liu resolved BEAM-1544.

   Resolution: Done
Fix Version/s: Not applicable

> Move Java cross-JDK tests from Travis to Jenkins
> 
>
> Key: BEAM-1544
> URL: https://issues.apache.org/jira/browse/BEAM-1544
> Project: Beam
>  Issue Type: Task
>  Components: build-system, testing
>Reporter: Mark Liu
>Assignee: Mark Liu
> Fix For: Not applicable
>
>
> JDK versions we test in Travis:
> JDK 1.7 (latest)
> OpenJDK 7 (on Ubuntu only)
> OpenJDK 8 (on Ubuntu only)



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (BEAM-1544) Move Java cross-JDK tests from Travis to Jenkins

2017-05-31 Thread Mark Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Liu closed BEAM-1544.
--

> Move Java cross-JDK tests from Travis to Jenkins
> 
>
> Key: BEAM-1544
> URL: https://issues.apache.org/jira/browse/BEAM-1544
> Project: Beam
>  Issue Type: Task
>  Components: build-system, testing
>Reporter: Mark Liu
>Assignee: Mark Liu
> Fix For: Not applicable
>
>
> JDK versions we test in Travis:
> JDK 1.7 (latest)
> OpenJDK 7 (on Ubuntu only)
> OpenJDK 8 (on Ubuntu only)



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-1544) Move Java cross-JDK tests from Travis to Jenkins

2017-05-31 Thread Mark Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Liu updated BEAM-1544:
---
Description: 
JDK versions we test in Travis:

JDK 1.7 (latest)
OpenJDK 7 (on Ubuntu only)
OpenJDK 8 (on Ubuntu only)

  was:
JDK versions we test in Travis:

JDK 1.8 (latest)
JDK 1.7 (latest)
OpenJDK 7 (on Ubuntu only)
OpenJDK 8 (on Ubuntu only)


> Move Java cross-JDK tests from Travis to Jenkins
> 
>
> Key: BEAM-1544
> URL: https://issues.apache.org/jira/browse/BEAM-1544
> Project: Beam
>  Issue Type: Task
>  Components: build-system, testing
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> JDK versions we test in Travis:
> JDK 1.7 (latest)
> OpenJDK 7 (on Ubuntu only)
> OpenJDK 8 (on Ubuntu only)



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3974

2017-05-31 Thread Apache Jenkins Server
See 


--
[...truncated 3.07 MB...]
2017-05-31T19:08:46.997 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar
 (47 KB at 54.6 KB/sec)
2017-05-31T19:08:46.997 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
2017-05-31T19:08:47.052 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-core/1.2.1/flink-core-1.2.1-tests.jar
 (716 KB at 784.9 KB/sec)
2017-05-31T19:08:47.052 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
2017-05-31T19:08:47.160 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
 (6960 KB at 6822.9 KB/sec)
2017-05-31T19:08:47.160 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
2017-05-31T19:08:47.180 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
 (953 KB at 916.2 KB/sec)
2017-05-31T19:08:47.180 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
2017-05-31T19:08:47.198 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1.jar
 (3035 KB at 2867.9 KB/sec)
2017-05-31T19:08:47.198 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
2017-05-31T19:08:47.210 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
 (24 KB at 21.5 KB/sec)
2017-05-31T19:08:47.227 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
 (39 KB at 35.9 KB/sec)
2017-05-31T19:08:47.253 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
 (2432 KB at 2186.8 KB/sec)
2017-05-31T19:08:47.367 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
 (2366 KB at 1929.6 KB/sec)
2017-05-31T19:08:47.693 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop2/1.2.1/flink-shaded-hadoop2-1.2.1.jar
 (17860 KB at 11499.9 KB/sec)
2017-05-31T19:08:47.710 [INFO] 
2017-05-31T19:08:47.710 [INFO] --- maven-clean-plugin:3.0.0:clean 
(default-clean) @ beam-runners-flink_2.10 ---
2017-05-31T19:08:47.713 [INFO] Deleting 

 (includes = [**/*.pyc, **/*.egg-info/, **/sdks/python/LICENSE, 
**/sdks/python/NOTICE, **/sdks/python/README.md], excludes = [])
2017-05-31T19:08:47.767 [INFO] 
2017-05-31T19:08:47.767 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce) @ beam-runners-flink_2.10 ---
2017-05-31T19:08:50.124 [INFO] 
2017-05-31T19:08:50.124 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce-banned-dependencies) @ beam-runners-flink_2.10 ---
2017-05-31T19:08:50.220 [INFO] 
2017-05-31T19:08:50.220 [INFO] --- maven-remote-resources-plugin:1.5:process 
(process-resource-bundles) @ beam-runners-flink_2.10 ---
2017-05-31T19:08:50.762 [INFO] 
2017-05-31T19:08:50.762 [INFO] --- maven-resources-plugin:3.0.2:resources 
(default-resources) @ beam-runners-flink_2.10 ---
2017-05-31T19:08:50.764 [INFO] Using 'UTF-8' encoding to copy filtered 
resources.
2017-05-31T19:08:50.764 [INFO] Copying 1 resource
2017-05-31T19:08:50.764 [INFO] Copying 3 resources
2017-05-31T19:08:50.855 [INFO] 
2017-05-31T19:08:50.855 [INFO] --- maven-compiler-plugin:3.6.1:compile 
(default-compile) @ beam-runners-flink_2.10 ---
2017-05-31T19:08:50.867 [INFO] Changes detected - recompiling the module!
2017-05-31T19:08:50.868 [INFO] Compiling 75 source files to 

2017-05-31T19:08:51.620 [WARNING] bootstrap class path not set in conjunction 
with -source 1.7
2017-05-31T19:08:51.620 [INFO] 
:
 Some input files use or override a deprecated API.
2017-05-31T19:08:51.620 [INFO] 
:
 Recompile with -Xlint:deprecation for details.

[jira] [Created] (BEAM-2394) Postcommit_Java_JDK_Version_Test is broken since SpannerWriteIT failed

2017-05-31 Thread Mark Liu (JIRA)
Mark Liu created BEAM-2394:
--

 Summary: Postcommit_Java_JDK_Version_Test is broken since 
SpannerWriteIT failed
 Key: BEAM-2394
 URL: https://issues.apache.org/jira/browse/BEAM-2394
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-gcp, testing
Reporter: Mark Liu
Assignee: Mairbek Khadikov


SpannerWriteIT.testWrite failed in Postcommit_Java_JDK_Version_Test because the 
database wasn't set up successfully.

Error logs:
{code}
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT
2017-05-31\T\12:21:30.032 [ERROR] 
testWrite(org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT)  Time elapsed: 
283.011 s  <<< ERROR!
java.lang.RuntimeException: 
(b2cfd106d806288f): com.google.cloud.spanner.SpannerException: NOT_FOUND: 
io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:71)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:58)
at 
com.google.cloud.spanner.SessionPool$Waiter.take(SessionPool.java:376)
at 
com.google.cloud.spanner.SessionPool$Waiter.access$2800(SessionPool.java:362)
at 
com.google.cloud.spanner.SessionPool.getReadSession(SessionPool.java:697)
at 
com.google.cloud.spanner.DatabaseClientImpl.writeAtLeastOnce(DatabaseClientImpl.java:37)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$SpannerWriteFn.flushBatch(SpannerIO.java:322)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$SpannerWriteFn.finishBundle(SpannerIO.java:281)
...
{code}

Jenkins link:
https://builds.apache.org/view/Beam/job/beam_PostCommit_Java_JDK_Versions_Test/26/jdk=OpenJDK%207%20(on%20Ubuntu%20only),label=beam/
https://builds.apache.org/view/Beam/job/beam_PostCommit_Java_JDK_Versions_Test/26/jdk=OpenJDK%208%20(on%20Ubuntu%20only),label=beam/

Note: the root directory of the JDK version test contains a space, which is the 
main difference from Postcommit_Java_MavenInstall. It can look like: 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_JDK_Versions_Test/jdk/OpenJDK
 7 (on Ubuntu only)/label/beam/..."



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3253

2017-05-31 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3973

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Reduce Prevalence of PValue in the DirectRunner

--
[...truncated 3.07 MB...]
2017-05-31T18:51:34.267 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar
 (47 KB at 52.1 KB/sec)
2017-05-31T18:51:34.267 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
2017-05-31T18:51:34.311 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
 (6960 KB at 7387.8 KB/sec)
2017-05-31T18:51:34.311 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
2017-05-31T18:51:34.337 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-core/1.2.1/flink-core-1.2.1-tests.jar
 (716 KB at 739.5 KB/sec)
2017-05-31T18:51:34.337 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
2017-05-31T18:51:34.411 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
 (953 KB at 914.5 KB/sec)
2017-05-31T18:51:34.411 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
2017-05-31T18:51:34.439 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
 (24 KB at 21.5 KB/sec)
2017-05-31T18:51:34.439 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
2017-05-31T18:51:34.471 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
 (39 KB at 35.3 KB/sec)
2017-05-31T18:51:34.498 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1.jar
 (3035 KB at 2687.5 KB/sec)
2017-05-31T18:51:34.519 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
 (2432 KB at 2114.5 KB/sec)
2017-05-31T18:51:34.566 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
 (2366 KB at 1976.4 KB/sec)
2017-05-31T18:51:34.898 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop2/1.2.1/flink-shaded-hadoop2-1.2.1.jar
 (17860 KB at 11672.8 KB/sec)
2017-05-31T18:51:34.912 [INFO] 
2017-05-31T18:51:34.912 [INFO] --- maven-clean-plugin:3.0.0:clean 
(default-clean) @ beam-runners-flink_2.10 ---
2017-05-31T18:51:34.928 [INFO] Deleting 

 (includes = [**/*.pyc, **/*.egg-info/, **/sdks/python/LICENSE, 
**/sdks/python/NOTICE, **/sdks/python/README.md], excludes = [])
2017-05-31T18:51:35.298 [INFO] 
2017-05-31T18:51:35.298 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce) @ beam-runners-flink_2.10 ---
2017-05-31T18:51:38.083 [INFO] 
2017-05-31T18:51:38.083 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce-banned-dependencies) @ beam-runners-flink_2.10 ---
2017-05-31T18:51:38.153 [INFO] 
2017-05-31T18:51:38.153 [INFO] --- maven-remote-resources-plugin:1.5:process 
(process-resource-bundles) @ beam-runners-flink_2.10 ---
2017-05-31T18:51:38.681 [INFO] 
2017-05-31T18:51:38.681 [INFO] --- maven-resources-plugin:3.0.2:resources 
(default-resources) @ beam-runners-flink_2.10 ---
2017-05-31T18:51:38.682 [INFO] Using 'UTF-8' encoding to copy filtered 
resources.
2017-05-31T18:51:38.682 [INFO] Copying 1 resource
2017-05-31T18:51:38.699 [INFO] Copying 3 resources
2017-05-31T18:51:38.771 [INFO] 
2017-05-31T18:51:38.771 [INFO] --- maven-compiler-plugin:3.6.1:compile 
(default-compile) @ beam-runners-flink_2.10 ---
2017-05-31T18:51:38.783 [INFO] Changes detected - recompiling the module!
2017-05-31T18:51:38.784 [INFO] Compiling 75 source files to 

2017-05-31T18:51:40.209 [WARNING] bootstrap class path not set in conjunction 
with -source 1.7
2017-05-31T18:51:40.209 [INFO] 
:
 Some input files use or override a deprecated API.
2017-05-31T18:51:40.209 [INFO] 

[jira] [Resolved] (BEAM-2340) Update contribution guide to include proper development instructions for python SDK

2017-05-31 Thread Chamikara Jayalath (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2340?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chamikara Jayalath resolved BEAM-2340.
--
   Resolution: Fixed
Fix Version/s: Not applicable

> Update contribution guide to include proper development instructions for 
> python SDK
> ---
>
> Key: BEAM-2340
> URL: https://issues.apache.org/jira/browse/BEAM-2340
> Project: Beam
>  Issue Type: Improvement
>  Components: website
>Reporter: Chamikara Jayalath
>Assignee: Chamikara Jayalath
> Fix For: Not applicable
>
>
> Currently, the Apache Beam contribution guide [1] gives proper developer 
> instructions for Java SDK developers, including Maven and IDE setup.
> The contribution guide does not mention development guidelines for the Python 
> SDK (other than instructions for cloning the repo using git).
> I suspect people who read this guide end up assuming that Maven is needed to 
> develop the Python SDK, which is not the case (and AFAIK none of the Python 
> SDK developers regularly use Maven).
> Hence, I believe the contribution guide should be updated to include the 
> following instructions:
> (1) Setting up a virtual environment (for a cloned repo)
> (2) Testing Python SDK
> (3) IDE setup (PyCharm ?)
> [1] 
> https://beam.apache.org/contribute/contribution-guide/#clone-the-repository-locally



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-2340) Update contribution guide to include proper development instructions for python SDK

2017-05-31 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031693#comment-16031693
 ] 

Chamikara Jayalath commented on BEAM-2340:
--

https://github.com/apache/beam-site/pull/253 was merged.

> Update contribution guide to include proper development instructions for 
> python SDK
> ---
>
> Key: BEAM-2340
> URL: https://issues.apache.org/jira/browse/BEAM-2340
> Project: Beam
>  Issue Type: Improvement
>  Components: website
>Reporter: Chamikara Jayalath
>Assignee: Chamikara Jayalath
> Fix For: Not applicable
>
>
> Currently, the Apache Beam contribution guide [1] gives proper developer 
> instructions for Java SDK developers, including Maven and IDE setup.
> The contribution guide does not mention development guidelines for the Python 
> SDK (other than instructions for cloning the repo using git).
> I suspect people who read this guide end up assuming that Maven is needed to 
> develop the Python SDK, which is not the case (and AFAIK none of the Python 
> SDK developers regularly use Maven).
> Hence, I believe the contribution guide should be updated to include the 
> following instructions:
> (1) Setting up a virtual environment (for a cloned repo)
> (2) Testing Python SDK
> (3) IDE setup (PyCharm ?)
> [1] 
> https://beam.apache.org/contribute/contribution-guide/#clone-the-repository-locally



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Flink #2987

2017-05-31 Thread Apache Jenkins Server
See 




[1/3] beam git commit: Flink*DoFnFunction: fix check for single-output dofns

2017-05-31 Thread dhalperi
Repository: beam
Updated Branches:
  refs/heads/master 19c33dfa6 -> 4884d4867


Flink*DoFnFunction: fix check for single-output dofns

Fixes Findbugs and (presumably) increases efficiency


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/a0444b8c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/a0444b8c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/a0444b8c

Branch: refs/heads/master
Commit: a0444b8ce7f6d39b039612190102e146ef4148dd
Parents: 19c33df
Author: Dan Halperin 
Authored: Tue May 30 16:12:23 2017 -0700
Committer: Dan Halperin 
Committed: Wed May 31 11:36:15 2017 -0700

--
 .../runners/flink/translation/functions/FlinkDoFnFunction.java | 2 +-
 .../flink/translation/functions/FlinkStatefulDoFnFunction.java | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/a0444b8c/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
index 42a8833..ab2ac6b 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
@@ -90,7 +90,7 @@ public class FlinkDoFnFunction
 RuntimeContext runtimeContext = getRuntimeContext();
 
 DoFnRunners.OutputManager outputManager;
-if (outputMap == null) {
+if (outputMap.size() == 1) {
   outputManager = new FlinkDoFnFunction.DoFnOutputManager(out);
 } else {
   // it has some additional outputs

http://git-wip-us.apache.org/repos/asf/beam/blob/a0444b8c/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
index b075768..11d4fee 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkStatefulDoFnFunction.java
@@ -91,7 +91,7 @@ public class FlinkStatefulDoFnFunction
 RuntimeContext runtimeContext = getRuntimeContext();
 
 DoFnRunners.OutputManager outputManager;
-if (outputMap == null) {
+if (outputMap.size() == 1) {
   outputManager = new FlinkDoFnFunction.DoFnOutputManager(out);
 } else {
   // it has some additional Outputs



[GitHub] beam pull request #3263: Flink*DoFnFunction: fix check for single-output dof...

2017-05-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3263


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[3/3] beam git commit: This closes #3263

2017-05-31 Thread dhalperi
This closes #3263


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/4884d486
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/4884d486
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/4884d486

Branch: refs/heads/master
Commit: 4884d486795e29a13a0765b11fb844d5257c468a
Parents: 19c33df 5780fc5
Author: Dan Halperin 
Authored: Wed May 31 11:36:17 2017 -0700
Committer: Dan Halperin 
Committed: Wed May 31 11:36:17 2017 -0700

--
 .../runners/flink/translation/functions/FlinkDoFnFunction.java | 6 --
 .../flink/translation/functions/FlinkStatefulDoFnFunction.java | 2 +-
 2 files changed, 5 insertions(+), 3 deletions(-)
--




[2/3] beam git commit: Add RawUnion code to FlinkDoFnFunction

2017-05-31 Thread dhalperi
Add RawUnion code to FlinkDoFnFunction


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/5780fc5e
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/5780fc5e
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/5780fc5e

Branch: refs/heads/master
Commit: 5780fc5e8cd64911d9612d89896b9d68be4f621f
Parents: a0444b8
Author: Dan Halperin 
Authored: Wed May 31 10:50:46 2017 -0700
Committer: Dan Halperin 
Committed: Wed May 31 11:36:16 2017 -0700

--
 .../runners/flink/translation/functions/FlinkDoFnFunction.java   | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/5780fc5e/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
--
diff --git 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
index ab2ac6b..d8ed622 100644
--- 
a/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
+++ 
b/runners/flink/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
@@ -146,7 +146,9 @@ public class FlinkDoFnFunction
 @Override
 @SuppressWarnings("unchecked")
 public  void output(TupleTag tag, WindowedValue output) {
-  collector.collect(output);
+  collector.collect(
+  WindowedValue.of(new RawUnionValue(0 /* single output */, 
output.getValue()),
+  output.getTimestamp(), output.getWindows(), output.getPane()));
 }
   }
 



Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3972

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] [TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

--
[...truncated 3.07 MB...]
2017-05-31T18:32:19.796 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar
 (47 KB at 28.5 KB/sec)
2017-05-31T18:32:19.796 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
2017-05-31T18:32:19.942 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-core/1.2.1/flink-core-1.2.1-tests.jar
 (716 KB at 401.0 KB/sec)
2017-05-31T18:32:19.942 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
2017-05-31T18:32:20.194 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
 (953 KB at 467.8 KB/sec)
2017-05-31T18:32:20.194 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
2017-05-31T18:32:20.207 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
 (6960 KB at 3394.8 KB/sec)
2017-05-31T18:32:20.207 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
2017-05-31T18:32:20.236 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
 (24 KB at 11.1 KB/sec)
2017-05-31T18:32:20.236 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
2017-05-31T18:32:20.270 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
 (39 KB at 18.4 KB/sec)
2017-05-31T18:32:20.313 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
 (2432 KB at 1127.9 KB/sec)
2017-05-31T18:32:20.400 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1.jar
 (3035 KB at 1353.3 KB/sec)
2017-05-31T18:32:20.546 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
 (2366 KB at 990.2 KB/sec)
2017-05-31T18:32:20.757 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop2/1.2.1/flink-shaded-hadoop2-1.2.1.jar
 (17860 KB at 6866.4 KB/sec)
2017-05-31T18:32:20.778 [INFO] 
2017-05-31T18:32:20.778 [INFO] --- maven-clean-plugin:3.0.0:clean 
(default-clean) @ beam-runners-flink_2.10 ---
2017-05-31T18:32:20.780 [INFO] Deleting 

 (includes = [**/*.pyc, **/*.egg-info/, **/sdks/python/LICENSE, 
**/sdks/python/NOTICE, **/sdks/python/README.md], excludes = [])
2017-05-31T18:32:20.953 [INFO] 
2017-05-31T18:32:20.953 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce) @ beam-runners-flink_2.10 ---
2017-05-31T18:32:23.970 [INFO] 
2017-05-31T18:32:23.970 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce-banned-dependencies) @ beam-runners-flink_2.10 ---
2017-05-31T18:32:24.060 [INFO] 
2017-05-31T18:32:24.061 [INFO] --- maven-remote-resources-plugin:1.5:process 
(process-resource-bundles) @ beam-runners-flink_2.10 ---
2017-05-31T18:32:24.952 [INFO] 
2017-05-31T18:32:24.952 [INFO] --- maven-resources-plugin:3.0.2:resources 
(default-resources) @ beam-runners-flink_2.10 ---
2017-05-31T18:32:24.953 [INFO] Using 'UTF-8' encoding to copy filtered 
resources.
2017-05-31T18:32:24.954 [INFO] Copying 1 resource
2017-05-31T18:32:24.954 [INFO] Copying 3 resources
2017-05-31T18:32:25.042 [INFO] 
2017-05-31T18:32:25.042 [INFO] --- maven-compiler-plugin:3.6.1:compile 
(default-compile) @ beam-runners-flink_2.10 ---
2017-05-31T18:32:25.098 [INFO] Changes detected - recompiling the module!
2017-05-31T18:32:25.098 [INFO] Compiling 75 source files to 

2017-05-31T18:32:26.906 [WARNING] bootstrap class path not set in conjunction 
with -source 1.7
2017-05-31T18:32:26.906 [INFO] 
:
 Some input files use or override a deprecated API.
2017-05-31T18:32:26.906 [INFO] 

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 8 (on Ubuntu only),beam #27

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] [TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

[tgroh] Reduce Prevalence of PValue in the DirectRunner

--
[...truncated 899.09 KB...]
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: 
io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at 
com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
at 
com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
at 
com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.get(GrpcSpannerRpc.java:398)
... 13 more
Caused by: io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at io.grpc.Status.asRuntimeException(Status.java:540)
at 
io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:439)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:56)
at 
com.google.cloud.spanner.spi.v1.SpannerErrorInterceptor$1$1.onClose(SpannerErrorInterceptor.java:100)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:56)
at 
com.google.cloud.spanner.spi.v1.WatchdogInterceptor$MonitoredCall$1.onClose(WatchdogInterceptor.java:190)
at 
io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:428)
at io.grpc.internal.ClientCallImpl.access$100(ClientCallImpl.java:76)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:514)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$700(ClientCallImpl.java:431)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:546)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:52)
at 
io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:152)
... 3 more
(2dc7ef506b2ddc94): com.google.cloud.spanner.SpannerException: NOT_FOUND: 
io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:71)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:58)
at 
com.google.cloud.spanner.SessionPool$Waiter.take(SessionPool.java:376)
at 
com.google.cloud.spanner.SessionPool$Waiter.access$2800(SessionPool.java:362)
at 
com.google.cloud.spanner.SessionPool.getReadSession(SessionPool.java:697)
at 
com.google.cloud.spanner.DatabaseClientImpl.writeAtLeastOnce(DatabaseClientImpl.java:37)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$SpannerWriteFn.flushBatch(SpannerIO.java:322)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$SpannerWriteFn.finishBundle(SpannerIO.java:281)
Caused by: com.google.cloud.spanner.SpannerException: NOT_FOUND: 
io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:43)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:80)
at 
com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.get(GrpcSpannerRpc.java:404)

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » JDK 1.7 (latest),beam #27

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] [TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

[tgroh] Reduce Prevalence of PValue in the DirectRunner

--
[...truncated 883.15 KB...]
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at 
org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:393)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 
429 Too Many Requests
{
  "code" : 429,
  "errors" : [ {
"domain" : "global",
"message" : "(f1e61e3e1b38a68f): The workflow could not be created. Causes: 
(72b6459705e7582d): Too many running jobs. Project apache-beam-testing is 
running 25 jobs and project limit for active jobs is 25. To fix this, cancel an 
existing workflow via the UI, wait for a workflow to finish or contact 
dataflow-feedb...@google.com to request an increase in quota.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "(f1e61e3e1b38a68f): The workflow could not be created. Causes: 
(72b6459705e7582d): Too many running jobs. Project apache-beam-testing is 
running 25 jobs and project limit for active jobs is 25. To fix this, cancel an 
existing workflow via the UI, wait for a workflow to finish or contact 
dataflow-feedb...@google.com to request an increase in quota.",
  "status" : "RESOURCE_EXHAUSTED"
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1065)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:61)
at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:600)
... 27 more

2017-05-31T18:28:19.006 [INFO] Running 
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT
2017-05-31T18:28:19.019 [ERROR] Tests run: 1, Failures: 0, Errors: 1, 
Skipped: 0, Time elapsed: 266.614 s <<< FAILURE! - in 
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT
2017-05-31T18:28:19.019 [ERROR] 
testWrite(org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT)  Time elapsed: 
266.614 s  <<< ERROR!
com.google.cloud.spanner.SpannerException: 
NOT_FOUND: io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:71)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:58)
at 
com.google.cloud.spanner.SessionPool$Waiter.take(SessionPool.java:376)
at 
com.google.cloud.spanner.SessionPool$Waiter.access$2800(SessionPool.java:362)
at 
com.google.cloud.spanner.SessionPool.getReadSession(SessionPool.java:697)
at 

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #27

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] [TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

[tgroh] Reduce Prevalence of PValue in the DirectRunner

--
[...truncated 918.12 KB...]
at com.google.cloud.spanner.SpannerImpl$2.call(SpannerImpl.java:223)
at com.google.cloud.spanner.SpannerImpl$2.call(SpannerImpl.java:220)
at 
com.google.cloud.spanner.SpannerImpl.runWithRetries(SpannerImpl.java:202)
at 
com.google.cloud.spanner.SpannerImpl.createSession(SpannerImpl.java:219)
at com.google.cloud.spanner.SessionPool$4.run(SessionPool.java:987)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: 
io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at 
com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
at 
com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
at 
com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.get(GrpcSpannerRpc.java:398)
... 13 more
Caused by: io.grpc.StatusRuntimeException: NOT_FOUND: Database not found: 
projects/apache-beam-testing/instances/beam-test/databases/beam-testdb
resource_type: "type.googleapis.com/google.spanner.admin.database.v1.Database"
resource_name: 
"projects/apache-beam-testing/instances/beam-test/databases/beam-testdb"
description: "Database does not exist."

at io.grpc.Status.asRuntimeException(Status.java:540)
at 
io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:439)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:56)
at 
com.google.cloud.spanner.spi.v1.SpannerErrorInterceptor$1$1.onClose(SpannerErrorInterceptor.java:100)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:56)
at 
com.google.cloud.spanner.spi.v1.WatchdogInterceptor$MonitoredCall$1.onClose(WatchdogInterceptor.java:190)
at 
io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:428)
at io.grpc.internal.ClientCallImpl.access$100(ClientCallImpl.java:76)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:514)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$700(ClientCallImpl.java:431)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:546)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:52)
at 
io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:152)
... 3 more
(79326f249265fee9): Workflow failed. Causes: (79326f249265f127): 
S01:GenerateSequence/Read(BoundedCountingSource)+ParDo(GenerateMutations)+SpannerIO.Write/Write
 mutations to Cloud Spanner failed.
at 
org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:133)
at 
org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:89)
at 
org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:54)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:283)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:340)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.testWrite(SpannerWriteIT.java:120)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)

Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Flink #2986

2017-05-31 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #77

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[dhalperi] [TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

[tgroh] Reduce Prevalence of PValue in the DirectRunner

--
[...truncated 351.83 KB...]
2017-05-31T18:03:02.199 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.pom
2017-05-31T18:03:02.210 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.pom
 (2 KB at 145.1 KB/sec)
2017-05-31T18:03:02.214 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_parent/2.0.15/error_prone_parent-2.0.15.pom
2017-05-31T18:03:02.226 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_parent/2.0.15/error_prone_parent-2.0.15.pom
 (6 KB at 417.2 KB/sec)
2017-05-31T18:03:02.231 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.pom
2017-05-31T18:03:02.243 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.pom
 (5 KB at 348.8 KB/sec)
2017-05-31T18:03:02.248 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-context/1.2.0/grpc-context-1.2.0.pom
2017-05-31T18:03:02.267 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-context/1.2.0/grpc-context-1.2.0.pom
 (2 KB at 89.8 KB/sec)
2017-05-31T18:03:02.272 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/instrumentation/instrumentation-api/0.3.0/instrumentation-api-0.3.0.pom
2017-05-31T18:03:02.294 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/instrumentation/instrumentation-api/0.3.0/instrumentation-api-0.3.0.pom
 (2 KB at 66.5 KB/sec)
2017-05-31T18:03:02.299 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf/1.2.0/grpc-protobuf-1.2.0.pom
2017-05-31T18:03:02.311 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf/1.2.0/grpc-protobuf-1.2.0.pom
 (3 KB at 216.1 KB/sec)
2017-05-31T18:03:02.316 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java-util/3.2.0/protobuf-java-util-3.2.0.pom
2017-05-31T18:03:02.328 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java-util/3.2.0/protobuf-java-util-3.2.0.pom
 (5 KB at 345.8 KB/sec)
2017-05-31T18:03:02.334 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.7/gson-2.7.pom
2017-05-31T18:03:02.349 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.7/gson-2.7.pom 
(2 KB at 94.1 KB/sec)
2017-05-31T18:03:02.354 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson-parent/2.7/gson-parent-2.7.pom
2017-05-31T18:03:02.364 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson-parent/2.7/gson-parent-2.7.pom
 (4 KB at 349.6 KB/sec)
2017-05-31T18:03:02.369 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf-lite/1.2.0/grpc-protobuf-lite-1.2.0.pom
2017-05-31T18:03:02.379 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf-lite/1.2.0/grpc-protobuf-lite-1.2.0.pom
 (3 KB at 207.4 KB/sec)
2017-05-31T18:03:02.384 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-stub/1.2.0/grpc-stub-1.2.0.pom
2017-05-31T18:03:02.394 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-stub/1.2.0/grpc-stub-1.2.0.pom
 (3 KB at 202.9 KB/sec)
2017-05-31T18:03:02.406 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/3.2.0/protobuf-java-3.2.0.jar
2017-05-31T18:03:02.406 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-core/1.2.0/grpc-core-1.2.0.jar
2017-05-31T18:03:02.407 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
2017-05-31T18:03:02.407 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.jar
2017-05-31T18:03:02.408 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-context/1.2.0/grpc-context-1.2.0.jar
2017-05-31T18:03:02.423 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
 (11 KB at 624.2 KB/sec)
2017-05-31T18:03:02.423 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/instrumentation/instrumentation-api/0.3.0/instrumentation-api-0.3.0.jar
2017-05-31T18:03:02.424 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.jar
 (20 KB at 1082.0 KB/sec)

[GitHub] beam-site pull request #253: [BEAM-3240] Improves development and testing in...

2017-05-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam-site/pull/253


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/3] beam-site git commit: Updates contribution guide to include development and testing instructions for Python SDK.

2017-05-31 Thread altay
Repository: beam-site
Updated Branches:
  refs/heads/asf-site 8d92f7155 -> 369b331db


Updates contribution guide to include development and testing instructions for 
Python SDK.


Project: http://git-wip-us.apache.org/repos/asf/beam-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam-site/commit/13e2e31f
Tree: http://git-wip-us.apache.org/repos/asf/beam-site/tree/13e2e31f
Diff: http://git-wip-us.apache.org/repos/asf/beam-site/diff/13e2e31f

Branch: refs/heads/asf-site
Commit: 13e2e31fd82c2098f659e361f23a1fa0b16e806c
Parents: 8d92f71
Author: chamik...@google.com 
Authored: Thu May 25 17:12:50 2017 -0700
Committer: chamik...@google.com 
Committed: Tue May 30 16:16:56 2017 -0700

--
 src/contribute/contribution-guide.md | 33 ++-
 1 file changed, 28 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam-site/blob/13e2e31f/src/contribute/contribution-guide.md
--
diff --git a/src/contribute/contribution-guide.md 
b/src/contribute/contribution-guide.md
index 62ec895..efccb7f 100644
--- a/src/contribute/contribution-guide.md
+++ b/src/contribute/contribution-guide.md
@@ -80,8 +80,6 @@ If you do not already have a personal GitHub account, sign up 
[here](https://git
  Fork the repository on GitHub
 Go to the [Beam GitHub mirror](https://github.com/apache/beam/) and fork the 
repository to your own private account. This will be your private workspace for 
staging changes.
 
-We recommend enabling Travis-CI continuous integration coverage on your 
private fork in order to easily test your changes before proposing a pull 
request. Go to [Travis-CI](https://travis-ci.org), log in with your GitHub 
account, and enable coverage for your repository.
-
  Clone the repository locally
 You are now ready to create the development environment on your local machine. 
Feel free to repeat these steps on all machines that you want to use for 
development.
 
@@ -98,6 +96,10 @@ Add your forked repository as an additional Git remote, 
where you’ll push your
 
 You are now ready to start developing!
 
+ [Python SDK Only] Set up a virtual environment
+
+We recommend setting up a virtual environment for developing the Python SDK. Please 
see the instructions available in [Quickstart (Python)]({{ site.baseurl 
}}/get-started/quickstart-py/) for setting up a virtual environment.
+
  [Optional] IDE Setup
 
 Depending on your preferred development environment, you may need to prepare 
it to develop Beam code.
@@ -231,10 +233,31 @@ Then you can push your local, committed changes to your 
(forked) repository on G
 ### Testing
 All code should have appropriate unit testing coverage. New code should have 
new tests in the same contribution. Bug fixes should include a regression test 
to prevent the issue from reoccurring.
 
-For contributions to the Java code, run unit tests locally via Maven. 
Alternatively, you can use Travis-CI.
+ Java SDK
+
+For contributions to the Java code, run unit tests locally via Maven.
 
 $ mvn clean verify
 
+ Python SDK
+
+For contributions to the Python code, you can use the command given below to run 
unit tests locally. If you update any of the [cythonized](http://cython.org) 
files in the Python SDK, you must install the "cython" package before running the 
following command to properly test your code. We recommend setting up a virtual 
environment before testing your code.
+
+$ python setup.py test
+
+You can use the following command to run a single test method.
+
+$ python setup.py test -s <module>.<test class>.<test method>
+
+To check for lint errors locally, install the "tox" package and run the following 
command.
+
+$ pip install tox
+$ tox -e lint
+
+Beam supports running Python SDK tests using Maven. For this, navigate to the root 
directory of your Apache Beam clone and execute the following command. Currently 
this cannot be run from a virtual environment.
+
+$ mvn clean verify -pl sdks/python
+
 ## Review
 Once the initial code is complete and the tests pass, it’s time to start the 
code review process. We review and discuss all code, no matter who authors it. 
It’s a great way to build community, since you can learn from other 
developers, and they become familiar with your contribution. It also builds a 
strong project by encouraging a high quality bar and keeping code consistent 
throughout the project.
 
@@ -260,7 +283,7 @@ When choosing a reviewer, think about who is the expert on 
the relevant code, wh
 ### Code Review and Revision
 During the code review process, don’t rebase your branch or otherwise modify 
published commits, since this can remove existing comment history and be 
confusing to the reviewer. When you make a revision, always push it in a new 
commit.
 
-Our GitHub mirror automatically provides 

[3/3] beam-site git commit: This closes #253

2017-05-31 Thread altay
This closes #253


Project: http://git-wip-us.apache.org/repos/asf/beam-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam-site/commit/369b331d
Tree: http://git-wip-us.apache.org/repos/asf/beam-site/tree/369b331d
Diff: http://git-wip-us.apache.org/repos/asf/beam-site/diff/369b331d

Branch: refs/heads/asf-site
Commit: 369b331dbc992f2a2e2b7a478c20f3be0566b109
Parents: 8d92f71 c59f312
Author: Ahmet Altay 
Authored: Wed May 31 10:55:11 2017 -0700
Committer: Ahmet Altay 
Committed: Wed May 31 10:55:11 2017 -0700

--
 .../contribute/contribution-guide/index.html| 48 +---
 src/contribute/contribution-guide.md| 33 --
 2 files changed, 70 insertions(+), 11 deletions(-)
--




[2/3] beam-site git commit: Regenerate website

2017-05-31 Thread altay
Regenerate website


Project: http://git-wip-us.apache.org/repos/asf/beam-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam-site/commit/c59f3122
Tree: http://git-wip-us.apache.org/repos/asf/beam-site/tree/c59f3122
Diff: http://git-wip-us.apache.org/repos/asf/beam-site/diff/c59f3122

Branch: refs/heads/asf-site
Commit: c59f31220f73badccb4ddb9b74e7533ee13829b8
Parents: 13e2e31
Author: Ahmet Altay 
Authored: Wed May 31 10:55:11 2017 -0700
Committer: Ahmet Altay 
Committed: Wed May 31 10:55:11 2017 -0700

--
 .../contribute/contribution-guide/index.html| 48 +---
 1 file changed, 42 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam-site/blob/c59f3122/content/contribute/contribution-guide/index.html
--
diff --git a/content/contribute/contribution-guide/index.html 
b/content/contribute/contribution-guide/index.html
index 110f966..a651497 100644
--- a/content/contribute/contribution-guide/index.html
+++ b/content/contribute/contribution-guide/index.html
@@ -152,6 +152,7 @@
   Obtain a GitHub account
   Fork the repository on 
GitHub
   Clone the repository 
locally
+  [Python SDK Only] 
Set up a virtual environment
   [Optional] IDE Setup
   IntelliJ  
  
   Enable Annotation 
Processing
@@ -171,7 +172,11 @@
   
   Create a branch in your 
fork
   Syncing and pushing your 
branch
-  Testing
+  Testing
+  Java SDK
+  Python 
SDK
+
+  
 
   
   Review
@@ -280,8 +285,6 @@
 Fork the repository on GitHub
 Go to the https://github.com/apache/beam/;>Beam GitHub mirror 
and fork the repository to your own private account. This will be your private 
workspace for staging changes.
 
-We recommend enabling Travis-CI continuous integration coverage on your 
private fork in order to easily test your changes before proposing a pull 
request. Go to https://travis-ci.org;>Travis-CI, log in with your 
GitHub account, and enable coverage for your repository.
-
 Clone the repository locally
 You are now ready to create the development environment on your local 
machine. Feel free to repeat these steps on all machines that you want to use 
for development.
 
@@ -302,6 +305,10 @@ $ cd beam
 
 You are now ready to start developing!
 
+[Python SDK Only] Set up 
a virtual environment
+
+We recommend setting up a virtual environment for developing the Python SDK. 
Please see the instructions available in Quickstart (Python) for setting up a 
virtual environment.
+
 [Optional] IDE Setup
 
 Depending on your preferred development environment, you may need to 
prepare it to develop Beam code.
@@ -475,12 +482,41 @@ $ git checkout -b my-branch origin/master
 Testing
 All code should have appropriate unit testing coverage. New code should 
have new tests in the same contribution. Bug fixes should include a regression 
test to prevent the issue from reoccurring.
 
-For contributions to the Java code, run unit tests locally via Maven. 
Alternatively, you can use Travis-CI.
+Java SDK
+
+For contributions to the Java code, run unit tests locally via Maven.
 
 $ mvn clean verify
 
 
 
+Python SDK
+
+For contributions to the Python code, you can use the command given below to 
run unit tests locally. If you update any of the http://cython.org;>cythonized files in the Python SDK, you must install the 
“cython” package before running the following command to properly test your 
code. We recommend setting up a virtual environment before testing your 
code.
+
+$ python setup.py 
test
+
+
+
You can use the following command to run a single test method.
+
+$ python setup.py 
test -s module.test class.test method
+
+
+
+To check for lint errors locally, install the “tox” package and run the 
following command.
+
+$ pip install tox
+$ tox -e lint
+
+
+
+Beam supports running Python SDK tests using Maven. For this, navigate to the 
root directory of your Apache Beam clone and execute the following command. 
Currently this cannot be run from a virtual environment.
+
+$ mvn clean verify 
-pl sdks/python
+
+
+
 Review
 Once the initial code is complete and the tests pass, it’s time to start 
the code review process. We review and discuss all code, no matter who authors 
it. It’s a great way to build community, since you can learn from other 
developers, and they become familiar with your contribution. It also builds a 
strong project by encouraging a high quality bar and keeping code consistent 
throughout the project.
 
@@ -512,7 +548,7 @@ $ git checkout -b my-branch origin/master
 Code Review and Revision
 During the code review process, don’t rebase your branch or otherwise 
modify published commits, since this can remove existing comment 

Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Flink #2985

2017-05-31 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Spark #2223

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Resolved] (BEAM-2366) Post commit failure: import not found gen_protos

2017-05-31 Thread Ahmet Altay (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2366?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ahmet Altay resolved BEAM-2366.
---
   Resolution: Fixed
Fix Version/s: 2.1.0

> Post commit failure: import not found gen_protos
> 
>
> Key: BEAM-2366
> URL: https://issues.apache.org/jira/browse/BEAM-2366
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py
>Reporter: Ahmet Altay
>Assignee: Robert Bradshaw
> Fix For: 2.1.0
>
>
> https://builds.apache.org/view/Beam/job/beam_PostCommit_Python_Verify/2317/consoleFull
> root: INFO: 2017-05-25T20:38:41.442Z: JOB_MESSAGE_ERROR: (9eea00070bd5c095): 
> Traceback (most recent call last):
>   File 
> "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 
> 706, in run
> self._load_main_session(self.local_staging_directory)
>   File 
> "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 
> 446, in _load_main_session
> pickler.load_session(session_file)
>   File 
> "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", 
> line 247, in load_session
> return dill.load_session(file_path)
>   File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in 
> load_session
> module = unpickler.load()
>   File "/usr/lib/python2.7/pickle.py", line 858, in load
> dispatch[key](self)
>   File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
> value = func(*args)
>   File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in 
> _import_module
> return __import__(import_name)
> ImportError: No module named gen_protos



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1938) Side Inputs should be part of the expanded inputs

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031574#comment-16031574
 ] 

ASF GitHub Bot commented on BEAM-1938:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3269


> Side Inputs should be part of the expanded inputs
> -
>
> Key: BEAM-1938
> URL: https://issues.apache.org/jira/browse/BEAM-1938
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>
> Required for the Java SDK to construct the runner API graphs without 
> inspecting arbitrary transforms.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3269: [BEAM-1938] Reduce Prevalence of PValue in the Dire...

2017-05-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3269




[2/2] beam git commit: Reduce Prevalence of PValue in the DirectRunner

2017-05-31 Thread tgroh
Reduce Prevalence of PValue in the DirectRunner

Use PCollection or PCollectionView explicitly.

Retrieve views from the WriteView transform rather than visiting the
view as an output PValue.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/6a78bd3f
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/6a78bd3f
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/6a78bd3f

Branch: refs/heads/master
Commit: 6a78bd3f09c99f49c5f27d15b3791f200bf5d53d
Parents: dc70383
Author: Thomas Groh 
Authored: Wed May 31 09:38:02 2017 -0700
Committer: Thomas Groh 
Committed: Wed May 31 10:38:45 2017 -0700

--
 .../apache/beam/runners/direct/DirectGraph.java | 26 +++-
 .../beam/runners/direct/DirectGraphVisitor.java | 25 ++-
 .../beam/runners/direct/DirectRunner.java   |  4 ++-
 .../beam/runners/direct/EvaluationContext.java  | 17 -
 .../beam/runners/direct/WatermarkManager.java   | 19 +++---
 .../runners/direct/DirectGraphVisitorTest.java  |  3 +++
 .../beam/runners/direct/DirectGraphs.java   | 10 +++-
 .../runners/direct/EvaluationContextTest.java   |  6 -
 8 files changed, 80 insertions(+), 30 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/6a78bd3f/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectGraph.java
--
diff --git 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectGraph.java
 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectGraph.java
index 83b214a..c2c0afa 100644
--- 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectGraph.java
+++ 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectGraph.java
@@ -24,9 +24,9 @@ import java.util.Map;
 import java.util.Set;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.runners.AppliedPTransform;
+import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.PInput;
-import org.apache.beam.sdk.values.POutput;
 import org.apache.beam.sdk.values.PValue;
 
 /**
@@ -34,39 +34,43 @@ import org.apache.beam.sdk.values.PValue;
  * executed with the {@link DirectRunner}.
  */
 class DirectGraph {
-  private final Map producers;
+  private final Map producers;
+  private final Map 
viewWriters;
   private final ListMultimap 
primitiveConsumers;
-  private final Set views;
 
   private final Set rootTransforms;
   private final Map stepNames;
 
   public static DirectGraph create(
-  Map producers,
+  Map producers,
+  Map viewWriters,
   ListMultimap primitiveConsumers,
-  Set views,
   Set rootTransforms,
   Map stepNames) {
-return new DirectGraph(producers, primitiveConsumers, views, 
rootTransforms, stepNames);
+return new DirectGraph(producers, viewWriters, primitiveConsumers, 
rootTransforms, stepNames);
   }
 
   private DirectGraph(
-  Map producers,
+  Map producers,
+  Map viewWriters,
   ListMultimap primitiveConsumers,
-  Set views,
   Set rootTransforms,
   Map stepNames) {
 this.producers = producers;
+this.viewWriters = viewWriters;
 this.primitiveConsumers = primitiveConsumers;
-this.views = views;
 this.rootTransforms = rootTransforms;
 this.stepNames = stepNames;
   }
 
-  AppliedPTransform getProducer(PValue produced) {
+  AppliedPTransform getProducer(PCollection produced) {
 return producers.get(produced);
   }
 
+  AppliedPTransform getWriter(PCollectionView view) {
+return viewWriters.get(view);
+  }
+
   List getPrimitiveConsumers(PValue consumed) {
 return primitiveConsumers.get(consumed);
   }
@@ -76,7 +80,7 @@ class DirectGraph {
   }
 
   Set getViews() {
-return views;
+return viewWriters.keySet();
   }
 
   String getStepName(AppliedPTransform step) {

http://git-wip-us.apache.org/repos/asf/beam/blob/6a78bd3f/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectGraphVisitor.java
--
diff --git 
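The rest of this diff is truncated above, and the archive rendering has stripped the generic type parameters from the quoted code. As a reading aid only, the sketch below restates the split the commit message describes: PCollections are looked up by the transform that produced them, while PCollectionViews are looked up by the transform that wrote them. GraphIndex is a hypothetical name and the generic bounds are a guess, not the committed signatures.

import java.util.Map;
import java.util.Set;
import org.apache.beam.sdk.runners.AppliedPTransform;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionView;

/** Sketch only: mirrors the producer / view-writer split described in the commit message. */
class GraphIndex {

  private final Map<PCollection<?>, AppliedPTransform<?, ?, ?>> producers;
  private final Map<PCollectionView<?>, AppliedPTransform<?, ?, ?>> viewWriters;

  GraphIndex(
      Map<PCollection<?>, AppliedPTransform<?, ?, ?>> producers,
      Map<PCollectionView<?>, AppliedPTransform<?, ?, ?>> viewWriters) {
    this.producers = producers;
    this.viewWriters = viewWriters;
  }

  // A PCollection is resolved to the transform that produced it.
  AppliedPTransform<?, ?, ?> getProducer(PCollection<?> produced) {
    return producers.get(produced);
  }

  // A PCollectionView is resolved to the WriteView-style transform that wrote it,
  // instead of treating the view as an output PValue.
  AppliedPTransform<?, ?, ?> getWriter(PCollectionView<?> view) {
    return viewWriters.get(view);
  }

  // The set of known views is simply the key set of the writer map.
  Set<PCollectionView<?>> getViews() {
    return viewWriters.keySet();
  }
}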

[1/2] beam git commit: This closes #3269

2017-05-31 Thread tgroh
Repository: beam
Updated Branches:
  refs/heads/master dc70383cb -> 19c33dfa6


This closes #3269


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/19c33dfa
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/19c33dfa
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/19c33dfa

Branch: refs/heads/master
Commit: 19c33dfa6b8a64a102192c7ff47acc27a0db548a
Parents: dc70383 6a78bd3
Author: Thomas Groh 
Authored: Wed May 31 10:38:45 2017 -0700
Committer: Thomas Groh 
Committed: Wed May 31 10:38:45 2017 -0700

--
 .../apache/beam/runners/direct/DirectGraph.java | 26 +++-
 .../beam/runners/direct/DirectGraphVisitor.java | 25 ++-
 .../beam/runners/direct/DirectRunner.java   |  4 ++-
 .../beam/runners/direct/EvaluationContext.java  | 17 -
 .../beam/runners/direct/WatermarkManager.java   | 19 +++---
 .../runners/direct/DirectGraphVisitorTest.java  |  3 +++
 .../beam/runners/direct/DirectGraphs.java   | 10 +++-
 .../runners/direct/EvaluationContextTest.java   |  6 -
 8 files changed, 80 insertions(+), 30 deletions(-)
--




Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Flink #2984

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1348) Model the Fn Api

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031544#comment-16031544
 ] 

ASF GitHub Bot commented on BEAM-1348:
--

GitHub user lukecwik opened a pull request:

https://github.com/apache/beam/pull/3271

[BEAM-1348] Add Runner API constructs to ProcessBundleDescriptor.

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [x] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [x] Make sure tests pass via `mvn clean verify`.
 - [x] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [x] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
This is towards removing the Fn API constructs once code has been migrated.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lukecwik/incubator-beam 
migrate_to_runner_constructs

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3271.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3271


commit d944a8684d1c47327e15dbdde15650d06db1ece5
Author: Luke Cwik 
Date:   2017-05-31T17:10:06Z

[BEAM-1348] Add Runner API constructs to ProcessBundleDescriptor.

This is towards removing the Fn API constructs once code has been migrated.




> Model the Fn Api
> 
>
> Key: BEAM-1348
> URL: https://issues.apache.org/jira/browse/BEAM-1348
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-fn-api
>Reporter: Luke Cwik
>Assignee: Luke Cwik
>
> Create a proto representation of the services and data types required to 
> execute the Fn Api.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3271: [BEAM-1348] Add Runner API constructs to ProcessBun...

2017-05-31 Thread lukecwik
GitHub user lukecwik opened a pull request:

https://github.com/apache/beam/pull/3271

[BEAM-1348] Add Runner API constructs to ProcessBundleDescriptor.

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [x] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [x] Make sure tests pass via `mvn clean verify`.
 - [x] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [x] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
This is towards removing the Fn API constructs once code has been migrated.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lukecwik/incubator-beam 
migrate_to_runner_constructs

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3271.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3271


commit d944a8684d1c47327e15dbdde15650d06db1ece5
Author: Luke Cwik 
Date:   2017-05-31T17:10:06Z

[BEAM-1348] Add Runner API constructs to ProcessBundleDescriptor.

This is towards removing the Fn API constructs once code has been migrated.






[1/2] beam git commit: [TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

2017-05-31 Thread dhalperi
Repository: beam
Updated Branches:
  refs/heads/master 006fde46c -> dc70383cb


[TRIVIAL] InstantCoder: stop boxing Longs unnecessarily

In encode, and similar in decode, the existing path goes
  Instant->long (inside of converter)
  long->Long (returned from converter)
  Long->long (inside of LongCoder).

This is a relatively small improvement, but as we encode timestamps
for every single element, this is likely to make a difference in
lightweight stages of pipelines.


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/11bf8253
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/11bf8253
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/11bf8253

Branch: refs/heads/master
Commit: 11bf82537a000f74b9690598cc5fb8e9173904d4
Parents: 006fde4
Author: Dan Halperin 
Authored: Fri May 26 16:39:13 2017 -0700
Committer: Dan Halperin 
Committed: Wed May 31 10:10:43 2017 -0700

--
 .../apache/beam/sdk/coders/InstantCoder.java| 81 +---
 1 file changed, 38 insertions(+), 43 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/11bf8253/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/InstantCoder.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/InstantCoder.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/InstantCoder.java
index 648493e..e4fadef 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/InstantCoder.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/coders/InstantCoder.java
@@ -17,11 +17,13 @@
  */
 package org.apache.beam.sdk.coders;
 
-import com.google.common.base.Converter;
+import java.io.DataInputStream;
+import java.io.DataOutputStream;
+import java.io.EOFException;
 import java.io.IOException;
 import java.io.InputStream;
 import java.io.OutputStream;
-import org.apache.beam.sdk.util.common.ElementByteSizeObserver;
+import java.io.UTFDataFormatException;
 import org.apache.beam.sdk.values.TypeDescriptor;
 import org.joda.time.Instant;
 
@@ -39,53 +41,46 @@ public class InstantCoder extends AtomicCoder {
   private static final InstantCoder INSTANCE = new InstantCoder();
   private static final TypeDescriptor TYPE_DESCRIPTOR = new 
TypeDescriptor() {};
 
-  private static final BigEndianLongCoder LONG_CODER = BigEndianLongCoder.of();
-
   private InstantCoder() {}
 
-  private static final Converter ORDER_PRESERVING_CONVERTER =
-  new LexicographicLongConverter();
-
-  /**
-   * Converts {@link Instant} to a {@code Long} representing its 
millis-since-epoch,
-   * but shifted so that the byte representation of negative values are 
lexicographically
-   * ordered before the byte representation of positive values.
-   *
-   * This deliberately utilizes the well-defined overflow for {@code Long} 
values.
-   * See 
http://docs.oracle.com/javase/specs/jls/se7/html/jls-15.html#jls-15.18.2
-   */
-  private static class LexicographicLongConverter extends Converter {
-
-@Override
-protected Long doForward(Instant instant) {
-  return instant.getMillis() - Long.MIN_VALUE;
-}
-
-@Override
-protected Instant doBackward(Long shiftedMillis) {
-  return new Instant(shiftedMillis + Long.MIN_VALUE);
-}
-  }
-
   @Override
-  public void encode(Instant value, OutputStream outStream)
-  throws CoderException, IOException {
+  public void encode(Instant value, OutputStream outStream) throws 
CoderException, IOException {
 if (value == null) {
   throw new CoderException("cannot encode a null Instant");
 }
-LONG_CODER.encode(ORDER_PRESERVING_CONVERTER.convert(value), outStream);
+
+// Converts {@link Instant} to a {@code long} representing its 
millis-since-epoch,
+// but shifted so that the byte representation of negative values are 
lexicographically
+// ordered before the byte representation of positive values.
+//
+// This deliberately utilizes the well-defined underflow for {@code long} 
values.
+// See 
http://docs.oracle.com/javase/specs/jls/se7/html/jls-15.html#jls-15.18.2
+long shiftedMillis = value.getMillis() - Long.MIN_VALUE;
+new DataOutputStream(outStream).writeLong(shiftedMillis);
   }
 
   @Override
-  public Instant decode(InputStream inStream)
-  throws CoderException, IOException {
-return 
ORDER_PRESERVING_CONVERTER.reverse().convert(LONG_CODER.decode(inStream));
+  public Instant decode(InputStream inStream) throws CoderException, 
IOException {
+long shiftedMillis;
+try {
+  shiftedMillis = new DataInputStream(inStream).readLong();
+} catch (EOFException | UTFDataFormatException exn) {
+  // These exceptions correspond to decoding 
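The ordering trick spelled out in the comments above, shifting millis-since-epoch by Long.MIN_VALUE so that the big-endian byte encoding of an earlier instant compares lexicographically smaller than that of a later one, can be sanity-checked with a small standalone snippet. This is an illustration of the idea only, not code from the commit; ShiftedMillisDemo is a hypothetical class.

import java.nio.ByteBuffer;

// Illustration: big-endian bytes of (millis - Long.MIN_VALUE) preserve timestamp order.
public class ShiftedMillisDemo {

  static byte[] encode(long millis) {
    // Relies on well-defined long overflow: maps the signed range onto 0..2^64-1.
    return ByteBuffer.allocate(8).putLong(millis - Long.MIN_VALUE).array();
  }

  static int compareUnsignedLexicographically(byte[] a, byte[] b) {
    for (int i = 0; i < 8; i++) {
      int cmp = Integer.compare(a[i] & 0xff, b[i] & 0xff); // treat each byte as unsigned
      if (cmp != 0) {
        return cmp;
      }
    }
    return 0;
  }

  public static void main(String[] args) {
    byte[] beforeEpoch = encode(-1000L); // one second before the epoch
    byte[] afterEpoch = encode(1000L);   // one second after the epoch
    // Prints a negative number: the earlier instant encodes to lexicographically smaller bytes.
    System.out.println(compareUnsignedLexicographically(beforeEpoch, afterEpoch));
  }
}

Writing the shifted value directly as a primitive long (via DataOutputStream in the diff) keeps this ordering property while avoiding the Instant->Long->long boxing round-trip described in the commit message.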

[GitHub] beam pull request #3270: InstantCoder: stop boxing Longs unnecessarily

2017-05-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/3270




[2/2] beam git commit: This closes #3270

2017-05-31 Thread dhalperi
This closes #3270


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/dc70383c
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/dc70383c
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/dc70383c

Branch: refs/heads/master
Commit: dc70383cbfac2b62d60ebdaff73bd7772df9bfea
Parents: 006fde4 11bf825
Author: Dan Halperin 
Authored: Wed May 31 10:10:51 2017 -0700
Committer: Dan Halperin 
Committed: Wed May 31 10:10:51 2017 -0700

--
 .../apache/beam/sdk/coders/InstantCoder.java| 81 +---
 1 file changed, 38 insertions(+), 43 deletions(-)
--




[GitHub] beam pull request #3269: [BEAM-1938] Reduce Prevalence of PValue in the Dire...

2017-05-31 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/3269

[BEAM-1938] Reduce Prevalence of PValue in the DirectRunner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
Use PCollection or PCollectionView explicitly.

Retrieve views from the WriteView transform rather than visiting the
view as an output PValue.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam remove_pvalue_from_direct_runner

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3269.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3269


commit d4d6122cc54347036cfcce726142ed8910deea91
Author: Thomas Groh 
Date:   2017-05-31T16:38:02Z

Reduce Prevalence of PValue in the DirectRunner

Use PCollection or PCollectionView explicitly.

Retrieve views from the WriteView transform rather than visiting the
view as an output PValue.






[jira] [Commented] (BEAM-1938) Side Inputs should be part of the expanded inputs

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031479#comment-16031479
 ] 

ASF GitHub Bot commented on BEAM-1938:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/3269

[BEAM-1938] Reduce Prevalence of PValue in the DirectRunner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`.
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
Use PCollection or PCollectionView explicitly.

Retrieve views from the WriteView transform rather than visiting the
view as an output PValue.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam remove_pvalue_from_direct_runner

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3269.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3269


commit d4d6122cc54347036cfcce726142ed8910deea91
Author: Thomas Groh 
Date:   2017-05-31T16:38:02Z

Reduce Prevalence of PValue in the DirectRunner

Use PCollection or PCollectionView explicitly.

Retrieve views from the WriteView transform rather than visiting the
view as an output PValue.




> Side Inputs should be part of the expanded inputs
> -
>
> Key: BEAM-1938
> URL: https://issues.apache.org/jira/browse/BEAM-1938
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>
> Required for the Java SDK to construct the runner API graphs without 
> inspecting arbitrary transforms.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3270: InstantCoder: stop boxing Longs unnecessarily

2017-05-31 Thread dhalperi
GitHub user dhalperi opened a pull request:

https://github.com/apache/beam/pull/3270

InstantCoder: stop boxing Longs unnecessarily

In encode, and similar in decode, the existing path goes
  Instant->long (inside of converter)
  long->Long (returned from converter)
  Long->long (inside of LongCoder).

This is a relatively small improvement, but as we encode timestamps
for every single element, this is likely to make a difference in
lightweight stages of pipelines.

R: @lukecwik @bjchambers 

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dhalperi/beam instantcoder

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3270.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3270


commit 1945c9ddf0aa9545709413c262737bc84a444d4c
Author: Dan Halperin 
Date:   2017-05-26T23:39:13Z

InstantCoder: stop boxing Longs unnecessarily

In encode, and similar in decode, the existing path goes
  Instant->long (inside of converter)
  long->Long (returned from converter)
  Long->long (inside of LongCoder).

This is a relatively small improvement, but as we encode timestamps
for every single element, this is likely to make a difference in
lightweight stages of pipelines.






[jira] [Commented] (BEAM-1348) Model the Fn Api

2017-05-31 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031418#comment-16031418
 ] 

ASF GitHub Bot commented on BEAM-1348:
--

GitHub user lukecwik opened a pull request:

https://github.com/apache/beam/pull/3268

[BEAM-1348] Model the Fn State Api as per 
https://s.apache.org/beam-fn-state-api-and-bundle-processing

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [x] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [x] Make sure tests pass via `mvn clean verify`.
 - [x] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [x] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
These types/services are unimplemented; this change is a step towards 
prototyping the client/service using this definition.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lukecwik/incubator-beam state_api

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3268.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3268


commit 21e4f3582e9dcf8f6ee2a6eac3201e8c0cb928ea
Author: Luke Cwik 
Date:   2017-05-31T15:54:16Z

[BEAM-1348] Model the Fn State Api as per 
https://s.apache.org/beam-fn-state-api-and-bundle-processing




> Model the Fn Api
> 
>
> Key: BEAM-1348
> URL: https://issues.apache.org/jira/browse/BEAM-1348
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model-fn-api
>Reporter: Luke Cwik
>Assignee: Luke Cwik
>
> Create a proto representation of the services and data types required to 
> execute the Fn Api.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #3268: [BEAM-1348] Model the Fn State Api as per https://s...

2017-05-31 Thread lukecwik
GitHub user lukecwik opened a pull request:

https://github.com/apache/beam/pull/3268

[BEAM-1348] Model the Fn State Api as per 
https://s.apache.org/beam-fn-state-api-and-bundle-processing

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [x] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [x] Make sure tests pass via `mvn clean verify`.
 - [x] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [x] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).

---
These types/services are unimplemented; this change is a step towards 
prototyping the client/service using this definition.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lukecwik/incubator-beam state_api

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/3268.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3268


commit 21e4f3582e9dcf8f6ee2a6eac3201e8c0cb928ea
Author: Luke Cwik 
Date:   2017-05-31T15:54:16Z

[BEAM-1348] Model the Fn State Api as per 
https://s.apache.org/beam-fn-state-api-and-bundle-processing




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Spark #2222

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1717) Maven release/deploy tries to upload some artifacts more than once

2017-05-31 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-1717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031221#comment-16031221
 ] 

Jean-Baptiste Onofré commented on BEAM-1717:


I found that the {{maven-shade-plugin}} is the source of the problem:

https://github.com/gatling/maven-shade-plugin/blob/master/src/main/java/org/apache/maven/plugins/shade/mojo/ShadeMojo.java#L537

Since the shade plugin attaches the test artifact, and our pom.xml does so as 
well, that explains why we have two artifacts attached to the module.

I'm evaluating two possible solutions:
1. filter/exclude the test artifact in the {{maven-shade-plugin}} configuration
2. don't attach the test artifact in our {{pom.xml}} and let the shade plugin 
do it


> Maven release/deploy tries to upload some artifacts more than once 
> 
>
> Key: BEAM-1717
> URL: https://issues.apache.org/jira/browse/BEAM-1717
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Reporter: Amit Sela
>Assignee: Jean-Baptiste Onofré
>Priority: Minor
>
> Running maven {{release}} or {{deploy}} causes some artifacts to deploy more 
> than once, which fails deployments to a release Nexus.
> While this is not an issue for the Apache release process (because it uses a 
> staging Nexus), this affects users who wish to deploy their own fork. 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3252

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-2393) BoundedSource is not fault-tolerant in FlinkRunner Streaming mode

2017-05-31 Thread Aljoscha Krettek (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031150#comment-16031150
 ] 

Aljoscha Krettek commented on BEAM-2393:


Ha! This is true. I'm wondering what we could do, since the {{BoundedSource}} 
interface doesn't have a "snapshot" method or something else to that effect.
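
For reference, a minimal framework-agnostic sketch of the snapshot()/restore()
contract under discussion, assuming the wrapper only needs to remember how many
records it has already emitted per split so a restart can skip them instead of
re-emitting; all names below are illustrative, not Beam or Flink APIs:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

final class SplitProgressTracker {
  // split id -> number of records already emitted from that split
  private final Map<Integer, Long> emittedPerSplit = new HashMap<>();

  void recordEmitted(int splitId) {
    emittedPerSplit.merge(splitId, 1L, Long::sum);
  }

  // Taken at checkpoint time; this is what a runner would persist.
  List<long[]> snapshot() {
    List<long[]> state = new ArrayList<>();
    for (Map.Entry<Integer, Long> e : emittedPerSplit.entrySet()) {
      state.add(new long[] {e.getKey(), e.getValue()});
    }
    return state;
  }

  // Applied on restart; readers then skip the first alreadyEmitted(splitId)
  // records of each split instead of sending them again.
  void restore(List<long[]> persisted) {
    emittedPerSplit.clear();
    for (long[] entry : persisted) {
      emittedPerSplit.put((int) entry[0], entry[1]);
    }
  }

  long alreadyEmitted(int splitId) {
    return emittedPerSplit.getOrDefault(splitId, 0L);
  }
}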

> BoundedSource is not fault-tolerant in FlinkRunner Streaming mode
> -
>
> Key: BEAM-2393
> URL: https://issues.apache.org/jira/browse/BEAM-2393
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Reporter: Jingsong Lee
>
> {{BoundedSourceWrapper}} does not implement snapshot() and restore(), so when 
> it restarts after a failure, it will send duplicate data.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3971

2017-05-31 Thread Apache Jenkins Server
See 


--
[...truncated 3.10 MB...]
2017-05-31T13:14:10.267 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar
 (47 KB at 51.1 KB/sec)
2017-05-31T13:14:10.267 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
2017-05-31T13:14:10.344 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-core/1.2.1/flink-core-1.2.1-tests.jar
 (716 KB at 721.6 KB/sec)
2017-05-31T13:14:10.344 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
2017-05-31T13:14:10.397 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
 (6960 KB at 6659.7 KB/sec)
2017-05-31T13:14:10.397 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
2017-05-31T13:14:10.459 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
 (953 KB at 860.8 KB/sec)
2017-05-31T13:14:10.459 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
2017-05-31T13:14:10.486 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
 (24 KB at 20.3 KB/sec)
2017-05-31T13:14:10.486 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
2017-05-31T13:14:10.504 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1.jar
 (3035 KB at 2633.8 KB/sec)
2017-05-31T13:14:10.507 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
 (2432 KB at 2105.3 KB/sec)
2017-05-31T13:14:10.515 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
 (39 KB at 33.6 KB/sec)
2017-05-31T13:14:10.607 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
 (2366 KB at 1885.0 KB/sec)
2017-05-31T13:14:10.796 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop2/1.2.1/flink-shaded-hadoop2-1.2.1.jar
 (17860 KB at 12368.0 KB/sec)
2017-05-31T13:14:10.808 [INFO] 
2017-05-31T13:14:10.808 [INFO] --- maven-clean-plugin:3.0.0:clean 
(default-clean) @ beam-runners-flink_2.10 ---
2017-05-31T13:14:10.811 [INFO] Deleting 

 (includes = [**/*.pyc, **/*.egg-info/, **/sdks/python/LICENSE, 
**/sdks/python/NOTICE, **/sdks/python/README.md], excludes = [])
2017-05-31T13:14:11.004 [INFO] 
2017-05-31T13:14:11.004 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce) @ beam-runners-flink_2.10 ---
2017-05-31T13:14:14.110 [INFO] 
2017-05-31T13:14:14.110 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce-banned-dependencies) @ beam-runners-flink_2.10 ---
2017-05-31T13:14:14.176 [INFO] 
2017-05-31T13:14:14.176 [INFO] --- maven-remote-resources-plugin:1.5:process 
(process-resource-bundles) @ beam-runners-flink_2.10 ---
2017-05-31T13:14:14.727 [INFO] 
2017-05-31T13:14:14.727 [INFO] --- maven-resources-plugin:3.0.2:resources 
(default-resources) @ beam-runners-flink_2.10 ---
2017-05-31T13:14:14.728 [INFO] Using 'UTF-8' encoding to copy filtered 
resources.
2017-05-31T13:14:14.728 [INFO] Copying 1 resource
2017-05-31T13:14:14.729 [INFO] Copying 3 resources
2017-05-31T13:14:14.817 [INFO] 
2017-05-31T13:14:14.817 [INFO] --- maven-compiler-plugin:3.6.1:compile 
(default-compile) @ beam-runners-flink_2.10 ---
2017-05-31T13:14:14.834 [INFO] Changes detected - recompiling the module!
2017-05-31T13:14:14.835 [INFO] Compiling 75 source files to 

2017-05-31T13:14:15.782 [WARNING] bootstrap class path not set in conjunction 
with -source 1.7
2017-05-31T13:14:15.782 [INFO] 
:
 Some input files use or override a deprecated API.
2017-05-31T13:14:15.782 [INFO] 
:
 Recompile with -Xlint:deprecation for details.

[jira] [Commented] (BEAM-2377) Cross compile flink runner to scala 2.11

2017-05-31 Thread Aljoscha Krettek (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2377?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16031082#comment-16031082
 ] 

Aljoscha Krettek commented on BEAM-2377:


[~davor] I'm not too familiar with our CI infra, but I imagine it would be enough 
to add a new Jenkins job that also does nightly snapshot releases for Scala 
2.11. In addition, we would add this to the release process. What do you think?

> Cross compile flink runner to scala 2.11
> 
>
> Key: BEAM-2377
> URL: https://issues.apache.org/jira/browse/BEAM-2377
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Ole Langbehn
>Assignee: Aljoscha Krettek
>
> The Flink runner is compiled for Flink built against Scala 2.10. Flink 
> cross-compiles its Scala artifacts against 2.10 and 2.11.
> In order to make it possible to use Beam with the Flink runner in Scala 2.11 
> projects, it would be nice if you could publish the Flink runner for 2.11 
> next to 2.10.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 7 (on Ubuntu only),beam #26

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[altay] Comply with byte limit for Datastore Commit.

[dhalperi] Re-rename fat jar so that install doesn't install the bundled jar as 
the

[tgroh] Revert "Include Additional PTransform inputs in Transform Nodes"

[robertwb] Automatically convert examples to use with syntax.

[klk] Move StepContext to top level

[klk] Remove StepContext.noteOutput

[klk] Rename BaseExecutionContext.StepContext to BaseStepContext

[klk] Move BaseStepContext to the top level

[klk] Remove extraneous ExecutionContext parameter to BaseStepContext

[klk] Implement StepContext directly in the DirectRunner

[klk] Remove writePCollectionViewData from the Beam codebase

[klk] Inline and delete BaseExecutionContext

[klk] Delete unused ExecutionContext

[klk] Remove unused StepContext name methods

[klk] Delete unused remnants in DirectExecutionContext

[klk] Delete unused BaseStepContext

[klk] Shorten excessive name in DirectExecutionContext

[klk] Revise StepContext javadoc

[klk] Remove unused pieces of DirectStepContext

[altay] Fix lint error in datastoreio_test

[chamikara] [BEAM-2338] Fix the limit counter in gcsio reads

[klk] Rename PCollections to PCollectionTranslation

[klk] Rename PTransforms to PTransformTranslation

[klk] Rename ParDos to ParDoTranslation

[klk] Rename WindowIntoTranslator to WindowIntoTranslation

[klk] Rename ReadTranslator to ReadTranslation

[klk] Rename Coders to CoderTranslation

[klk] Rename WindowingStrategies to WindowingStrategyTranslation

[klk] Rename Triggers to TriggerTranslation

[altay] upgrading python sdk dependencies

[klk] Do not ever shrink allowed lateness

[klk] Adds large key tests to GroupByKeyTest

[tgroh] Add CombineTranslation

[dhalperi] Initial implementation of SpannerIO.Write

[dhalperi] Minor style, compilation, javadoc fixups

[dhalperi] Fix spanner dependency management

[dhalperi] Delete SpannerCSVLoader

[dhalperi] Refine Spanner API tests

[dhalperi] SpannerIO.Write cleanup and style fixes

[robertwb] Automatically generate Python proto and grpc files.

[robertwb] Remove auto-generated proto and grpc files.

[robertwb] A couple of worker fixes.

[robertwb] Adding a snippet for metrics

[iemejia] Update maven-dependency-plugin to version 3.0.1

[klk] Make SdkComponents public for TransformPayloadTranslator

[klk] Centralize primitive URNs in PTransformTranslation class

[klk] Add URN for Splittable ProcessElement pseudo-primitive

[klk] Allow getting URN for class of transform via translator

[klk] Add registration for Read and WindowInto translators

[klk] Register ReadTranslator

[klk] Register WindowIntoTranslator

[klk] Add trivial FlattenTranslator to access URN

[klk] Add transform-analysis helpers to ReadTranslation

[klk] Add RawPTransform, which can just vend its URN and payload

[klk] URNs for DirectRunner TransformEvaluator and RootInputProvider

[robertwb] More robust gen_protos on jenkins.

[tgroh] Clarify that PTransform#expand shouldn't be called

[klk] Fixup CombineTranslation

[kirpichov] Cleanups in SimpleDoFnRunner and ParDoEvaluator

[kirpichov] Fixes pb2.py path in gitignore

[robertwb] Additional explicit file cleanup in gen_protos.

[lcwik] [BEAM-2354] Add a ReadStringsFromPubSub/WriteStringsToPubSub PTransform

[jbonofre] [BEAM-2276] Add windowing into default filename policy

[hepei.hp] fix FlinkAccumulatorCombiningStateWithContext read null accum bug

[tgroh] Visit a Transform Hierarchy in Topological Order

[tgroh] Update Apex Overrides

[tgroh] Roll-forward Include Additional PTransform inputs in Transform Nodes

[klk] Mark CombineFnWithContext StateSpecs internal

[klk] Allow translation to throw IOException

[klk] Flesh out TimerSpec and StateSpec in Runner API

[klk] Add case dispatch to StateSpec

[klk] Make Java serialized CombineFn URN public

[klk] Implement TimerSpec and StateSpec translation

[tgroh] Revert "Roll-forward Include Additional PTransform inputs in Transform

[tgroh] Revert "Visit a Transform Hierarchy in Topological Order"

[dhalperi] [BEAM-1542] SpannerIO sink updates

[robertwb] [BEAM-2366] Don't try to pickle gen_protos in tests.

[tgroh] Add Missing Space in DataflowMetrics

[dhalperi] [BEAM-2372] Only run Apache RAT at root pom.xml

[lcwik] [BEAM-2369] HadoopFileSystem: prevent NPE on match of non existing file

[robertwb] [BEAM-2365] Use the highest pickle protocol available.

[chamikara] Fix Python Dataflow execution errors due to #3223

[iemejia] Upgrade Avro dependency to version 1.8.2

[aljoscha.krettek] [BEAM-2380] Forward additional outputs to DoFnRunner in 
Flink Batch

[aljoscha.krettek] Fix flushing of pushed-back data in Flink Runner on +Inf 
watermark

[dhalperi] fix javadoc of View

[jbonofre] [BEAM-2379] Avoid reading projectId from environment variable in 
tests.

[lcwik] [BEAM-1347] Remove the usage of a thread local on a 

Build failed in Jenkins: beam_PostCommit_Java_JDK_Versions_Test » OpenJDK 8 (on Ubuntu only),beam #26

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[altay] Comply with byte limit for Datastore Commit.

[dhalperi] Re-rename fat jar so that install doesn't install the bundled jar as 
the

[tgroh] Revert "Include Additional PTransform inputs in Transform Nodes"

[robertwb] Automatically convert examples to use with syntax.

[klk] Move StepContext to top level

[klk] Remove StepContext.noteOutput

[klk] Rename BaseExecutionContext.StepContext to BaseStepContext

[klk] Move BaseStepContext to the top level

[klk] Remove extraneous ExecutionContext parameter to BaseStepContext

[klk] Implement StepContext directly in the DirectRunner

[klk] Remove writePCollectionViewData from the Beam codebase

[klk] Inline and delete BaseExecutionContext

[klk] Delete unused ExecutionContext

[klk] Remove unused StepContext name methods

[klk] Delete unused remnants in DirectExecutionContext

[klk] Delete unused BaseStepContext

[klk] Shorten excessive name in DirectExecutionContext

[klk] Revise StepContext javadoc

[klk] Remove unused pieces of DirectStepContext

[altay] Fix lint error in datastoreio_test

[chamikara] [BEAM-2338] Fix the limit counter in gcsio reads

[klk] Rename PCollections to PCollectionTranslation

[klk] Rename PTransforms to PTransformTranslation

[klk] Rename ParDos to ParDoTranslation

[klk] Rename WindowIntoTranslator to WindowIntoTranslation

[klk] Rename ReadTranslator to ReadTranslation

[klk] Rename Coders to CoderTranslation

[klk] Rename WindowingStrategies to WindowingStrategyTranslation

[klk] Rename Triggers to TriggerTranslation

[altay] upgrading python sdk dependencies

[klk] Do not ever shrink allowed lateness

[klk] Adds large key tests to GroupByKeyTest

[tgroh] Add CombineTranslation

[dhalperi] Initial implementation of SpannerIO.Write

[dhalperi] Minor style, compilation, javadoc fixups

[dhalperi] Fix spanner dependency management

[dhalperi] Delete SpannerCSVLoader

[dhalperi] Refine Spanner API tests

[dhalperi] SpannerIO.Write cleanup and style fixes

[robertwb] Automatically generate Python proto and grpc files.

[robertwb] Remove auto-generated proto and grpc files.

[robertwb] A couple of worker fixes.

[robertwb] Adding a snippet for metrics

[iemejia] Update maven-dependency-plugin to version 3.0.1

[klk] Make SdkComponents public for TransformPayloadTranslator

[klk] Centralize primitive URNs in PTransformTranslation class

[klk] Add URN for Splittable ProcessElement pseudo-primitive

[klk] Allow getting URN for class of transform via translator

[klk] Add registration for Read and WindowInto translators

[klk] Register ReadTranslator

[klk] Register WindowIntoTranslator

[klk] Add trivial FlattenTranslator to access URN

[klk] Add transform-analysis helpers to ReadTranslation

[klk] Add RawPTransform, which can just vend its URN and payload

[klk] URNs for DirectRunner TransformEvaluator and RootInputProvider

[robertwb] More robust gen_protos on jenkins.

[tgroh] Clarify that PTransform#expand shouldn't be called

[klk] Fixup CombineTranslation

[kirpichov] Cleanups in SimpleDoFnRunner and ParDoEvaluator

[kirpichov] Fixes pb2.py path in gitignore

[robertwb] Additional explicit file cleanup in gen_protos.

[lcwik] [BEAM-2354] Add a ReadStringsFromPubSub/WriteStringsToPubSub PTransform

[jbonofre] [BEAM-2276] Add windowing into default filename policy

[hepei.hp] fix FlinkAccumulatorCombiningStateWithContext read null accum bug

[tgroh] Visit a Transform Hierarchy in Topological Order

[tgroh] Update Apex Overrides

[tgroh] Roll-forward Include Additional PTransform inputs in Transform Nodes

[klk] Mark CombineFnWithContext StateSpecs internal

[klk] Allow translation to throw IOException

[klk] Flesh out TimerSpec and StateSpec in Runner API

[klk] Add case dispatch to StateSpec

[klk] Make Java serialized CombineFn URN public

[klk] Implement TimerSpec and StateSpec translation

[tgroh] Revert "Roll-forward Include Additional PTransform inputs in Transform

[tgroh] Revert "Visit a Transform Hierarchy in Topological Order"

[dhalperi] [BEAM-1542] SpannerIO sink updates

[robertwb] [BEAM-2366] Don't try to pickle gen_protos in tests.

[tgroh] Add Missing Space in DataflowMetrics

[dhalperi] [BEAM-2372] Only run Apache RAT at root pom.xml

[lcwik] [BEAM-2369] HadoopFileSystem: prevent NPE on match of non existing file

[robertwb] [BEAM-2365] Use the highest pickle protocol available.

[chamikara] Fix Python Dataflow execution errors due to #3223

[iemejia] Upgrade Avro dependency to version 1.8.2

[aljoscha.krettek] [BEAM-2380] Forward additional outputs to DoFnRunner in 
Flink Batch

[aljoscha.krettek] Fix flushing of pushed-back data in Flink Runner on +Inf 
watermark

[dhalperi] fix javadoc of View

[jbonofre] [BEAM-2379] Avoid reading projectId from environment variable in 
tests.

[lcwik] [BEAM-1347] Remove the usage of a thread local on a 

[jira] [Updated] (BEAM-2194) JOIN: cross join

2017-05-31 Thread James Xu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2194?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

James Xu updated BEAM-2194:
---
Summary: JOIN: cross join  (was: JOIN: cross join, full outer join)

> JOIN: cross join
> 
>
> Key: BEAM-2194
> URL: https://issues.apache.org/jira/browse/BEAM-2194
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: James Xu
>Assignee: Xu Mingmin
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-2194) JOIN: cross join, full outer join

2017-05-31 Thread James Xu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16030928#comment-16030928
 ] 

James Xu commented on BEAM-2194:


Merged full outer join into BEAM-2193, but still keeping cross join in this issue.

> JOIN: cross join, full outer join
> -
>
> Key: BEAM-2194
> URL: https://issues.apache.org/jira/browse/BEAM-2194
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: James Xu
>Assignee: Xu Mingmin
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-2193) JOIN: inner join, left join, right join, cross join, full outer join

2017-05-31 Thread James Xu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2193?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16030922#comment-16030922
 ] 

James Xu commented on BEAM-2193:


Removed cross join from this issue, since it requires a totally different 
implementation.
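
As a hedged sketch of why the shapes differ: a keyless cross join can broadcast
one side as an Iterable side input and pair it with every element of the other
side, rather than grouping by a join key as the other joins do. The types and
names below are illustrative, not the dsl-sql implementation:

import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.View;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionView;

class CrossJoinSketch {
  static PCollection<KV<String, String>> crossJoin(
      PCollection<String> left, PCollection<String> right) {
    // Broadcast the right side as a side input visible to every left element.
    final PCollectionView<Iterable<String>> rightView =
        right.apply(View.<String>asIterable());
    return left.apply(
        ParDo.of(
                new DoFn<String, KV<String, String>>() {
                  @ProcessElement
                  public void processElement(ProcessContext c) {
                    // Emit one pair per (left, right) combination.
                    for (String r : c.sideInput(rightView)) {
                      c.output(KV.of(c.element(), r));
                    }
                  }
                })
            .withSideInputs(rightView));
  }
}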

> JOIN: inner join, left join, right join, cross join, full outer join
> 
>
> Key: BEAM-2193
> URL: https://issues.apache.org/jira/browse/BEAM-2193
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: James Xu
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-2193) JOIN: inner join, left join, right join, full outer join

2017-05-31 Thread James Xu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2193?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

James Xu updated BEAM-2193:
---
Summary: JOIN: inner join, left join, right join, full outer join  (was: 
JOIN: inner join, left join, right join, cross join, full outer join)

> JOIN: inner join, left join, right join, full outer join
> 
>
> Key: BEAM-2193
> URL: https://issues.apache.org/jira/browse/BEAM-2193
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: James Xu
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Dataflow #3251

2017-05-31 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-2393) BoundedSource is not fault-tolerant in FlinkRunner Streaming mode

2017-05-31 Thread Jingsong Lee (JIRA)
Jingsong Lee created BEAM-2393:
--

 Summary: BoundedSource is not fault-tolerant in FlinkRunner 
Streaming mode
 Key: BEAM-2393
 URL: https://issues.apache.org/jira/browse/BEAM-2393
 Project: Beam
  Issue Type: Bug
  Components: runner-flink
Reporter: Jingsong Lee


{{BoundedSourceWrapper}} does not implement snapshot() and restore(), so when it 
restarts after a failure, it will send duplicate data.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3970

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[iemejia] Refactor HadoopInputFormatIO to use SerializableConfiguration from

[iemejia] Make SerializableConfiguration more robust by using Hadoop based

--
[...truncated 3.07 MB...]
2017-05-31T08:24:59.784 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar
 (47 KB at 55.8 KB/sec)
2017-05-31T08:24:59.784 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
2017-05-31T08:24:59.840 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-core/1.2.1/flink-core-1.2.1-tests.jar
 (716 KB at 800.7 KB/sec)
2017-05-31T08:24:59.840 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
2017-05-31T08:24:59.971 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
 (6960 KB at 6789.6 KB/sec)
2017-05-31T08:24:59.972 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
2017-05-31T08:24:59.979 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1-tests.jar
 (953 KB at 922.4 KB/sec)
2017-05-31T08:24:59.979 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
2017-05-31T08:24:59.987 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-streaming-java_2.10/1.2.1/flink-streaming-java_2.10-1.2.1.jar
 (3035 KB at 2914.7 KB/sec)
2017-05-31T08:24:59.987 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
2017-05-31T08:24:59.992 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-runtime_2.10/1.2.1/flink-runtime_2.10-1.2.1-tests.jar
 (2432 KB at 2324.7 KB/sec)
2017-05-31T08:25:00.005 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils-junit/1.2.1/flink-test-utils-junit-1.2.1.jar
 (24 KB at 21.7 KB/sec)
2017-05-31T08:25:00.016 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/curator/curator-test/2.8.0/curator-test-2.8.0.jar
 (39 KB at 36.4 KB/sec)
2017-05-31T08:25:00.159 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-test-utils_2.10/1.2.1/flink-test-utils_2.10-1.2.1.jar
 (2366 KB at 1950.3 KB/sec)
2017-05-31T08:25:00.607 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop2/1.2.1/flink-shaded-hadoop2-1.2.1.jar
 (17860 KB at 10745.7 KB/sec)
2017-05-31T08:25:00.618 [INFO] 
2017-05-31T08:25:00.618 [INFO] --- maven-clean-plugin:3.0.0:clean 
(default-clean) @ beam-runners-flink_2.10 ---
2017-05-31T08:25:00.621 [INFO] Deleting 

 (includes = [**/*.pyc, **/*.egg-info/, **/sdks/python/LICENSE, 
**/sdks/python/NOTICE, **/sdks/python/README.md], excludes = [])
2017-05-31T08:25:00.795 [INFO] 
2017-05-31T08:25:00.795 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce) @ beam-runners-flink_2.10 ---
2017-05-31T08:25:03.047 [INFO] 
2017-05-31T08:25:03.047 [INFO] --- maven-enforcer-plugin:1.4.1:enforce 
(enforce-banned-dependencies) @ beam-runners-flink_2.10 ---
2017-05-31T08:25:03.116 [INFO] 
2017-05-31T08:25:03.116 [INFO] --- maven-remote-resources-plugin:1.5:process 
(process-resource-bundles) @ beam-runners-flink_2.10 ---
2017-05-31T08:25:03.759 [INFO] 
2017-05-31T08:25:03.759 [INFO] --- maven-resources-plugin:3.0.2:resources 
(default-resources) @ beam-runners-flink_2.10 ---
2017-05-31T08:25:03.761 [INFO] Using 'UTF-8' encoding to copy filtered 
resources.
2017-05-31T08:25:03.761 [INFO] Copying 1 resource
2017-05-31T08:25:03.762 [INFO] Copying 3 resources
2017-05-31T08:25:03.854 [INFO] 
2017-05-31T08:25:03.854 [INFO] --- maven-compiler-plugin:3.6.1:compile 
(default-compile) @ beam-runners-flink_2.10 ---
2017-05-31T08:25:03.866 [INFO] Changes detected - recompiling the module!
2017-05-31T08:25:03.867 [INFO] Compiling 75 source files to 

2017-05-31T08:25:04.622 [WARNING] bootstrap class path not set in conjunction 
with -source 1.7
2017-05-31T08:25:04.622 [INFO] 
:
 Some input files use or override a deprecated API.
2017-05-31T08:25:04.622 [INFO] 

Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Dataflow #3250

2017-05-31 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_ValidatesRunner_Flink #2982

2017-05-31 Thread Apache Jenkins Server
See 




[1/3] beam git commit: Make SerializableConfiguration more robust by using Hadoop based serialization

2017-05-31 Thread iemejia
Repository: beam
Updated Branches:
  refs/heads/master 2fa24d89c -> 006fde46c


Make SerializableConfiguration more robust by using Hadoop based serialization


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/185deeba
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/185deeba
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/185deeba

Branch: refs/heads/master
Commit: 185deebaa52bbf34592a21d86f316b4204fa09ba
Parents: 636eaff
Author: Ismaël Mejía 
Authored: Sun May 28 11:38:08 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed May 31 09:17:00 2017 +0200

--
 .../sdk/io/hadoop/SerializableConfiguration.java  | 18 +-
 1 file changed, 9 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/185deeba/sdks/java/io/hadoop-common/src/main/java/org/apache/beam/sdk/io/hadoop/SerializableConfiguration.java
--
diff --git 
a/sdks/java/io/hadoop-common/src/main/java/org/apache/beam/sdk/io/hadoop/SerializableConfiguration.java
 
b/sdks/java/io/hadoop-common/src/main/java/org/apache/beam/sdk/io/hadoop/SerializableConfiguration.java
index 8101f4b..33c660a 100644
--- 
a/sdks/java/io/hadoop-common/src/main/java/org/apache/beam/sdk/io/hadoop/SerializableConfiguration.java
+++ 
b/sdks/java/io/hadoop-common/src/main/java/org/apache/beam/sdk/io/hadoop/SerializableConfiguration.java
@@ -49,21 +49,21 @@ public class SerializableConfiguration implements 
Externalizable {
 return conf;
   }
 
+
   @Override
   public void writeExternal(ObjectOutput out) throws IOException {
-out.writeInt(conf.size());
-for (Map.Entry entry : conf) {
-  out.writeUTF(entry.getKey());
-  out.writeUTF(entry.getValue());
-}
+out.writeUTF(conf.getClass().getCanonicalName());
+conf.write(out);
   }
 
   @Override
   public void readExternal(ObjectInput in) throws IOException, 
ClassNotFoundException {
-conf = new Configuration(false);
-int size = in.readInt();
-for (int i = 0; i < size; i++) {
-  conf.set(in.readUTF(), in.readUTF());
+String className = in.readUTF();
+try {
+  conf = (Configuration) Class.forName(className).newInstance();
+  conf.readFields(in);
+} catch (InstantiationException | IllegalAccessException e) {
+  throw new IOException("Unable to create configuration: " + e);
 }
   }
 

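For readers of the diff above: the gist is that Hadoop's Configuration already
implements Writable, so its write(DataOutput)/readFields(DataInput) methods can
replace the hand-rolled key/value loop. A small sketch of that round trip (the
helper class name is illustrative):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;

final class ConfigurationRoundTrip {
  static Configuration roundTrip(Configuration conf) throws IOException {
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    try (DataOutputStream out = new DataOutputStream(bytes)) {
      conf.write(out);            // Writable#write serializes every entry
    }
    Configuration copy = new Configuration(false);
    try (DataInputStream in =
        new DataInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
      copy.readFields(in);        // Writable#readFields restores them
    }
    return copy;
  }
}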


[GitHub] beam pull request #2812: Refactor HadoopInputFormatIO to use SerializableCon...

2017-05-31 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2812


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[3/3] beam git commit: This closes #2812

2017-05-31 Thread iemejia
This closes #2812


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/006fde46
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/006fde46
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/006fde46

Branch: refs/heads/master
Commit: 006fde46ccef4c57d04b884dd8ed39c5c8e5a1f9
Parents: 2fa24d8 185deeb
Author: Ismaël Mejía 
Authored: Wed May 31 09:17:56 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed May 31 09:17:56 2017 +0200

--
 .../io/hadoop/SerializableConfiguration.java| 18 ++---
 .../hadoop/inputformat/HadoopInputFormatIO.java | 53 ++---
 .../inputformat/HadoopInputFormatIOTest.java| 80 ++--
 3 files changed, 56 insertions(+), 95 deletions(-)
--




[2/3] beam git commit: Refactor HadoopInputFormatIO to use SerializableConfiguration from hadoop-common

2017-05-31 Thread iemejia
Refactor HadoopInputFormatIO to use SerializableConfiguration from hadoop-common


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/636eaff0
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/636eaff0
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/636eaff0

Branch: refs/heads/master
Commit: 636eaff03646113daf868949734199f5697bdf0d
Parents: 2fa24d8
Author: Ismaël Mejía 
Authored: Tue May 2 01:33:27 2017 +0200
Committer: Ismaël Mejía 
Committed: Wed May 31 09:17:00 2017 +0200

--
 .../hadoop/inputformat/HadoopInputFormatIO.java | 53 ++---
 .../inputformat/HadoopInputFormatIOTest.java| 80 ++--
 2 files changed, 47 insertions(+), 86 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/636eaff0/sdks/java/io/hadoop/input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java
--
diff --git 
a/sdks/java/io/hadoop/input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java
 
b/sdks/java/io/hadoop/input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java
index 336740c..efd47fd 100644
--- 
a/sdks/java/io/hadoop/input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java
+++ 
b/sdks/java/io/hadoop/input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java
@@ -23,11 +23,8 @@ import com.google.common.base.Function;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.Lists;
 import com.google.common.util.concurrent.AtomicDouble;
-import java.io.Externalizable;
 import java.io.IOException;
-import java.io.ObjectInput;
 import java.io.ObjectInputStream;
-import java.io.ObjectOutput;
 import java.io.ObjectOutputStream;
 import java.io.Serializable;
 import java.math.BigDecimal;
@@ -46,6 +43,7 @@ import org.apache.beam.sdk.coders.CoderException;
 import org.apache.beam.sdk.coders.CoderRegistry;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.io.BoundedSource;
+import org.apache.beam.sdk.io.hadoop.SerializableConfiguration;
 import org.apache.beam.sdk.io.hadoop.WritableCoder;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.PTransform;
@@ -432,7 +430,7 @@ public class HadoopInputFormatIO {
 @Override
 public void populateDisplayData(DisplayData.Builder builder) {
   super.populateDisplayData(builder);
-  Configuration hadoopConfig = getConfiguration().getHadoopConfiguration();
+  Configuration hadoopConfig = getConfiguration().get();
   if (hadoopConfig != null) {
 
builder.addIfNotNull(DisplayData.item("mapreduce.job.inputformat.class",
 hadoopConfig.get("mapreduce.job.inputformat.class"))
@@ -493,7 +491,7 @@ public class HadoopInputFormatIO {
   }
   createInputFormatInstance();
   List splits =
-  
inputFormatObj.getSplits(Job.getInstance(conf.getHadoopConfiguration()));
+  inputFormatObj.getSplits(Job.getInstance(conf.get()));
   if (splits == null) {
 throw new IOException("Error in computing splits, getSplits() returns 
null.");
   }
@@ -520,12 +518,12 @@ public class HadoopInputFormatIO {
   if (inputFormatObj == null) {
 try {
   taskAttemptContext =
-  new TaskAttemptContextImpl(conf.getHadoopConfiguration(), new 
TaskAttemptID());
+  new TaskAttemptContextImpl(conf.get(), new TaskAttemptID());
   inputFormatObj =
   (InputFormat) conf
-  .getHadoopConfiguration()
+  .get()
   .getClassByName(
-  
conf.getHadoopConfiguration().get("mapreduce.job.inputformat.class"))
+  conf.get().get("mapreduce.job.inputformat.class"))
   .newInstance();
   /*
* If InputFormat explicitly implements interface {@link 
Configurable}, then setConf()
@@ -535,7 +533,7 @@ public class HadoopInputFormatIO {
* org.apache.hadoop.hbase.mapreduce.TableInputFormat 
TableInputFormat}, etc.
*/
   if (Configurable.class.isAssignableFrom(inputFormatObj.getClass())) {
-((Configurable) 
inputFormatObj).setConf(conf.getHadoopConfiguration());
+((Configurable) inputFormatObj).setConf(conf.get());
   }
 } catch (InstantiationException | IllegalAccessException | 
ClassNotFoundException e) {
   throw new IOException("Unable to create InputFormat object: ", e);
@@ -802,41 +800,4 @@ public class HadoopInputFormatIO {
   new ObjectWritable(inputSplit).write(out);
 }
   }
-
- 

Jenkins build is still unstable: beam_PostCommit_Java_ValidatesRunner_Flink #2981

2017-05-31 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_MavenInstall #3969

2017-05-31 Thread Apache Jenkins Server
See 


--
[...truncated 537.13 KB...]
2017-05-31T06:09:08.127 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/woodstox/stax2-api/3.1.4/stax2-api-3.1.4.pom
[INFO] I/O exception (java.net.SocketException) caught when processing request 
to {s}->https://repo.maven.apache.org:443: Connection reset
[INFO] Retrying request to {s}->https://repo.maven.apache.org:443
[INFO] I/O exception (java.net.SocketException) caught when processing request 
to {s}->https://repo.maven.apache.org:443: Connection reset
[INFO] Retrying request to {s}->https://repo.maven.apache.org:443
[INFO] I/O exception (java.net.SocketException) caught when processing request 
to {s}->https://repo.maven.apache.org:443: Connection reset
[INFO] Retrying request to {s}->https://repo.maven.apache.org:443
2017-05-31T06:09:08.218 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/org/codehaus/woodstox/woodstox-core-asl/4.4.1/woodstox-core-asl-4.4.1.pom
2017-05-31T06:09:08.247 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/org/codehaus/woodstox/woodstox-core-asl/4.4.1/woodstox-core-asl-4.4.1.pom
 (2 KB at 60.8 KB/sec)
2017-05-31T06:09:08.249 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/cloud/dataflow/google-cloud-dataflow-java-proto-library-all/0.5.160304/google-cloud-dataflow-java-proto-library-all-0.5.160304.pom
2017-05-31T06:09:08.276 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/cloud/dataflow/google-cloud-dataflow-java-proto-library-all/0.5.160304/google-cloud-dataflow-java-proto-library-all-0.5.160304.pom
 (3 KB at 99.2 KB/sec)
2017-05-31T06:09:08.277 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/cloud/dataflow/google-cloud-dataflow-java-sdk-parent/1.4.0/google-cloud-dataflow-java-sdk-parent-1.4.0.pom
2017-05-31T06:09:08.303 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/cloud/dataflow/google-cloud-dataflow-java-sdk-parent/1.4.0/google-cloud-dataflow-java-sdk-parent-1.4.0.pom
 (12 KB at 441.7 KB/sec)
2017-05-31T06:09:08.305 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/google/5/google-5.pom
2017-05-31T06:09:08.329 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/google/5/google-5.pom (3 KB at 
100.0 KB/sec)
[JENKINS] Archiving disabled
2017-05-31T06:09:08.892 [INFO]  
   
2017-05-31T06:09:08.892 [INFO] 

2017-05-31T06:09:08.892 [INFO] Skipping Apache Beam :: Parent
2017-05-31T06:09:08.892 [INFO] This project has been banned from the build due 
to previous failures.
2017-05-31T06:09:08.892 [INFO] 

[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
[JENKINS] Archiving disabled
2017-05-31T06:09:22.374 [INFO] 

2017-05-31T06:09:22.374 [INFO] Reactor Summary:
2017-05-31T06:09:22.374 [INFO] 
2017-05-31T06:09:22.374 [INFO] Apache Beam :: Parent 
.. SUCCESS [ 32.621 s]
2017-05-31T06:09:22.374 [INFO] Apache Beam :: SDKs :: Java :: Build Tools 
. SUCCESS [ 10.617 s]
2017-05-31T06:09:22.374 [INFO] Apache Beam :: SDKs 
 SUCCESS [  5.613 s]
2017-05-31T06:09:22.374 [INFO] Apache Beam :: SDKs :: Common 
.. SUCCESS [  2.359 s]
2017-05-31T06:09:22.374 [INFO] Apache Beam :: SDKs :: Common :: Fn API 
 SUCCESS [ 18.680 s]
2017-05-31T06:09:22.374 [INFO] Apache Beam 

Build failed in Jenkins: beam_PostCommit_Java_MavenInstall_Windows #75

2017-05-31 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-1544] Java cross-JDK version tests on Jenkins

--
[...truncated 351.79 KB...]
2017-05-31T06:03:15.254 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.pom
2017-05-31T06:03:15.261 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.pom
 (2 KB at 228.0 KB/sec)
2017-05-31T06:03:15.264 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_parent/2.0.15/error_prone_parent-2.0.15.pom
2017-05-31T06:03:15.272 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_parent/2.0.15/error_prone_parent-2.0.15.pom
 (6 KB at 625.7 KB/sec)
2017-05-31T06:03:15.276 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.pom
2017-05-31T06:03:15.285 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.pom
 (5 KB at 465.1 KB/sec)
2017-05-31T06:03:15.289 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-context/1.2.0/grpc-context-1.2.0.pom
2017-05-31T06:03:15.461 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-context/1.2.0/grpc-context-1.2.0.pom
 (2 KB at 9.9 KB/sec)
2017-05-31T06:03:15.465 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/instrumentation/instrumentation-api/0.3.0/instrumentation-api-0.3.0.pom
2017-05-31T06:03:15.640 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/instrumentation/instrumentation-api/0.3.0/instrumentation-api-0.3.0.pom
 (2 KB at 8.4 KB/sec)
2017-05-31T06:03:15.645 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf/1.2.0/grpc-protobuf-1.2.0.pom
2017-05-31T06:03:15.657 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf/1.2.0/grpc-protobuf-1.2.0.pom
 (3 KB at 216.1 KB/sec)
2017-05-31T06:03:15.660 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java-util/3.2.0/protobuf-java-util-3.2.0.pom
2017-05-31T06:03:15.668 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java-util/3.2.0/protobuf-java-util-3.2.0.pom
 (5 KB at 518.7 KB/sec)
2017-05-31T06:03:15.672 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.7/gson-2.7.pom
2017-05-31T06:03:15.680 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.7/gson-2.7.pom 
(2 KB at 176.4 KB/sec)
2017-05-31T06:03:15.684 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson-parent/2.7/gson-parent-2.7.pom
2017-05-31T06:03:15.692 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/gson/gson-parent/2.7/gson-parent-2.7.pom
 (4 KB at 437.0 KB/sec)
2017-05-31T06:03:15.696 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf-lite/1.2.0/grpc-protobuf-lite-1.2.0.pom
2017-05-31T06:03:15.869 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-protobuf-lite/1.2.0/grpc-protobuf-lite-1.2.0.pom
 (3 KB at 12.0 KB/sec)
2017-05-31T06:03:15.873 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-stub/1.2.0/grpc-stub-1.2.0.pom
2017-05-31T06:03:16.046 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-stub/1.2.0/grpc-stub-1.2.0.pom
 (3 KB at 11.7 KB/sec)
2017-05-31T06:03:16.058 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/3.2.0/protobuf-java-3.2.0.jar
2017-05-31T06:03:16.058 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
2017-05-31T06:03:16.059 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.jar
2017-05-31T06:03:16.059 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-core/1.2.0/grpc-core-1.2.0.jar
2017-05-31T06:03:16.059 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/io/grpc/grpc-context/1.2.0/grpc-context-1.2.0.jar
2017-05-31T06:03:16.074 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
 (11 KB at 663.2 KB/sec)
2017-05-31T06:03:16.074 [INFO] Downloading: 
https://repo.maven.apache.org/maven2/com/google/instrumentation/instrumentation-api/0.3.0/instrumentation-api-0.3.0.jar
2017-05-31T06:03:16.075 [INFO] Downloaded: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/3.0.1/jsr305-3.0.1.jar
 (20 KB at 1145.6 KB/sec)
2017-05-31T06:03:16.075 [INFO] Downloading: