[jira] [Resolved] (BEAM-388) Update Beam Incubation Status page on main Apache site

2016-08-24 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-388?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré resolved BEAM-388.
---
   Resolution: Fixed
Fix Version/s: Not applicable

> Update Beam Incubation Status page on main Apache site
> --
>
> Key: BEAM-388
> URL: https://issues.apache.org/jira/browse/BEAM-388
> Project: Beam
>  Issue Type: Improvement
>  Components: project-management
>Affects Versions: Not applicable
>Reporter: Daniel Halperin
>Assignee: Jean-Baptiste Onofré
> Fix For: Not applicable
>
>
> Looking at http://incubator.apache.org/projects/beam.html , it has not been 
> updated since 2/1. Some proposed changes, from top to bottom.
> News
> * Add Release 0.1.0-incubating
> Project info
> * Add Apache IDs for all committers now that we have them.
> Incubation work items
> * Add dates for all those that have been completed





[GitHub] incubator-beam pull request #881: Updates sources to report consumed and rem...

2016-08-24 Thread chamikaramj
GitHub user chamikaramj opened a pull request:

https://github.com/apache/incubator-beam/pull/881

Updates sources to report consumed and remaining number of split points.

Adds several methods to the RangeTracker interface to support this. Please 
see comments for details.

Updates AvroSource and LineSource (test) to report split points properly.

Runners can use this information to determine the amount of remaining and 
consumed parallelism of source read operations. Java SDK sources framework 
already supports reporting these signals.
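
A rough standalone illustration of the signals described above (a toy sketch, not
the actual Beam API change -- the real method names and signatures are in the PR
itself): a tracker over an offset range can count the split points it has claimed
and estimate how many remain, which is the kind of signal a runner can use to
reason about remaining parallelism of a read.

// Toy illustration only: a tracker over [startOffset, stopOffset) that reports
// split points consumed so far and a rough estimate of split points remaining.
public class SplitPointTrackerDemo {

  static class ToyOffsetTracker {
    private final long startOffset;
    private final long stopOffset;
    private long splitPointsConsumed = 0;
    private long lastClaimedOffset = -1;

    ToyOffsetTracker(long startOffset, long stopOffset) {
      this.startOffset = startOffset;
      this.stopOffset = stopOffset;
    }

    /** Claims a record starting at the given offset; fails once the range is exhausted. */
    boolean tryClaim(long offset) {
      if (offset >= stopOffset) {
        return false;
      }
      lastClaimedOffset = offset;
      splitPointsConsumed++;
      return true;
    }

    long getSplitPointsConsumed() {
      return splitPointsConsumed;
    }

    /** Rough estimate: every unclaimed offset left in the range is a potential split point. */
    long getSplitPointsRemaining() {
      long next = (lastClaimedOffset < 0) ? startOffset : lastClaimedOffset + 1;
      return Math.max(0, stopOffset - next);
    }
  }

  public static void main(String[] args) {
    ToyOffsetTracker tracker = new ToyOffsetTracker(0, 10);
    for (long offset = 0; offset < 4; offset++) {
      tracker.tryClaim(offset);
    }
    // Prints: consumed=4 remaining=6
    System.out.println("consumed=" + tracker.getSplitPointsConsumed()
        + " remaining=" + tracker.getSplitPointsRemaining());
  }
}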

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/chamikaramj/incubator-beam limited_parallelism

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/881.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #881


commit 8b78a7bfb945f80b346000bfbd39bb6fa4a933e3
Author: Chamikara Jayalath 
Date:   2016-08-24T22:31:47Z

Updates sources to report consumed and remaining number of split points.

Adds several methods to the RangeTracker interface to support this. Please 
see comments for details.

Updates AvroSource and LineSource (test) to report split points properly.

Runners can use this information to determine the amount of remaining and 
consumed parallelism of source read operations. Java SDK sources framework 
already supports reporting these signals.






Build failed in Jenkins: beam_PostCommit_RunnableOnService_GearpumpLocal #136

2016-08-24 Thread Apache Jenkins Server
See 


--
[...truncated 9008 lines...]
[WARNING]   - javax.annotation.RegEx
[WARNING]   - javax.annotation.concurrent.Immutable
[WARNING]   - javax.annotation.meta.TypeQualifierDefault
[WARNING]   - javax.annotation.meta.TypeQualifier
[WARNING]   - javax.annotation.Syntax
[WARNING]   - javax.annotation.Nonnull
[WARNING]   - javax.annotation.CheckReturnValue
[WARNING]   - javax.annotation.CheckForNull
[WARNING]   - javax.annotation.meta.TypeQualifierNickname
[WARNING]   - javax.annotation.MatchesPattern
[WARNING]   - 25 more...
[WARNING] grpc-all-0.13.1.jar, grpc-okhttp-0.13.1.jar define 75 overlapping 
classes: 
[WARNING]   - io.grpc.okhttp.OkHttpSettingsUtil
[WARNING]   - io.grpc.okhttp.NegotiationType
[WARNING]   - io.grpc.okhttp.AsyncFrameWriter$12
[WARNING]   - io.grpc.okhttp.OkHttpTlsUpgrader
[WARNING]   - io.grpc.okhttp.AsyncFrameWriter$WriteRunnable
[WARNING]   - io.grpc.okhttp.Utils
[WARNING]   - io.grpc.okhttp.OkHttpProtocolNegotiator$AndroidNegotiator
[WARNING]   - io.grpc.okhttp.OkHttpChannelBuilder$2
[WARNING]   - io.grpc.okhttp.internal.framed.Huffman$Node
[WARNING]   - io.grpc.okhttp.AsyncFrameWriter$7
[WARNING]   - 65 more...
[WARNING] grpc-all-0.13.1.jar, grpc-auth-0.13.1.jar define 2 overlapping 
classes: 
[WARNING]   - io.grpc.auth.ClientAuthInterceptor$1
[WARNING]   - io.grpc.auth.ClientAuthInterceptor
[WARNING] grpc-all-0.13.1.jar, grpc-protobuf-nano-0.13.1.jar define 4 
overlapping classes: 
[WARNING]   - io.grpc.protobuf.nano.NanoProtoInputStream
[WARNING]   - io.grpc.protobuf.nano.NanoUtils
[WARNING]   - io.grpc.protobuf.nano.NanoUtils$1
[WARNING]   - io.grpc.protobuf.nano.MessageNanoFactory
[WARNING] annotations-3.0.1.jar, jcip-annotations-1.0.jar define 4 overlapping 
classes: 
[WARNING]   - net.jcip.annotations.GuardedBy
[WARNING]   - net.jcip.annotations.NotThreadSafe
[WARNING]   - net.jcip.annotations.ThreadSafe
[WARNING]   - net.jcip.annotations.Immutable
[WARNING] grpc-all-0.13.1.jar, grpc-netty-0.13.1.jar define 78 overlapping 
classes: 
[WARNING]   - io.grpc.netty.AbstractNettyHandler
[WARNING]   - io.grpc.netty.NettyClientTransport
[WARNING]   - io.grpc.netty.NettyClientStream$1
[WARNING]   - io.grpc.netty.SendResponseHeadersCommand
[WARNING]   - io.grpc.netty.NettyServer$2
[WARNING]   - io.grpc.netty.NettyClientHandler$4
[WARNING]   - io.grpc.netty.NettyClientTransport$3
[WARNING]   - io.grpc.netty.NettyClientHandler$FrameListener
[WARNING]   - io.grpc.netty.ProtocolNegotiators$1
[WARNING]   - io.grpc.netty.JettyTlsUtil
[WARNING]   - 68 more...
[WARNING] beam-runners-core-java-0.2.0-incubating-SNAPSHOT.jar, 
beam-sdks-java-core-0.2.0-incubating-SNAPSHOT.jar define 1717 overlapping 
classes: 
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.collect.TreeRangeSet$ComplementRangesByLowerBound$2
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.collect.WellBehavedMap$EntrySet$1$1
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.util.concurrent.CycleDetectingLockFactory$Policies$1
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.collect.Maps$6
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.primitives.UnsignedBytes$LexicographicalComparatorHolder$UnsafeComparator$1
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.collect.Collections2$OrderedPermutationCollection
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.collect.Range$1
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.base.Splitter$2
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.base.Equivalence$Identity
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.collect.Lists$1
[WARNING]   - 1707 more...
[WARNING] grpc-all-0.13.1.jar, grpc-protobuf-0.13.1.jar define 4 overlapping 
classes: 
[WARNING]   - io.grpc.protobuf.ProtoUtils$2
[WARNING]   - io.grpc.protobuf.ProtoUtils
[WARNING]   - io.grpc.protobuf.ProtoUtils$1
[WARNING]   - io.grpc.protobuf.ProtoInputStream
[WARNING] grpc-all-0.13.1.jar, grpc-stub-0.13.1.jar define 30 overlapping 
classes: 
[WARNING]   - io.grpc.stub.ServerCalls$UnaryRequestMethod
[WARNING]   - io.grpc.stub.StreamObserver
[WARNING]   - io.grpc.stub.ClientCalls$CallToStreamObserverAdapter
[WARNING]   - io.grpc.stub.ServerCalls$EmptyServerCallListener
[WARNING]   - io.grpc.stub.MetadataUtils$2$1
[WARNING]   - io.grpc.stub.MetadataUtils$1$1
[WARNING]   - io.grpc.stub.MetadataUtils$2$1$1
[WARNING]   - io.grpc.stub.MetadataUtils
[WARNING]   - io.grpc.stub.ServerCalls$1
[WARNING]   - io.grpc.stub.ServerCalls$ClientStreamingMethod
[WARNING]   - 20 more...
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try 

[GitHub] incubator-beam pull request #880: Cleanup some javadoc that refers Dataflow

2016-08-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/880




[2/2] incubator-beam git commit: Cleanup some javadoc that referring Dataflow

2016-08-24 Thread dhalperi
Cleanup some javadoc that referring Dataflow


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/ddda21bf
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/ddda21bf
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/ddda21bf

Branch: refs/heads/master
Commit: ddda21bf96956cc18d8768aefaff9c317710d65c
Parents: 2f4321e
Author: Pei He 
Authored: Wed Aug 24 14:47:11 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 16:10:34 2016 -0700

--
 .../main/java/org/apache/beam/examples/DebuggingWordCount.java   | 2 +-
 .../main/java/org/apache/beam/examples/WindowedWordCount.java| 3 +--
 .../java/org/apache/beam/examples/common/ExampleOptions.java | 2 +-
 .../src/main/java/org/apache/beam/examples/complete/README.md| 4 ++--
 .../org/apache/beam/examples/complete/StreamingWordExtract.java  | 2 +-
 .../org/apache/beam/examples/complete/TopWikipediaSessions.java  | 2 +-
 .../org/apache/beam/examples/complete/TrafficMaxLaneFlow.java| 4 ++--
 .../java/org/apache/beam/examples/complete/TrafficRoutes.java| 4 ++--
 8 files changed, 11 insertions(+), 12 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ddda21bf/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java 
b/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java
index 4a9aba9..5a0930c 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java
@@ -37,7 +37,7 @@ import org.slf4j.LoggerFactory;
 
 
 /**
- * An example that verifies word counts in Shakespeare and includes Dataflow 
best practices.
+ * An example that verifies word counts in Shakespeare and includes Beam best 
practices.
  *
  * This class, {@link DebuggingWordCount}, is the third in a series of four 
successively more
  * detailed 'word count' examples. You may first want to take a look at {@link 
MinimalWordCount}

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ddda21bf/examples/java/src/main/java/org/apache/beam/examples/WindowedWordCount.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/WindowedWordCount.java 
b/examples/java/src/main/java/org/apache/beam/examples/WindowedWordCount.java
index 6d69f14..5f60524 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/WindowedWordCount.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/WindowedWordCount.java
@@ -183,8 +183,7 @@ public class WindowedWordCount {
   public static void main(String[] args) throws IOException {
 Options options = 
PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
 options.setBigQuerySchema(getSchema());
-// DataflowExampleUtils creates the necessary input sources to simplify 
execution of this
-// Pipeline.
+// ExampleUtils creates the necessary input sources to simplify execution 
of this Pipeline.
 ExampleUtils exampleUtils = new ExampleUtils(options);
 exampleUtils.setup();
 

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ddda21bf/examples/java/src/main/java/org/apache/beam/examples/common/ExampleOptions.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/common/ExampleOptions.java
 
b/examples/java/src/main/java/org/apache/beam/examples/common/ExampleOptions.java
index a7dcc7c..8b7ed07 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/common/ExampleOptions.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/common/ExampleOptions.java
@@ -33,7 +33,7 @@ import org.joda.time.format.DateTimeFormatter;
  * Options that can be used to configure the Beam examples.
  */
 public interface ExampleOptions extends PipelineOptions {
-  @Description("Whether to keep jobs running on the Dataflow service after 
local process exit")
+  @Description("Whether to keep jobs running after local process exit")
   @Default.Boolean(false)
   boolean getKeepJobsRunning();
   void setKeepJobsRunning(boolean keepJobsRunning);

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ddda21bf/examples/java/src/main/java/org/apache/beam/examples/complete/README.md
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/complete/README.md 

[1/2] incubator-beam git commit: Closes #880

2016-08-24 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 2f4321ef4 -> f7384e1a6


Closes #880


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/f7384e1a
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/f7384e1a
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/f7384e1a

Branch: refs/heads/master
Commit: f7384e1a6fb592500495bef8f4498f56287aa159
Parents: 2f4321e ddda21b
Author: Dan Halperin 
Authored: Wed Aug 24 16:10:34 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 16:10:34 2016 -0700

--
 .../main/java/org/apache/beam/examples/DebuggingWordCount.java   | 2 +-
 .../main/java/org/apache/beam/examples/WindowedWordCount.java| 3 +--
 .../java/org/apache/beam/examples/common/ExampleOptions.java | 2 +-
 .../src/main/java/org/apache/beam/examples/complete/README.md| 4 ++--
 .../org/apache/beam/examples/complete/StreamingWordExtract.java  | 2 +-
 .../org/apache/beam/examples/complete/TopWikipediaSessions.java  | 2 +-
 .../org/apache/beam/examples/complete/TrafficMaxLaneFlow.java| 4 ++--
 .../java/org/apache/beam/examples/complete/TrafficRoutes.java| 4 ++--
 8 files changed, 11 insertions(+), 12 deletions(-)
--




[jira] [Commented] (BEAM-383) BigQueryIO: update sink to shard into multiple write jobs

2016-08-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-383?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435869#comment-15435869
 ] 

ASF GitHub Bot commented on BEAM-383:
-

Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/877


> BigQueryIO: update sink to shard into multiple write jobs
> -
>
> Key: BEAM-383
> URL: https://issues.apache.org/jira/browse/BEAM-383
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>Assignee: Ian Zhou
> Fix For: 0.3.0-incubating
>
>
> BigQuery has global limits on both the # files that can be written in a 
> single job and the total bytes in those files. We should be able to modify 
> BigQueryIO.Write to chunk into multiple smaller jobs that meet these limits, 
> write to temp tables, and atomically copy into the destination table.
> This functionality will let us safely stay within BQ's load job limits.
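
A minimal sketch of the partitioning step described above (illustrative only; the
limit values and names here are assumptions, not the actual BigQueryIO.Write code):
group the staged files into load-job partitions that each stay under both a
file-count cap and a byte-size cap, load each partition into its own temp table,
and finally copy the temp tables into the destination table.

{code}
import java.util.ArrayList;
import java.util.List;

public class LoadJobPartitioner {
  // Assumed illustrative limits, not BigQuery's actual quota values.
  static final int MAX_NUM_FILES = 10_000;
  static final long MAX_SIZE_BYTES = 11L * (1L << 40); // 11 TiB

  /** Groups file indices into partitions, each under both limits. */
  static List<List<Integer>> partition(long[] fileSizeBytes) {
    List<List<Integer>> partitions = new ArrayList<>();
    List<Integer> current = new ArrayList<>();
    long currentBytes = 0;
    for (int i = 0; i < fileSizeBytes.length; i++) {
      boolean wouldOverflow = current.size() + 1 > MAX_NUM_FILES
          || currentBytes + fileSizeBytes[i] > MAX_SIZE_BYTES;
      if (wouldOverflow && !current.isEmpty()) {
        partitions.add(current);
        current = new ArrayList<>();
        currentBytes = 0;
      }
      current.add(i);
      currentBytes += fileSizeBytes[i];
    }
    if (!current.isEmpty()) {
      partitions.add(current);
    }
    return partitions;
  }

  public static void main(String[] args) {
    long[] sizes = {5L << 30, 7L << 30, 2L << 30}; // 5 GiB, 7 GiB, 2 GiB
    System.out.println(partition(sizes)); // all three fit in one partition here
  }
}
{code}

Each partition then becomes one load job into a temp table, and a final copy step
moves the temp tables into the destination table.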





[GitHub] incubator-beam pull request #877: [BEAM-383] BigQueryIO.Write: raise size li...

2016-08-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/877




[1/2] incubator-beam git commit: Closes #877

2016-08-24 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 0f072414b -> 2f4321ef4


Closes #877


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/2f4321ef
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/2f4321ef
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/2f4321ef

Branch: refs/heads/master
Commit: 2f4321ef44a1c09a0f5cb31c7967881b051893ab
Parents: 0f07241 481a40f
Author: Dan Halperin 
Authored: Wed Aug 24 15:48:41 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 15:48:41 2016 -0700

--
 .../java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--




[2/2] incubator-beam git commit: BigQueryIO.Write: raise size limit to 11 TiB

2016-08-24 Thread dhalperi
BigQueryIO.Write: raise size limit to 11 TiB

BigQuery has changed their total size quota to 12 TiB.
https://cloud.google.com/bigquery/quota-policy#import


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/481a40f3
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/481a40f3
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/481a40f3

Branch: refs/heads/master
Commit: 481a40f38ec32b7999c2fa29299542655f2d1cca
Parents: 0f07241
Author: Dan Halperin 
Authored: Wed Aug 24 09:49:46 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 15:48:41 2016 -0700

--
 .../java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/481a40f3/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index 04fb041..01a8a1c 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -1420,8 +1420,8 @@ public class BigQueryIO {
   // Maximum number of files in a single partition.
   static final int MAX_NUM_FILES = 1;
 
-  // Maximum number of bytes in a single partition.
-  static final long MAX_SIZE_BYTES = 3 * (1L << 40);
+  // Maximum number of bytes in a single partition -- 11 TiB just under 
BQ's 12 TiB limit.
+  static final long MAX_SIZE_BYTES = 11 * (1L << 40);
 
   // The maximum number of retry jobs.
   static final int MAX_RETRY_JOBS = 3;
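
For reference, the new constant works out to 11 * 2^40 = 12,094,627,905,536 bytes,
leaving exactly 1 TiB of headroom under the 12 TiB (13,194,139,533,312-byte) quota
linked above.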



[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435793#comment-15435793
 ] 

Luke Cwik commented on BEAM-583:


SGTM

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435790#comment-15435790
 ] 

Mark Liu commented on BEAM-583:
---

Makes sense to me.

For me, TestPipeline is mostly used in unit and functional tests, and people 
should really use it there. But once a pipeline is built and people want to test 
it end to end, TestYYYRunner plus the IT framework is a better choice.

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[GitHub] incubator-beam pull request #880: Cleanup some javadoc that refers Dataflow

2016-08-24 Thread peihe
GitHub user peihe opened a pull request:

https://github.com/apache/incubator-beam/pull/880

Cleanup some javadoc that refers Dataflow





You can merge this pull request into a Git repository by running:

$ git pull https://github.com/peihe/incubator-beam examples-cleanup

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/880.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #880


commit c0844cff5002b64a63194f364985d3124049a128
Author: Pei He 
Date:   2016-08-24T21:47:11Z

Cleanup some javadoc that referring Dataflow






[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435766#comment-15435766
 ] 

Luke Cwik commented on BEAM-583:


It would make sense to me that, for testing, they should really use TestPipeline.
Having Test[Spark|Flink]Runners just adds needless noise to the set of runners
that pipeline authors can choose from. We also expect a certain interaction
pattern between TestPipeline and TestYYYRunner which a pipeline author is
unlikely to follow.




> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435760#comment-15435760
 ] 

Mark Liu commented on BEAM-583:
---

I can't think of a case where a pipeline author uses TestDataflowRunner directly; 
it is used to kick off an integration test like WordCountIT. So do we want to 
register only the runners that pipeline authors use directly?

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435742#comment-15435742
 ] 

Luke Cwik commented on BEAM-583:


When would a pipeline author try and use the TestDataflowRunner?
Why wouldn't they just use TestPipeline directly?

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435739#comment-15435739
 ] 

Mark Liu commented on BEAM-583:
---

Using {code}--runner=TestDataflowRunner{code} will cause the following error, 
which may confuse people.

{code}
java.lang.IllegalArgumentException: Unknown 'runner' specified 
'TestDataflowRunner', supported pipeline runners [BlockingDataflowRunner, 
DataflowRunner, DirectRunner, FlinkRunner, SparkRunner, TestFlinkRunner, 
TestSparkRunner]
{code}

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435727#comment-15435727
 ] 

Luke Cwik commented on BEAM-583:


I understand what is being done; I'm asking whether this is something we really 
want, and what was driving the idea for this change.


> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435710#comment-15435710
 ] 

Mark Liu commented on BEAM-583:
---

I haven't figured out how to use --help, but according to the 
PipelineOptionsFactory code, I find that only registered options will be listed.

I intend to register TestDataflowRunner in DataflowPipelineRegistrar if it makes 
sense to you.
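
A minimal sketch of what that registration could look like, assuming Beam's
AutoService-based PipelineRunnerRegistrar mechanism (the class and file names
below are illustrative, not the final change):

{code}
package org.apache.beam.runners.dataflow.testing;

import com.google.auto.service.AutoService;
import com.google.common.collect.ImmutableList;
import org.apache.beam.sdk.PipelineRunner;
import org.apache.beam.sdk.runners.PipelineRunnerRegistrar;

/** Registers TestDataflowRunner so that --runner=TestDataflowRunner resolves by simple name. */
@AutoService(PipelineRunnerRegistrar.class)
public class TestDataflowRunnerRegistrar implements PipelineRunnerRegistrar {
  @Override
  public Iterable<Class<? extends PipelineRunner<?>>> getPipelineRunners() {
    // PipelineOptionsFactory discovers registrars via ServiceLoader and matches the
    // --runner value against the simple class names returned here.
    return ImmutableList.<Class<? extends PipelineRunner<?>>>of(TestDataflowRunner.class);
  }
}
{code}

With such a registrar on the classpath, --runner=TestDataflowRunner should resolve 
the same way TestFlinkRunner and TestSparkRunner already do.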

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify the runner argument when using TestDataflowRunner to run end-to-end 
> tests against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.





[07/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionTuple.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionTuple.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionTuple.java
index b44499b..f6776f0 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionTuple.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionTuple.java
@@ -17,6 +17,11 @@
  */
 package org.apache.beam.sdk.values;
 
+import com.google.common.collect.ImmutableMap;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.LinkedHashMap;
+import java.util.Map;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.transforms.AppliedPTransform;
 import org.apache.beam.sdk.transforms.PTransform;
@@ -24,13 +29,6 @@ import org.apache.beam.sdk.transforms.ParDo;
 import org.apache.beam.sdk.util.WindowingStrategy;
 import org.apache.beam.sdk.values.PCollection.IsBounded;
 
-import com.google.common.collect.ImmutableMap;
-
-import java.util.Collection;
-import java.util.Collections;
-import java.util.LinkedHashMap;
-import java.util.Map;
-
 /**
  * A {@link PCollectionTuple} is an immutable tuple of
  * heterogeneously-typed {@link PCollection PCollections}, "keyed" by

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionView.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionView.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionView.java
index 20f1071..0e5f594 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionView.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PCollectionView.java
@@ -17,6 +17,7 @@
  */
 package org.apache.beam.sdk.values;
 
+import java.io.Serializable;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.transforms.ParDo;
 import org.apache.beam.sdk.transforms.View;
@@ -24,8 +25,6 @@ import org.apache.beam.sdk.transforms.ViewFn;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.util.WindowingStrategy;
 
-import java.io.Serializable;
-
 /**
  * A {@link PCollectionView PCollectionViewT} is an immutable view of 
a {@link PCollection}
  * as a value of type {@code T} that can be accessed

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PDone.java
--
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PDone.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PDone.java
index 7c05703..83d6a92 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PDone.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PDone.java
@@ -17,11 +17,10 @@
  */
 package org.apache.beam.sdk.values;
 
-import org.apache.beam.sdk.Pipeline;
-import org.apache.beam.sdk.transforms.PTransform;
-
 import java.util.Collection;
 import java.util.Collections;
+import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.transforms.PTransform;
 
 /**
  * {@link PDone} is the output of a {@link PTransform} that has a trivial 
result,

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PInput.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PInput.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PInput.java
index 3faf6b9..98987cd 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PInput.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/PInput.java
@@ -17,9 +17,8 @@
  */
 package org.apache.beam.sdk.values;
 
-import org.apache.beam.sdk.Pipeline;
-
 import java.util.Collection;
+import org.apache.beam.sdk.Pipeline;
 
 /**
  * The interface for things that might be input to a

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/values/POutput.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/POutput.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/POutput.java
index 299d55d..6be9215 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/values/POutput.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/values/POutput.java
@@ -17,12 +17,11 @@
  */
 package org.apache.beam.sdk.values;
 
+import java.util.Collection;
 import org.apache.beam.sdk.Pipeline;
 import 

[03/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/datastore/DatastoreV1.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/datastore/DatastoreV1.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/datastore/DatastoreV1.java
index 852595a..c7433d3 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/datastore/DatastoreV1.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/datastore/DatastoreV1.java
@@ -30,27 +30,6 @@ import static 
com.google.datastore.v1.client.DatastoreHelper.makeOrder;
 import static com.google.datastore.v1.client.DatastoreHelper.makeUpsert;
 import static com.google.datastore.v1.client.DatastoreHelper.makeValue;
 
-import org.apache.beam.sdk.annotations.Experimental;
-import org.apache.beam.sdk.options.GcpOptions;
-import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.beam.sdk.transforms.Create;
-import org.apache.beam.sdk.transforms.DoFn;
-import org.apache.beam.sdk.transforms.Flatten;
-import org.apache.beam.sdk.transforms.GroupByKey;
-import org.apache.beam.sdk.transforms.MapElements;
-import org.apache.beam.sdk.transforms.PTransform;
-import org.apache.beam.sdk.transforms.ParDo;
-import org.apache.beam.sdk.transforms.SimpleFunction;
-import org.apache.beam.sdk.transforms.Values;
-import org.apache.beam.sdk.transforms.display.DisplayData;
-import org.apache.beam.sdk.transforms.display.DisplayData.Builder;
-import org.apache.beam.sdk.util.AttemptBoundedExponentialBackOff;
-import org.apache.beam.sdk.util.RetryHttpRequestInitializer;
-import org.apache.beam.sdk.values.KV;
-import org.apache.beam.sdk.values.PBegin;
-import org.apache.beam.sdk.values.PCollection;
-import org.apache.beam.sdk.values.PDone;
-
 import com.google.api.client.auth.oauth2.Credential;
 import com.google.api.client.util.BackOff;
 import com.google.api.client.util.BackOffUtils;
@@ -76,16 +55,34 @@ import com.google.datastore.v1.client.DatastoreHelper;
 import com.google.datastore.v1.client.DatastoreOptions;
 import com.google.datastore.v1.client.QuerySplitter;
 import com.google.protobuf.Int32Value;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
 import java.io.IOException;
 import java.io.Serializable;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.NoSuchElementException;
 import javax.annotation.Nullable;
+import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.options.GcpOptions;
+import org.apache.beam.sdk.options.PipelineOptions;
+import org.apache.beam.sdk.transforms.Create;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.Flatten;
+import org.apache.beam.sdk.transforms.GroupByKey;
+import org.apache.beam.sdk.transforms.MapElements;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ParDo;
+import org.apache.beam.sdk.transforms.SimpleFunction;
+import org.apache.beam.sdk.transforms.Values;
+import org.apache.beam.sdk.transforms.display.DisplayData;
+import org.apache.beam.sdk.transforms.display.DisplayData.Builder;
+import org.apache.beam.sdk.util.AttemptBoundedExponentialBackOff;
+import org.apache.beam.sdk.util.RetryHttpRequestInitializer;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.sdk.values.PBegin;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.PDone;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * {@link DatastoreV1} provides an API to Read, Write and Delete {@link 
PCollection PCollections}

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryAvroUtilsTest.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryAvroUtilsTest.java
 
b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryAvroUtilsTest.java
index 316392f..59cf1f7 100644
--- 
a/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryAvroUtilsTest.java
+++ 
b/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryAvroUtilsTest.java
@@ -19,25 +19,22 @@ package org.apache.beam.sdk.io.gcp.bigquery;
 
 import static org.junit.Assert.assertEquals;
 
-import org.apache.beam.sdk.coders.AvroCoder;
-import org.apache.beam.sdk.coders.DefaultCoder;
-
 import com.google.api.services.bigquery.model.TableFieldSchema;
 import com.google.api.services.bigquery.model.TableRow;
 import com.google.api.services.bigquery.model.TableSchema;
 import com.google.common.collect.Lists;

[GitHub] incubator-beam pull request #869: Remove legacy import order

2016-08-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/869




[12/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/MonitoringUtil.java
--
diff --git 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/MonitoringUtil.java
 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/MonitoringUtil.java
index 4d12e66..d014623 100644
--- 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/MonitoringUtil.java
+++ 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/MonitoringUtil.java
@@ -19,9 +19,6 @@ package org.apache.beam.runners.dataflow.util;
 
 import static org.apache.beam.runners.dataflow.util.TimeUtil.fromCloudTime;
 
-import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
-import org.apache.beam.sdk.PipelineResult.State;
-
 import com.google.api.services.dataflow.Dataflow;
 import com.google.api.services.dataflow.Dataflow.Projects.Jobs.Messages;
 import com.google.api.services.dataflow.model.JobMessage;
@@ -29,11 +26,6 @@ import 
com.google.api.services.dataflow.model.ListJobMessagesResponse;
 import com.google.common.base.MoreObjects;
 import com.google.common.base.Strings;
 import com.google.common.collect.ImmutableMap;
-
-import org.joda.time.Instant;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
 import java.io.IOException;
 import java.io.UnsupportedEncodingException;
 import java.net.URLEncoder;
@@ -42,8 +34,12 @@ import java.util.Collections;
 import java.util.Comparator;
 import java.util.List;
 import java.util.Map;
-
 import javax.annotation.Nullable;
+import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
+import org.apache.beam.sdk.PipelineResult.State;
+import org.joda.time.Instant;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * A helper class for monitoring jobs submitted to the service.

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/PackageUtil.java
--
diff --git 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/PackageUtil.java
 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/PackageUtil.java
index cff7e2b..bf1f666 100644
--- 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/PackageUtil.java
+++ 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/PackageUtil.java
@@ -17,11 +17,7 @@
  */
 package org.apache.beam.runners.dataflow.util;
 
-import org.apache.beam.sdk.util.AttemptBoundedExponentialBackOff;
-import org.apache.beam.sdk.util.IOChannelUtils;
-import org.apache.beam.sdk.util.MimeTypes;
-import org.apache.beam.sdk.util.ZipFiles;
-
+import com.fasterxml.jackson.core.Base64Variants;
 import com.google.api.client.util.BackOffUtils;
 import com.google.api.client.util.Sleeper;
 import com.google.api.services.dataflow.model.DataflowPackage;
@@ -31,12 +27,6 @@ import com.google.common.hash.Hasher;
 import com.google.common.hash.Hashing;
 import com.google.common.io.CountingOutputStream;
 import com.google.common.io.Files;
-
-import com.fasterxml.jackson.core.Base64Variants;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
 import java.io.File;
 import java.io.FileNotFoundException;
 import java.io.IOException;
@@ -47,6 +37,12 @@ import java.util.ArrayList;
 import java.util.Collection;
 import java.util.List;
 import java.util.Objects;
+import org.apache.beam.sdk.util.AttemptBoundedExponentialBackOff;
+import org.apache.beam.sdk.util.IOChannelUtils;
+import org.apache.beam.sdk.util.MimeTypes;
+import org.apache.beam.sdk.util.ZipFiles;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /** Helper routines for packages. */
 public class PackageUtil {

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/RandomAccessData.java
--
diff --git 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/RandomAccessData.java
 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/RandomAccessData.java
index 9e10242..683e16b 100644
--- 
a/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/RandomAccessData.java
+++ 
b/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/util/RandomAccessData.java
@@ -20,26 +20,22 @@ package org.apache.beam.runners.dataflow.util;
 import static 

[17/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
Optimize imports


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/c623a271
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/c623a271
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/c623a271

Branch: refs/heads/master
Commit: c623a271a3f58a834a803cf9d3b5e5c0920caea7
Parents: 5776b93
Author: bchambers 
Authored: Wed Aug 24 13:01:03 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 13:39:04 2016 -0700

--
 .../beam/examples/DebuggingWordCount.java   |   8 +-
 .../apache/beam/examples/WindowedWordCount.java |  17 ++-
 .../org/apache/beam/examples/WordCount.java |   8 +-
 .../common/ExampleBigQueryTableOptions.java |   3 +-
 .../beam/examples/common/ExampleOptions.java|   7 +-
 .../beam/examples/common/ExampleUtils.java  |  14 +--
 .../examples/common/PubsubFileInjector.java |  14 +--
 .../beam/examples/complete/AutoComplete.java|  33 +++---
 .../examples/complete/StreamingWordExtract.java |  12 +-
 .../apache/beam/examples/complete/TfIdf.java|  14 +--
 .../examples/complete/TopWikipediaSessions.java |   7 +-
 .../examples/complete/TrafficMaxLaneFlow.java   |  19 ++--
 .../beam/examples/complete/TrafficRoutes.java   |  29 +++--
 .../examples/cookbook/BigQueryTornadoes.java|  12 +-
 .../cookbook/CombinePerKeyExamples.java |  12 +-
 .../examples/cookbook/DatastoreWordCount.java   |  18 ++-
 .../beam/examples/cookbook/FilterExamples.java  |  14 +--
 .../beam/examples/cookbook/JoinExamples.java|   3 +-
 .../examples/cookbook/MaxPerKeyExamples.java|  12 +-
 .../beam/examples/cookbook/TriggerExample.java  |  17 ++-
 .../beam/examples/DebuggingWordCountTest.java   |   6 +-
 .../org/apache/beam/examples/WordCountIT.java   |   4 +-
 .../org/apache/beam/examples/WordCountTest.java |   6 +-
 .../examples/complete/AutoCompleteTest.java |  12 +-
 .../beam/examples/complete/TfIdfTest.java   |   6 +-
 .../complete/TopWikipediaSessionsTest.java  |   7 +-
 .../examples/cookbook/BigQueryTornadoesIT.java  |   1 -
 .../cookbook/BigQueryTornadoesTest.java |   7 +-
 .../cookbook/CombinePerKeyExamplesTest.java |   7 +-
 .../examples/cookbook/DeDupExampleTest.java |   6 +-
 .../examples/cookbook/FilterExamplesTest.java   |   9 +-
 .../examples/cookbook/JoinExamplesTest.java |   9 +-
 .../cookbook/MaxPerKeyExamplesTest.java |   9 +-
 .../examples/cookbook/TriggerExampleTest.java   |  17 ++-
 .../beam/examples/MinimalWordCountJava8.java|   3 +-
 .../beam/examples/complete/game/GameStats.java  |   8 +-
 .../examples/complete/game/HourlyTeamScore.java |   8 +-
 .../examples/complete/game/LeaderBoard.java |   8 +-
 .../beam/examples/complete/game/UserScore.java  |   8 +-
 .../complete/game/injector/Injector.java|   8 +-
 .../complete/game/injector/InjectorUtils.java   |   1 -
 .../injector/RetryHttpInitializerWrapper.java   |   1 -
 .../complete/game/utils/WriteToBigQuery.java|  18 ++-
 .../game/utils/WriteWindowedToBigQuery.java |   6 +-
 .../examples/MinimalWordCountJava8Test.java |  21 ++--
 .../examples/complete/game/GameStatsTest.java   |   8 +-
 .../complete/game/HourlyTeamScoreTest.java  |   8 +-
 .../examples/complete/game/UserScoreTest.java   |   8 +-
 .../beam/runners/core/SideInputHandler.java |  17 ++-
 .../core/UnboundedReadFromBoundedSource.java|  37 +++
 .../apache/beam/sdk/util/AssignWindowsDoFn.java |   7 +-
 .../beam/sdk/util/BatchTimerInternals.java  |   8 +-
 .../apache/beam/sdk/util/DoFnRunnerBase.java|  22 ++--
 .../org/apache/beam/sdk/util/DoFnRunners.java   |   3 +-
 .../GroupAlsoByWindowsViaOutputBufferDoFn.java  |   7 +-
 .../sdk/util/GroupByKeyViaGroupByKeyOnly.java   |   9 +-
 .../sdk/util/LateDataDroppingDoFnRunner.java|  10 +-
 .../apache/beam/sdk/util/PaneInfoTracker.java   |   6 +-
 .../sdk/util/PushbackSideInputDoFnRunner.java   |   6 +-
 .../java/org/apache/beam/sdk/util/ReduceFn.java |   4 +-
 .../beam/sdk/util/ReduceFnContextFactory.java   |  12 +-
 .../apache/beam/sdk/util/ReduceFnRunner.java|  26 ++---
 .../apache/beam/sdk/util/SimpleDoFnRunner.java  |   2 +-
 .../apache/beam/sdk/util/SystemReduceFn.java|   3 +-
 .../org/apache/beam/sdk/util/TriggerRunner.java |  16 +--
 .../org/apache/beam/sdk/util/WatermarkHold.java |  12 +-
 .../beam/runners/core/SideInputHandlerTest.java |   4 +-
 .../UnboundedReadFromBoundedSourceTest.java |  27 ++---
 .../beam/sdk/util/BatchTimerInternalsTest.java  |   1 -
 .../sdk/util/GroupAlsoByWindowsProperties.java  |  19 ++--
 ...oupAlsoByWindowsViaOutputBufferDoFnTest.java |   1 -
 .../util/LateDataDroppingDoFnRunnerTest.java|   9 +-
 .../util/PushbackSideInputDoFnRunnerTest.java   |   9 +-
 .../beam/sdk/util/ReduceFnRunnerTest.java   |  13 +--
 

[06/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/io/FileBasedSourceTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/FileBasedSourceTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/FileBasedSourceTest.java
index c9f4079..5208910 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/FileBasedSourceTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/FileBasedSourceTest.java
@@ -21,7 +21,6 @@ import static 
org.apache.beam.sdk.testing.SourceTestUtils.assertSplitAtFractionE
 import static 
org.apache.beam.sdk.testing.SourceTestUtils.assertSplitAtFractionFails;
 import static 
org.apache.beam.sdk.testing.SourceTestUtils.assertSplitAtFractionSucceedsAndConsistent;
 import static org.apache.beam.sdk.testing.SourceTestUtils.readFromSource;
-
 import static org.hamcrest.Matchers.containsInAnyOrder;
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertThat;
@@ -29,6 +28,20 @@ import static org.junit.Assert.assertTrue;
 import static org.junit.Assert.fail;
 import static org.mockito.Mockito.when;
 
+import com.google.common.collect.ImmutableList;
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.nio.ByteBuffer;
+import java.nio.channels.ReadableByteChannel;
+import java.nio.channels.SeekableByteChannel;
+import java.nio.charset.StandardCharsets;
+import java.nio.file.Files;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+import java.util.NoSuchElementException;
+import java.util.Random;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
@@ -43,9 +56,6 @@ import org.apache.beam.sdk.util.CoderUtils;
 import org.apache.beam.sdk.util.IOChannelFactory;
 import org.apache.beam.sdk.util.IOChannelUtils;
 import org.apache.beam.sdk.values.PCollection;
-
-import com.google.common.collect.ImmutableList;
-
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
@@ -54,20 +64,6 @@ import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 import org.mockito.Mockito;
 
-import java.io.ByteArrayOutputStream;
-import java.io.File;
-import java.io.IOException;
-import java.nio.ByteBuffer;
-import java.nio.channels.ReadableByteChannel;
-import java.nio.channels.SeekableByteChannel;
-import java.nio.charset.StandardCharsets;
-import java.nio.file.Files;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.List;
-import java.util.NoSuchElementException;
-import java.util.Random;
-
 /**
  * Tests code common to all file-based sources.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/io/OffsetBasedSourceTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/OffsetBasedSourceTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/OffsetBasedSourceTest.java
index f689f51..923b4b4 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/OffsetBasedSourceTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/OffsetBasedSourceTest.java
@@ -19,29 +19,26 @@ package org.apache.beam.sdk.io;
 
 import static 
org.apache.beam.sdk.testing.SourceTestUtils.assertSplitAtFractionExhaustive;
 import static org.apache.beam.sdk.testing.SourceTestUtils.readFromSource;
-
 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertFalse;
 import static org.junit.Assert.assertNotNull;
 import static org.junit.Assert.assertNull;
 import static org.junit.Assert.assertTrue;
 
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.NoSuchElementException;
 import org.apache.beam.sdk.coders.BigEndianIntegerCoder;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.io.BoundedSource.BoundedReader;
 import org.apache.beam.sdk.io.OffsetBasedSource.OffsetBasedReader;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
-
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.List;
-import java.util.NoSuchElementException;
-
 /**
  * Tests code common to all offset-based sources.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/io/PubsubIOTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/io/PubsubIOTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/io/PubsubIOTest.java

[14/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/test/java/org/apache/beam/runners/direct/SideInputContainerTest.java
--
diff --git 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/SideInputContainerTest.java
 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/SideInputContainerTest.java
index ec589da..cc7d88a 100644
--- 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/SideInputContainerTest.java
+++ 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/SideInputContainerTest.java
@@ -25,6 +25,13 @@ import static org.junit.Assert.assertThat;
 import static org.junit.Assert.fail;
 import static org.mockito.Mockito.doAnswer;
 
+import com.google.common.collect.ImmutableList;
+import java.util.Map;
+import java.util.concurrent.Callable;
+import java.util.concurrent.CountDownLatch;
+import java.util.concurrent.Executors;
+import java.util.concurrent.Future;
+import java.util.concurrent.TimeUnit;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.testing.TestPipeline;
@@ -44,9 +51,6 @@ import org.apache.beam.sdk.util.WindowingStrategy;
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PCollectionView;
-
-import com.google.common.collect.ImmutableList;
-
 import org.joda.time.Instant;
 import org.junit.Before;
 import org.junit.Rule;
@@ -60,13 +64,6 @@ import org.mockito.MockitoAnnotations;
 import org.mockito.invocation.InvocationOnMock;
 import org.mockito.stubbing.Answer;
 
-import java.util.Map;
-import java.util.concurrent.Callable;
-import java.util.concurrent.CountDownLatch;
-import java.util.concurrent.Executors;
-import java.util.concurrent.Future;
-import java.util.concurrent.TimeUnit;
-
 /**
  * Tests for {@link SideInputContainer}.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StepTransformResultTest.java
--
diff --git 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StepTransformResultTest.java
 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StepTransformResultTest.java
index cfc69bc..c06eff9 100644
--- 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StepTransformResultTest.java
+++ 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StepTransformResultTest.java
@@ -29,7 +29,6 @@ import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.AppliedPTransform;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.values.PCollection;
-
 import org.hamcrest.Matchers;
 import org.junit.Before;
 import org.junit.Test;

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StructuralKeyTest.java
--
diff --git 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StructuralKeyTest.java
 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StructuralKeyTest.java
index 26514f0..18aeac6 100644
--- 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StructuralKeyTest.java
+++ 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/StructuralKeyTest.java
@@ -27,7 +27,6 @@ import static org.junit.Assert.assertThat;
 import org.apache.beam.sdk.coders.ByteArrayCoder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
 import org.apache.beam.sdk.coders.VarIntCoder;
-
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/test/java/org/apache/beam/runners/direct/TransformExecutorServicesTest.java
--
diff --git 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/TransformExecutorServicesTest.java
 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/TransformExecutorServicesTest.java
index 04aa96f..b085723 100644
--- 
a/runners/direct-java/src/test/java/org/apache/beam/runners/direct/TransformExecutorServicesTest.java
+++ 
b/runners/direct-java/src/test/java/org/apache/beam/runners/direct/TransformExecutorServicesTest.java
@@ -22,7 +22,7 @@ import static org.mockito.Mockito.never;
 import static org.mockito.Mockito.verify;
 
 import com.google.common.util.concurrent.MoreExecutors;
-
+import java.util.concurrent.ExecutorService;
 import org.junit.Before;
 import org.junit.Rule;
 import org.junit.Test;
@@ -30,8 +30,6 @@ import org.junit.rules.ExpectedException;
 import org.junit.runner.RunWith;
 

[13/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputFormat.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputFormat.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputFormat.java
index 1d06b1a..443378f 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputFormat.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputFormat.java
@@ -17,6 +17,8 @@
  */
 package org.apache.beam.runners.flink.translation.wrappers;
 
+import java.io.IOException;
+import java.util.List;
 import 
org.apache.beam.runners.flink.translation.utils.SerializedPipelineOptions;
 import org.apache.beam.sdk.io.BoundedSource;
 import org.apache.beam.sdk.io.Source;
@@ -24,7 +26,6 @@ import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.windowing.GlobalWindow;
 import org.apache.beam.sdk.transforms.windowing.PaneInfo;
 import org.apache.beam.sdk.util.WindowedValue;
-
 import org.apache.flink.api.common.io.DefaultInputSplitAssigner;
 import org.apache.flink.api.common.io.InputFormat;
 import org.apache.flink.api.common.io.statistics.BaseStatistics;
@@ -34,9 +35,6 @@ import org.joda.time.Instant;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import java.io.IOException;
-import java.util.List;
-
 
 /**
  * Wrapper for executing a {@link Source} as a Flink {@link InputFormat}.

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputSplit.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputSplit.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputSplit.java
index c3672c0..e4a7386 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputSplit.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/SourceInputSplit.java
@@ -18,7 +18,6 @@
 package org.apache.beam.runners.flink.translation.wrappers;
 
 import org.apache.beam.sdk.io.Source;
-
 import org.apache.flink.core.io.InputSplit;
 
 /**

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
index 092a226..000d69f 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
@@ -18,7 +18,15 @@
 package org.apache.beam.runners.flink.translation.wrappers.streaming;
 
 import avro.shaded.com.google.common.base.Preconditions;
-
+import com.google.common.collect.Iterables;
+import java.io.IOException;
+import java.io.Serializable;
+import java.nio.ByteBuffer;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
 import org.apache.beam.runners.core.SideInputHandler;
 import 
org.apache.beam.runners.flink.translation.utils.SerializedPipelineOptions;
 import 
org.apache.beam.runners.flink.translation.wrappers.SerializableFnAggregatorWrapper;
@@ -44,9 +52,6 @@ import org.apache.beam.sdk.util.WindowingStrategy;
 import org.apache.beam.sdk.util.state.StateInternals;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.TupleTag;
-
-import com.google.common.collect.Iterables;
-
 import org.apache.flink.api.common.ExecutionConfig;
 import org.apache.flink.api.common.functions.ReduceFunction;
 import org.apache.flink.api.common.state.ListState;
@@ -69,15 +74,6 @@ import org.apache.flink.streaming.api.watermark.Watermark;
 import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
 import org.apache.flink.streaming.runtime.tasks.StreamTaskState;
 
-import java.io.IOException;
-import java.io.Serializable;
-import java.nio.ByteBuffer;
-import java.util.ArrayList;
-import java.util.Collection;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-
 /**
  * Flink operator for executing {@link DoFn DoFns}.

[09/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFn.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFn.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFn.java
index 2348783..9f89826 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFn.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFn.java
@@ -21,6 +21,14 @@ import static 
com.google.common.base.Preconditions.checkArgument;
 import static com.google.common.base.Preconditions.checkNotNull;
 import static com.google.common.base.Preconditions.checkState;
 
+import java.io.Serializable;
+import java.lang.annotation.Documented;
+import java.lang.annotation.ElementType;
+import java.lang.annotation.Retention;
+import java.lang.annotation.RetentionPolicy;
+import java.lang.annotation.Target;
+import java.util.HashMap;
+import java.util.Map;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.Combine.CombineFn;
 import org.apache.beam.sdk.transforms.OldDoFn.DelegatingAggregator;
@@ -32,19 +40,9 @@ import org.apache.beam.sdk.transforms.windowing.PaneInfo;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.TupleTag;
 import org.apache.beam.sdk.values.TypeDescriptor;
-
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 
-import java.io.Serializable;
-import java.lang.annotation.Documented;
-import java.lang.annotation.ElementType;
-import java.lang.annotation.Retention;
-import java.lang.annotation.RetentionPolicy;
-import java.lang.annotation.Target;
-import java.util.HashMap;
-import java.util.Map;
-
 /**
  * The argument to {@link ParDo} providing the code to use to process
  * elements of the input

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnAdapters.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnAdapters.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnAdapters.java
index 71a148f..4803d77 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnAdapters.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnAdapters.java
@@ -17,6 +17,7 @@
  */
 package org.apache.beam.sdk.transforms;
 
+import java.io.IOException;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.display.DisplayData;
 import org.apache.beam.sdk.transforms.reflect.DoFnInvoker;
@@ -28,12 +29,9 @@ import org.apache.beam.sdk.transforms.windowing.PaneInfo;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.TupleTag;
 import org.apache.beam.sdk.values.TypeDescriptor;
-
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 
-import java.io.IOException;
-
 /**
  * Utility class containing adapters for running a {@link DoFn} as an {@link 
OldDoFn}.
  *

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnTester.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnTester.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnTester.java
index 4cd410a..82c1293 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnTester.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnTester.java
@@ -17,6 +17,19 @@
  */
 package org.apache.beam.sdk.transforms;
 
+import com.google.common.base.Function;
+import com.google.common.base.MoreObjects;
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.Iterables;
+import com.google.common.collect.Lists;
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
 import org.apache.beam.sdk.annotations.Experimental;
 import org.apache.beam.sdk.coders.Coder;
 import org.apache.beam.sdk.options.PipelineOptions;
@@ -34,20 +47,7 @@ import org.apache.beam.sdk.util.state.StateInternals;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.TimestampedValue;
 import org.apache.beam.sdk.values.TupleTag;
-import com.google.common.base.Function;
-import com.google.common.base.MoreObjects;
-import com.google.common.collect.ImmutableList;
-import com.google.common.collect.Iterables;
-import com.google.common.collect.Lists;
 import org.joda.time.Instant;
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.Arrays;
-import 

[15/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
--
diff --git 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
index 68184de..b2d61c3 100644
--- 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
+++ 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectRunner.java
@@ -17,6 +17,17 @@
  */
 package org.apache.beam.runners.direct;
 
+import com.google.common.base.MoreObjects;
+import com.google.common.base.Supplier;
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.ImmutableSet;
+import java.io.IOException;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.concurrent.ExecutorService;
+import java.util.concurrent.Executors;
 import org.apache.beam.runners.direct.DirectGroupByKey.DirectGroupByKeyOnly;
 import org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult;
 import 
org.apache.beam.runners.direct.TestStreamEvaluatorFactory.DirectTestStreamFactory;
@@ -46,23 +57,9 @@ import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.PInput;
 import org.apache.beam.sdk.values.POutput;
 import org.apache.beam.sdk.values.PValue;
-
-import com.google.common.base.MoreObjects;
-import com.google.common.base.Supplier;
-import com.google.common.collect.ImmutableList;
-import com.google.common.collect.ImmutableMap;
-import com.google.common.collect.ImmutableSet;
-
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 
-import java.io.IOException;
-import java.util.Collection;
-import java.util.HashMap;
-import java.util.Map;
-import java.util.concurrent.ExecutorService;
-import java.util.concurrent.Executors;
-
 /**
  * An In-Memory implementation of the Dataflow Programming Model. Supports 
Unbounded
  * {@link PCollection PCollections}.

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectTimerInternals.java
--
diff --git 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectTimerInternals.java
 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectTimerInternals.java
index a4705dd..4003983 100644
--- 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectTimerInternals.java
+++ 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DirectTimerInternals.java
@@ -17,15 +17,13 @@
  */
 package org.apache.beam.runners.direct;
 
+import javax.annotation.Nullable;
 import org.apache.beam.runners.direct.WatermarkManager.TimerUpdate;
 import 
org.apache.beam.runners.direct.WatermarkManager.TimerUpdate.TimerUpdateBuilder;
 import org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks;
 import org.apache.beam.sdk.util.TimerInternals;
-
 import org.joda.time.Instant;
 
-import javax.annotation.Nullable;
-
 /**
  * An implementation of {@link TimerInternals} where all relevant data exists 
in memory.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DoFnLifecycleManager.java
--
diff --git 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DoFnLifecycleManager.java
 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DoFnLifecycleManager.java
index 3f4f2c6..0e15c18 100644
--- 
a/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DoFnLifecycleManager.java
+++ 
b/runners/direct-java/src/main/java/org/apache/beam/runners/direct/DoFnLifecycleManager.java
@@ -18,21 +18,18 @@
 
 package org.apache.beam.runners.direct;
 
-import org.apache.beam.sdk.runners.PipelineRunner;
-import org.apache.beam.sdk.transforms.DoFn;
-import org.apache.beam.sdk.transforms.OldDoFn;
-import org.apache.beam.sdk.util.SerializableUtils;
-
 import com.google.common.cache.CacheBuilder;
 import com.google.common.cache.CacheLoader;
 import com.google.common.cache.LoadingCache;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.Iterator;
+import org.apache.beam.sdk.runners.PipelineRunner;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.OldDoFn;
+import org.apache.beam.sdk.util.SerializableUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * Manages {@link DoFn} setup, teardown, and serialization.


[16/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/examples/java/src/test/java/org/apache/beam/examples/complete/TfIdfTest.java
--
diff --git 
a/examples/java/src/test/java/org/apache/beam/examples/complete/TfIdfTest.java 
b/examples/java/src/test/java/org/apache/beam/examples/complete/TfIdfTest.java
index c7ce67e..c2d654e 100644
--- 
a/examples/java/src/test/java/org/apache/beam/examples/complete/TfIdfTest.java
+++ 
b/examples/java/src/test/java/org/apache/beam/examples/complete/TfIdfTest.java
@@ -17,6 +17,8 @@
  */
 package org.apache.beam.examples.complete;
 
+import java.net.URI;
+import java.util.Arrays;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.coders.StringDelegateCoder;
 import org.apache.beam.sdk.testing.PAssert;
@@ -27,15 +29,11 @@ import org.apache.beam.sdk.transforms.Keys;
 import org.apache.beam.sdk.transforms.RemoveDuplicates;
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
-
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.net.URI;
-import java.util.Arrays;
-
 /**
  * Tests of {@link TfIdf}.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/examples/java/src/test/java/org/apache/beam/examples/complete/TopWikipediaSessionsTest.java
--
diff --git 
a/examples/java/src/test/java/org/apache/beam/examples/complete/TopWikipediaSessionsTest.java
 
b/examples/java/src/test/java/org/apache/beam/examples/complete/TopWikipediaSessionsTest.java
index d19998e..42fb06a 100644
--- 
a/examples/java/src/test/java/org/apache/beam/examples/complete/TopWikipediaSessionsTest.java
+++ 
b/examples/java/src/test/java/org/apache/beam/examples/complete/TopWikipediaSessionsTest.java
@@ -17,22 +17,19 @@
  */
 package org.apache.beam.examples.complete;
 
+import com.google.api.services.bigquery.model.TableRow;
+import java.util.Arrays;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.testing.PAssert;
 import org.apache.beam.sdk.testing.RunnableOnService;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.values.PCollection;
-
-import com.google.api.services.bigquery.model.TableRow;
-
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.util.Arrays;
-
 /** Unit tests for {@link TopWikipediaSessions}. */
 @RunWith(JUnit4.class)
 public class TopWikipediaSessionsTest {

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesIT.java
--
diff --git 
a/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesIT.java
 
b/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesIT.java
index fbd775c..8bcab4a 100644
--- 
a/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesIT.java
+++ 
b/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesIT.java
@@ -21,7 +21,6 @@ package org.apache.beam.examples.cookbook;
 import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.testing.TestPipelineOptions;
-
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesTest.java
--
diff --git 
a/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesTest.java
 
b/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesTest.java
index b986c0b..87e1614 100644
--- 
a/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesTest.java
+++ 
b/examples/java/src/test/java/org/apache/beam/examples/cookbook/BigQueryTornadoesTest.java
@@ -17,21 +17,18 @@
  */
 package org.apache.beam.examples.cookbook;
 
+import com.google.api.services.bigquery.model.TableRow;
+import java.util.List;
 import org.apache.beam.examples.cookbook.BigQueryTornadoes.ExtractTornadoesFn;
 import org.apache.beam.examples.cookbook.BigQueryTornadoes.FormatCountsFn;
 import org.apache.beam.sdk.transforms.DoFnTester;
 import org.apache.beam.sdk.values.KV;
-
-import com.google.api.services.bigquery.model.TableRow;
-
 import org.hamcrest.CoreMatchers;
 import org.junit.Assert;
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.util.List;
-
 /**
  * Test 

[11/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/spark/src/test/java/org/apache/beam/runners/spark/io/hadoop/ShardNameBuilderTest.java
--
diff --git 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/io/hadoop/ShardNameBuilderTest.java
 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/io/hadoop/ShardNameBuilderTest.java
index e1620db..1f2cf63 100644
--- 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/io/hadoop/ShardNameBuilderTest.java
+++ 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/io/hadoop/ShardNameBuilderTest.java
@@ -23,7 +23,6 @@ import static 
org.apache.beam.runners.spark.io.hadoop.ShardNameBuilder.getOutput
 import static 
org.apache.beam.runners.spark.io.hadoop.ShardNameBuilder.getOutputFileTemplate;
 import static 
org.apache.beam.runners.spark.io.hadoop.ShardNameBuilder.replaceShardCount;
 import static 
org.apache.beam.runners.spark.io.hadoop.ShardNameBuilder.replaceShardNumber;
-
 import static org.junit.Assert.assertEquals;
 
 import org.junit.Test;

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombineGloballyTest.java
--
diff --git 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombineGloballyTest.java
 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombineGloballyTest.java
index e4ef7d7..8022d06 100644
--- 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombineGloballyTest.java
+++ 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombineGloballyTest.java
@@ -20,6 +20,9 @@ package org.apache.beam.runners.spark.translation;
 
 import static org.junit.Assert.assertEquals;
 
+import com.google.common.collect.Iterables;
+import java.util.Arrays;
+import java.util.List;
 import org.apache.beam.runners.spark.EvaluationResult;
 import org.apache.beam.runners.spark.SparkPipelineOptions;
 import org.apache.beam.runners.spark.SparkRunner;
@@ -29,14 +32,8 @@ import org.apache.beam.sdk.options.PipelineOptionsFactory;
 import org.apache.beam.sdk.transforms.Combine;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.values.PCollection;
-
-import com.google.common.collect.Iterables;
-
 import org.junit.Test;
 
-import java.util.Arrays;
-import java.util.List;
-
 /**
  * Combine globally test.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombinePerKeyTest.java
--
diff --git 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombinePerKeyTest.java
 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombinePerKeyTest.java
index cdf2cfb..281144f 100644
--- 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombinePerKeyTest.java
+++ 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/CombinePerKeyTest.java
@@ -18,6 +18,10 @@
 
 package org.apache.beam.runners.spark.translation;
 
+import com.google.common.collect.ImmutableList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
 import org.apache.beam.runners.spark.EvaluationResult;
 import org.apache.beam.runners.spark.SparkRunner;
 import org.apache.beam.sdk.Pipeline;
@@ -33,16 +37,9 @@ import org.apache.beam.sdk.transforms.ParDo;
 import org.apache.beam.sdk.transforms.Sum;
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PCollection;
-
-import com.google.common.collect.ImmutableList;
-
 import org.junit.Assert;
 import org.junit.Test;
 
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-
 /**
  * Combine per key function test.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/DoFnOutputTest.java
--
diff --git 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/DoFnOutputTest.java
 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/DoFnOutputTest.java
index e4b25bb..31e0dd8 100644
--- 
a/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/DoFnOutputTest.java
+++ 
b/runners/spark/src/test/java/org/apache/beam/runners/spark/translation/DoFnOutputTest.java
@@ -18,6 +18,7 @@
 
 package org.apache.beam.runners.spark.translation;
 
+import java.io.Serializable;
 import org.apache.beam.runners.spark.EvaluationResult;
 import org.apache.beam.runners.spark.SparkPipelineOptions;
 import org.apache.beam.runners.spark.SparkRunner;
@@ -28,11 +29,8 @@ import org.apache.beam.sdk.transforms.Create;
 import 

[18/18] incubator-beam git commit: Closes #869

2016-08-24 Thread dhalperi
Closes #869


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/0f072414
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/0f072414
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/0f072414

Branch: refs/heads/master
Commit: 0f072414b58c664e030810a4f15028e6081788bd
Parents: 603f337 c623a27
Author: Dan Halperin 
Authored: Wed Aug 24 13:39:05 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 13:39:05 2016 -0700

--
 .../beam/examples/DebuggingWordCount.java   |   8 +-
 .../apache/beam/examples/WindowedWordCount.java |  17 ++-
 .../org/apache/beam/examples/WordCount.java |   8 +-
 .../common/ExampleBigQueryTableOptions.java |   3 +-
 .../beam/examples/common/ExampleOptions.java|   7 +-
 .../beam/examples/common/ExampleUtils.java  |  14 +--
 .../examples/common/PubsubFileInjector.java |  14 +--
 .../beam/examples/complete/AutoComplete.java|  33 +++---
 .../examples/complete/StreamingWordExtract.java |  12 +-
 .../apache/beam/examples/complete/TfIdf.java|  14 +--
 .../examples/complete/TopWikipediaSessions.java |   7 +-
 .../examples/complete/TrafficMaxLaneFlow.java   |  19 ++--
 .../beam/examples/complete/TrafficRoutes.java   |  29 +++--
 .../examples/cookbook/BigQueryTornadoes.java|  12 +-
 .../cookbook/CombinePerKeyExamples.java |  12 +-
 .../examples/cookbook/DatastoreWordCount.java   |  18 ++-
 .../beam/examples/cookbook/FilterExamples.java  |  14 +--
 .../beam/examples/cookbook/JoinExamples.java|   3 +-
 .../examples/cookbook/MaxPerKeyExamples.java|  12 +-
 .../beam/examples/cookbook/TriggerExample.java  |  17 ++-
 .../beam/examples/DebuggingWordCountTest.java   |   6 +-
 .../org/apache/beam/examples/WordCountIT.java   |   4 +-
 .../org/apache/beam/examples/WordCountTest.java |   6 +-
 .../examples/complete/AutoCompleteTest.java |  12 +-
 .../beam/examples/complete/TfIdfTest.java   |   6 +-
 .../complete/TopWikipediaSessionsTest.java  |   7 +-
 .../examples/cookbook/BigQueryTornadoesIT.java  |   1 -
 .../cookbook/BigQueryTornadoesTest.java |   7 +-
 .../cookbook/CombinePerKeyExamplesTest.java |   7 +-
 .../examples/cookbook/DeDupExampleTest.java |   6 +-
 .../examples/cookbook/FilterExamplesTest.java   |   9 +-
 .../examples/cookbook/JoinExamplesTest.java |   9 +-
 .../cookbook/MaxPerKeyExamplesTest.java |   9 +-
 .../examples/cookbook/TriggerExampleTest.java   |  17 ++-
 .../beam/examples/MinimalWordCountJava8.java|   3 +-
 .../beam/examples/complete/game/GameStats.java  |   8 +-
 .../examples/complete/game/HourlyTeamScore.java |   8 +-
 .../examples/complete/game/LeaderBoard.java |   8 +-
 .../beam/examples/complete/game/UserScore.java  |   8 +-
 .../complete/game/injector/Injector.java|   8 +-
 .../complete/game/injector/InjectorUtils.java   |   1 -
 .../injector/RetryHttpInitializerWrapper.java   |   1 -
 .../complete/game/utils/WriteToBigQuery.java|  18 ++-
 .../game/utils/WriteWindowedToBigQuery.java |   6 +-
 .../examples/MinimalWordCountJava8Test.java |  21 ++--
 .../examples/complete/game/GameStatsTest.java   |   8 +-
 .../complete/game/HourlyTeamScoreTest.java  |   8 +-
 .../examples/complete/game/UserScoreTest.java   |   8 +-
 .../beam/runners/core/SideInputHandler.java |  17 ++-
 .../core/UnboundedReadFromBoundedSource.java|  37 +++
 .../apache/beam/sdk/util/AssignWindowsDoFn.java |   7 +-
 .../beam/sdk/util/BatchTimerInternals.java  |   8 +-
 .../apache/beam/sdk/util/DoFnRunnerBase.java|  22 ++--
 .../org/apache/beam/sdk/util/DoFnRunners.java   |   3 +-
 .../GroupAlsoByWindowsViaOutputBufferDoFn.java  |   7 +-
 .../sdk/util/GroupByKeyViaGroupByKeyOnly.java   |   9 +-
 .../sdk/util/LateDataDroppingDoFnRunner.java|  10 +-
 .../apache/beam/sdk/util/PaneInfoTracker.java   |   6 +-
 .../sdk/util/PushbackSideInputDoFnRunner.java   |   6 +-
 .../java/org/apache/beam/sdk/util/ReduceFn.java |   4 +-
 .../beam/sdk/util/ReduceFnContextFactory.java   |  12 +-
 .../apache/beam/sdk/util/ReduceFnRunner.java|  26 ++---
 .../apache/beam/sdk/util/SimpleDoFnRunner.java  |   2 +-
 .../apache/beam/sdk/util/SystemReduceFn.java|   3 +-
 .../org/apache/beam/sdk/util/TriggerRunner.java |  16 +--
 .../org/apache/beam/sdk/util/WatermarkHold.java |  12 +-
 .../beam/runners/core/SideInputHandlerTest.java |   4 +-
 .../UnboundedReadFromBoundedSourceTest.java |  27 ++---
 .../beam/sdk/util/BatchTimerInternalsTest.java  |   1 -
 .../sdk/util/GroupAlsoByWindowsProperties.java  |  19 ++--
 ...oupAlsoByWindowsViaOutputBufferDoFnTest.java |   1 -
 .../util/LateDataDroppingDoFnRunnerTest.java|   9 +-
 .../util/PushbackSideInputDoFnRunnerTest.java   |   9 +-
 .../beam/sdk/util/ReduceFnRunnerTest.java   |  13 +--
 

[01/18] incubator-beam git commit: Update checkstyle.xml to put all imports in one group

2016-08-24 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master 603f337b1 -> 0f072414b


Update checkstyle.xml to put all imports in one group


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/5776b936
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/5776b936
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/5776b936

Branch: refs/heads/master
Commit: 5776b9365c9a786d66d4d8e9dce6e3a8060c74d2
Parents: 603f337
Author: bchambers 
Authored: Tue Aug 23 17:06:17 2016 -0700
Committer: bchambers 
Committed: Wed Aug 24 12:55:45 2016 -0700

--
 sdks/java/build-tools/src/main/resources/beam/checkstyle.xml | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)
--
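
For context, the practical effect of this change (visible throughout the accompanying "Optimize imports" diffs) is that static imports keep their own block while all other imports collapse into a single alphabetically ordered group. A minimal sketch of a test file following that layout:

import static org.junit.Assert.assertEquals;

import com.google.common.collect.ImmutableList;
import java.util.List;
import org.junit.Test;

// Example of the import layout after the checkstyle change: static imports
// first, then one alphabetized group instead of separate com.google / java /
// org blocks (pattern taken from the diffs in this thread).
public class ImportOrderExample {
  @Test
  public void listHasOneElement() {
    List<String> items = ImmutableList.of("beam");
    assertEquals(1, items.size());
  }
}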


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/5776b936/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
--
diff --git a/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml 
b/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
index 4bb7428..47ddc5b 100644
--- a/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
+++ b/sdks/java/build-tools/src/main/resources/beam/checkstyle.xml
@@ -86,11 +86,10 @@ page at http://checkstyle.sourceforge.net/config.html -->
 
 
   
-
   
-  
   
   
+  
   
   



[05/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/GroupByKeyTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/GroupByKeyTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/GroupByKeyTest.java
index afe460f..bea0e2d 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/GroupByKeyTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/GroupByKeyTest.java
@@ -19,7 +19,6 @@ package org.apache.beam.sdk.transforms;
 
 import static org.apache.beam.sdk.TestUtils.KvMatcher.isKv;
 import static 
org.apache.beam.sdk.transforms.display.DisplayDataMatchers.hasDisplayItem;
-
 import static org.hamcrest.CoreMatchers.equalTo;
 import static org.hamcrest.CoreMatchers.hasItem;
 import static org.hamcrest.Matchers.empty;
@@ -27,6 +26,19 @@ import static 
org.hamcrest.collection.IsIterableContainingInAnyOrder.containsInA
 import static org.hamcrest.core.Is.is;
 import static org.junit.Assert.assertThat;
 
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.google.common.base.Function;
+import com.google.common.collect.Iterables;
+import java.io.DataInputStream;
+import java.io.DataOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.ThreadLocalRandom;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.coders.AtomicCoder;
 import org.apache.beam.sdk.coders.BigEndianIntegerCoder;
@@ -51,11 +63,6 @@ import org.apache.beam.sdk.values.PBegin;
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.TimestampedValue;
 import org.apache.beam.sdk.values.TypeDescriptor;
-
-import com.google.common.base.Function;
-import com.google.common.collect.Iterables;
-
-import com.fasterxml.jackson.annotation.JsonCreator;
 import org.joda.time.Duration;
 import org.joda.time.Instant;
 import org.junit.Assert;
@@ -66,17 +73,6 @@ import org.junit.rules.ExpectedException;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.io.DataInputStream;
-import java.io.DataOutputStream;
-import java.io.IOException;
-import java.io.InputStream;
-import java.io.OutputStream;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.List;
-import java.util.Map;
-import java.util.concurrent.ThreadLocalRandom;
-
 /**
  * Tests for GroupByKey.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/IntraBundleParallelizationTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/IntraBundleParallelizationTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/IntraBundleParallelizationTest.java
index fa2fae9..b9afd35 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/IntraBundleParallelizationTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/IntraBundleParallelizationTest.java
@@ -20,7 +20,6 @@ package org.apache.beam.sdk.transforms;
 import static org.apache.beam.sdk.testing.SystemNanoTimeSleeper.sleepMillis;
 import static 
org.apache.beam.sdk.transforms.display.DisplayDataMatchers.hasDisplayItem;
 import static 
org.apache.beam.sdk.transforms.display.DisplayDataMatchers.includesDisplayDataFrom;
-
 import static org.hamcrest.Matchers.both;
 import static org.hamcrest.Matchers.containsString;
 import static org.hamcrest.Matchers.greaterThanOrEqualTo;
@@ -31,20 +30,18 @@ import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertThat;
 import static org.junit.Assert.fail;
 
+import java.util.ArrayList;
+import java.util.concurrent.atomic.AtomicInteger;
 import org.apache.beam.sdk.Pipeline;
 import org.apache.beam.sdk.testing.NeedsRunner;
 import org.apache.beam.sdk.testing.TestPipeline;
 import org.apache.beam.sdk.transforms.display.DisplayData;
-
 import org.junit.Before;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.util.ArrayList;
-import java.util.concurrent.atomic.AtomicInteger;
-
 /**
  * Tests for RateLimiter.
  */

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/KeysTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/KeysTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/KeysTest.java
index cf30940..fce5b2f 100644
--- 

[04/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayInputStreamTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayInputStreamTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayInputStreamTest.java
index d717caf..31cf1a8 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayInputStreamTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayInputStreamTest.java
@@ -22,13 +22,12 @@ import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNotEquals;
 import static org.junit.Assert.assertSame;
 
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.io.ByteArrayInputStream;
-import java.io.IOException;
-
 /** Unit tests for {@link ExposedByteArrayInputStream}. */
 @RunWith(JUnit4.class)
 public class ExposedByteArrayInputStreamTest {

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayOutputStreamTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayOutputStreamTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayOutputStreamTest.java
index 9819a9b..a3a7a1d 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayOutputStreamTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/ExposedByteArrayOutputStreamTest.java
@@ -22,13 +22,12 @@ import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNotSame;
 import static org.junit.Assert.assertSame;
 
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.io.ByteArrayOutputStream;
-import java.io.IOException;
-
 /** Unit tests for {@link ExposedByteArrayOutputStream}. */
 @RunWith(JUnit4.class)
 public class ExposedByteArrayOutputStreamTest {

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FileIOChannelFactoryTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FileIOChannelFactoryTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FileIOChannelFactoryTest.java
index 79e6e5c..011b4f5 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FileIOChannelFactoryTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FileIOChannelFactoryTest.java
@@ -25,15 +25,6 @@ import static org.junit.Assert.assertTrue;
 import com.google.common.collect.ImmutableList;
 import com.google.common.io.Files;
 import com.google.common.io.LineReader;
-
-import org.hamcrest.Matchers;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.rules.ExpectedException;
-import org.junit.rules.TemporaryFolder;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
-
 import java.io.File;
 import java.io.FileNotFoundException;
 import java.io.Reader;
@@ -42,6 +33,13 @@ import java.nio.channels.Channels;
 import java.nio.charset.StandardCharsets;
 import java.nio.file.Path;
 import java.util.List;
+import org.hamcrest.Matchers;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.JUnit4;
 
 /** Tests for {@link FileIOChannelFactory}. */
 @RunWith(JUnit4.class)

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FinishedTriggersSetTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FinishedTriggersSetTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FinishedTriggersSetTest.java
index b3b1856..072d264 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FinishedTriggersSetTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/util/FinishedTriggersSetTest.java
@@ -21,12 +21,11 @@ import static org.hamcrest.Matchers.not;
 import static org.hamcrest.Matchers.theInstance;
 import static org.junit.Assert.assertThat;
 
+import java.util.HashSet;
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
-import java.util.HashSet;
-
 /**
  * Tests for {@link FinishedTriggersSet}.
  */


[02/18] incubator-beam git commit: Optimize imports

2016-08-24 Thread dhalperi
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/DataflowExampleUtils.java
--
diff --git 
a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/DataflowExampleUtils.java
 
b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/DataflowExampleUtils.java
index fa29fdd..9e6be78 100644
--- 
a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/DataflowExampleUtils.java
+++ 
b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/DataflowExampleUtils.java
@@ -17,12 +17,6 @@
  */
 package ${package}.common;
 
-import org.apache.beam.runners.dataflow.BlockingDataflowRunner;
-import org.apache.beam.runners.dataflow.DataflowPipelineJob;
-import org.apache.beam.runners.dataflow.DataflowRunner;
-import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
-import org.apache.beam.runners.dataflow.util.MonitoringUtil;
-
 import com.google.api.client.googleapis.json.GoogleJsonResponseException;
 import com.google.api.client.googleapis.services.AbstractGoogleClientRequest;
 import com.google.api.services.bigquery.Bigquery;
@@ -36,21 +30,24 @@ import com.google.api.services.bigquery.model.TableSchema;
 import com.google.api.services.dataflow.Dataflow;
 import com.google.api.services.pubsub.Pubsub;
 import com.google.api.services.pubsub.model.Topic;
-import org.apache.beam.sdk.Pipeline;
-import org.apache.beam.sdk.PipelineResult;
-import org.apache.beam.sdk.io.TextIO;
-import org.apache.beam.sdk.options.BigQueryOptions;
-import org.apache.beam.sdk.transforms.IntraBundleParallelization;
-import org.apache.beam.sdk.util.Transport;
 import com.google.common.collect.Lists;
 import com.google.common.collect.Sets;
-
 import java.io.IOException;
 import java.util.Collection;
 import java.util.List;
 import java.util.Set;
-
 import javax.servlet.http.HttpServletResponse;
+import org.apache.beam.runners.dataflow.BlockingDataflowRunner;
+import org.apache.beam.runners.dataflow.DataflowPipelineJob;
+import org.apache.beam.runners.dataflow.DataflowRunner;
+import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
+import org.apache.beam.runners.dataflow.util.MonitoringUtil;
+import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.PipelineResult;
+import org.apache.beam.sdk.io.TextIO;
+import org.apache.beam.sdk.options.BigQueryOptions;
+import org.apache.beam.sdk.transforms.IntraBundleParallelization;
+import org.apache.beam.sdk.util.Transport;
 
 /**
  * The utility class that sets up and tears down external resources, starts 
the Google Cloud Pub/Sub

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/ExampleBigQueryTableOptions.java
--
diff --git 
a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/ExampleBigQueryTableOptions.java
 
b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/ExampleBigQueryTableOptions.java
index 279f2e0..79fa865 100644
--- 
a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/ExampleBigQueryTableOptions.java
+++ 
b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/ExampleBigQueryTableOptions.java
@@ -17,14 +17,13 @@
  */
 package ${package}.common;
 
+import com.google.api.services.bigquery.model.TableSchema;
 import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
 import org.apache.beam.sdk.options.Default;
 import org.apache.beam.sdk.options.DefaultValueFactory;
 import org.apache.beam.sdk.options.Description;
 import org.apache.beam.sdk.options.PipelineOptions;
 
-import com.google.api.services.bigquery.model.TableSchema;
-
 /**
  * Options that can be used to configure BigQuery tables in Dataflow examples.
  * The project defaults to the project being used to run the example.

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/c623a271/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/PubsubFileInjector.java
--
diff --git 
a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/PubsubFileInjector.java
 
b/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/PubsubFileInjector.java
index 9b347da..58e0821 100644
--- 
a/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/src/main/java/common/PubsubFileInjector.java
+++ 

[jira] [Commented] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435674#comment-15435674
 ] 

Daniel Halperin commented on BEAM-582:
--

Thanks!

> Allow usage of the new GCP service account JSON key
> ---
>
> Key: BEAM-582
> URL: https://issues.apache.org/jira/browse/BEAM-582
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Alex Van Boxel
>Assignee: Davor Bonaci
>
> The new JSON service account files are a lot easier to use; you don't need to 
> provide the accountId (as it's embedded in the JSON file, along with the 
> private key).
> I noticed this while integrating Cloud Dataflow in Apache Airflow, where I 
> upgraded the usage of the service keys. Airflow will drop support for the old 
> service files.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread Alex Van Boxel (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435675#comment-15435675
 ] 

Alex Van Boxel commented on BEAM-582:
-

I've included information in the description. I already got a review on the PR 
and will incorporate the feedback tomorrow (it's time to get some sleep now).


> Allow usage of the new GCP service account JSON key
> ---
>
> Key: BEAM-582
> URL: https://issues.apache.org/jira/browse/BEAM-582
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Alex Van Boxel
>Assignee: Davor Bonaci
>
> The new JSON service account files are a lot easier to use; you don't need to 
> provide the accountId (as it's embedded in the JSON file, along with the 
> private key).
> I noticed this while integrating Cloud Dataflow in Apache Airflow, where I 
> upgraded the usage of the service keys. Airflow will drop support for the old 
> service files.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread Alex Van Boxel (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alex Van Boxel updated BEAM-582:

Description: 
The new JSON service account files are a lot easier to use; you don't need to 
provide the accountId (as it's embedded in the JSON file, along with the 
private key).

I noticed this while integrating Cloud Dataflow in Apache Airflow, where I 
upgraded the usage of the service keys. Airflow will drop support for the old 
service files.
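
As a rough illustration (a minimal sketch assuming the google-api-client GoogleCredential API; the key path and scope below are placeholders, not values from this issue), loading such a JSON key looks roughly like:

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import java.io.FileInputStream;
import java.util.Collections;

public class JsonServiceAccountKeyExample {
  public static void main(String[] args) throws Exception {
    // The JSON key file already contains the client email and private key,
    // so no separate accountId / key-file pair has to be configured.
    GoogleCredential credential =
        GoogleCredential.fromStream(new FileInputStream("/path/to/service-account.json"))
            .createScoped(Collections.singletonList(
                "https://www.googleapis.com/auth/cloud-platform"));
    System.out.println("Loaded credential for: " + credential.getServiceAccountId());
  }
}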

> Allow usage of the new GCP service account JSON key
> ---
>
> Key: BEAM-582
> URL: https://issues.apache.org/jira/browse/BEAM-582
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Alex Van Boxel
>Assignee: Davor Bonaci
>
> The new JSON service account files are a lot easier to use; you don't need to 
> provide the accountId (as it's embedded in the JSON file, along with the 
> private key).
> I noticed this while integrating Cloud Dataflow in Apache Airflow, where I 
> upgraded the usage of the service keys. Airflow will drop support for the old 
> service files.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435667#comment-15435667
 ] 

Luke Cwik commented on BEAM-583:


Would a pipeline author actually use the TestDataflowRunner?
PipelineOptionsFactory supports a --help concept and would list test runners 
if we did this in general.

> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify option arguments when using TestDataflowRunner to run end-to-end tests 
> against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Mark Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-583?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Liu updated BEAM-583:
--
Description: 
Register TestDataflowRunner automatically.

Simplify option arguments when using TestDataflowRunner to run end-to-end tests 
against the Dataflow service. Instead of using 
{code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
 we can also use {code}--runner=TestDataflowRunner{code}.

  was:
Register TestDataflowRunner automatically.

Simplify option arguments when using TestDataflowRunner to run end-to-end tests 
against the Dataflow service. Instead of using 
"--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner", we can 
also use "--runner=TestDataflowRunner".


> Auto Register TestDataflowRunner 
> -
>
> Key: BEAM-583
> URL: https://issues.apache.org/jira/browse/BEAM-583
> Project: Beam
>  Issue Type: Improvement
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> Register TestDataflowRunner automatically.
> Simplify option arguments when using TestDataflowRunner to run end-to-end tests 
> against the Dataflow service. Instead of using 
> {code}--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner{code},
>  we can also use {code}--runner=TestDataflowRunner{code}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435641#comment-15435641
 ] 

ASF GitHub Bot commented on BEAM-582:
-

GitHub user alexvanboxel opened a pull request:

https://github.com/apache/incubator-beam/pull/879

[BEAM-582] Allow usage of the new GCP service account JSON key

Allow the usage of the more modern JSON key files on GCP. This is required for 
seamless integration planned in Apache Airflow 1.8.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/alexvanboxel/incubator-beam 
feature/BEAM-582-json-gcp-service-account

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/879.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #879


commit f0334efac1af7ceb3b87b93b94407324d3b4a814
Author: Alex Van Boxel 
Date:   2016-08-24T20:24:28Z

[BEAM-582] Allow usage of the new GCP service account JSON key




> Allow usage of the new GCP service account JSON key
> ---
>
> Key: BEAM-582
> URL: https://issues.apache.org/jira/browse/BEAM-582
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Alex Van Boxel
>Assignee: Davor Bonaci
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (BEAM-583) Auto Register TestDataflowRunner

2016-08-24 Thread Mark Liu (JIRA)
Mark Liu created BEAM-583:
-

 Summary: Auto Register TestDataflowRunner 
 Key: BEAM-583
 URL: https://issues.apache.org/jira/browse/BEAM-583
 Project: Beam
  Issue Type: Improvement
Reporter: Mark Liu
Assignee: Mark Liu


Register TestDataflowRunner automatically.

Simplify option arguments when using TestDataflowRunner to run end-to-end tests 
against the Dataflow service. Instead of using 
"--runner=org.apache.beam.runners.dataflow.testing.TestDataflowRunner", we can 
also use "--runner=TestDataflowRunner".
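
For illustration, a minimal sketch of how the short runner name would be used once the registration proposed here is in place (the project and temp location flags below are placeholders):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ShortRunnerNameExample {
  public static void main(String[] args) {
    // With the runner registered, the short name resolves just like the
    // fully qualified class name did before.
    String[] testArgs = {
        "--runner=TestDataflowRunner",       // instead of the fully qualified class name
        "--project=my-gcp-project",          // placeholder project
        "--tempLocation=gs://my-bucket/tmp"  // placeholder staging location
    };
    PipelineOptions options = PipelineOptionsFactory.fromArgs(testArgs).create();
    Pipeline pipeline = Pipeline.create(options);
    // ... build and run the end-to-end test pipeline ...
  }
}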



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #879: [BEAM-582] Allow usage of the new GCP serv...

2016-08-24 Thread alexvanboxel
GitHub user alexvanboxel opened a pull request:

https://github.com/apache/incubator-beam/pull/879

[BEAM-582] Allow usage of the new GCP service account JSON key

Allow the usage of the more modern JSON key files on GCP. This is required for 
seamless integration planned in Apache Airflow 1.8.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/alexvanboxel/incubator-beam 
feature/BEAM-582-json-gcp-service-account

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/879.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #879


commit f0334efac1af7ceb3b87b93b94407324d3b4a814
Author: Alex Van Boxel 
Date:   2016-08-24T20:24:28Z

[BEAM-582] Allow usage of the new GCP service account JSON key




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread Daniel Halperin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435574#comment-15435574
 ] 

Daniel Halperin commented on BEAM-582:
--

Hi Alex,

Can you please provide more detail on this change? I do not think there is 
sufficient context. (You can use the description field to supply details).

> Allow usage of the new GCP service account JSON key
> ---
>
> Key: BEAM-582
> URL: https://issues.apache.org/jira/browse/BEAM-582
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Alex Van Boxel
>Assignee: Davor Bonaci
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435554#comment-15435554
 ] 

Luke Cwik commented on BEAM-582:


Adopting gcloud-java-core would give us this instead of using our own 
implementation for getting service accounts, the default project, and so on.

> Allow usage of the new GCP service account JSON key
> ---
>
> Key: BEAM-582
> URL: https://issues.apache.org/jira/browse/BEAM-582
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-dataflow
>Reporter: Alex Van Boxel
>Assignee: Davor Bonaci
>






[01/17] incubator-beam git commit: Fix Exception Unwrapping in TestFlinkRunner

2016-08-24 Thread kenn
Repository: incubator-beam
Updated Branches:
  refs/heads/master a87015bfe -> 603f337b1


Fix Exception Unwrapping in TestFlinkRunner


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/695a80a7
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/695a80a7
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/695a80a7

Branch: refs/heads/master
Commit: 695a80a7df12887df088d0c094177a80edc69ae2
Parents: 79dcc6b
Author: Aljoscha Krettek 
Authored: Sat Aug 20 10:37:42 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../beam/runners/flink/TestFlinkRunner.java  | 19 ++-
 1 file changed, 14 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/695a80a7/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/TestFlinkRunner.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/TestFlinkRunner.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/TestFlinkRunner.java
index 2a82749..6a4f990 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/TestFlinkRunner.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/TestFlinkRunner.java
@@ -62,13 +62,22 @@ public class TestFlinkRunner extends 
PipelineRunner {
   FlinkRunnerResult result = delegate.run(pipeline);
 
   return result;
-} catch (RuntimeException e) {
+} catch (Throwable e) {
   // Special case hack to pull out assertion errors from PAssert; instead 
there should
   // probably be a better story along the lines of UserCodeException.
-  if (e.getCause() != null
-  && e.getCause() instanceof JobExecutionException
-  && e.getCause().getCause() instanceof AssertionError) {
-  throw (AssertionError) e.getCause().getCause();
+  Throwable cause = e;
+  Throwable oldCause = e;
+  do {
+if (cause.getCause() == null) {
+  break;
+}
+
+oldCause = cause;
+cause = cause.getCause();
+
+  } while (!oldCause.equals(cause));
+  if (cause instanceof AssertionError) {
+throw (AssertionError) cause;
   } else {
 throw e;
   }
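For readers skimming the patch, the unwrapping idea restated as a
self-contained sketch (simplified and hedged: this is not the committed code,
and the helper class name is invented for illustration):

public final class CauseUnwrapper {
  private CauseUnwrapper() {}

  // Walks the cause chain until it ends or points at itself, then rethrows an
  // AssertionError (e.g. from PAssert) if one is found; otherwise wraps and
  // rethrows the original failure. Declared with a return type so call sites
  // can write "throw CauseUnwrapper.unwrapAssertion(t);".
  public static RuntimeException unwrapAssertion(Throwable failure) {
    Throwable cause = failure;
    while (cause.getCause() != null && cause.getCause() != cause) {
      cause = cause.getCause();
    }
    if (cause instanceof AssertionError) {
      throw (AssertionError) cause;
    }
    throw new RuntimeException(failure);
  }
}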



[11/17] incubator-beam git commit: [BEAM-102] Add Side Inputs in Flink Streaming Runner

2016-08-24 Thread kenn
[BEAM-102] Add Side Inputs in Flink Streaming Runner

This adds a generic SideInputHandler in runners-core that is only used
by the Flink runner right now but can be used by other runner
implementations.
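As a rough, purely conceptual sketch of the bookkeeping such a handler needs
(invented names and plain collections instead of StateInternals; this is not
the SideInputHandler API added by this commit): track which windows have data
for which view, plus the per-(view, window) contents, so a runner can push
elements back until their side-input window is ready.

import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Set;

final class ConceptualSideInputState<WindowT, T> {
  // (viewId -> window -> materialized side-input contents)
  private final Map<String, Map<WindowT, List<T>>> contents = new HashMap<>();
  // viewId -> windows that are ready to serve
  private final Map<String, Set<WindowT>> readyWindows = new HashMap<>();

  void addSideInput(String viewId, WindowT window, List<T> values) {
    contents.computeIfAbsent(viewId, k -> new HashMap<>()).put(window, values);
    readyWindows.computeIfAbsent(viewId, k -> new HashSet<>()).add(window);
  }

  boolean isReady(String viewId, WindowT window) {
    return readyWindows.getOrDefault(viewId, new HashSet<>()).contains(window);
  }

  List<T> get(String viewId, WindowT window) {
    return Objects.requireNonNull(contents.get(viewId)).get(window);
  }
}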


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/dfbdc6c2
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/dfbdc6c2
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/dfbdc6c2

Branch: refs/heads/master
Commit: dfbdc6c2bbef5e749bfc1800f97d21377f0c713d
Parents: ff34f9e
Author: Aljoscha Krettek 
Authored: Mon Jul 11 14:08:35 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../beam/runners/core/SideInputHandler.java | 240 ++
 .../beam/runners/core/SideInputHandlerTest.java | 222 ++
 .../apache/beam/runners/flink/FlinkRunner.java  | 386 +-
 .../beam/runners/flink/TestFlinkRunner.java |   4 +-
 .../FlinkStreamingPipelineTranslator.java   |  59 +-
 .../FlinkStreamingTransformTranslators.java | 727 +--
 .../translation/types/CoderTypeInformation.java |   4 +
 .../wrappers/streaming/DoFnOperator.java| 282 ++-
 .../wrappers/streaming/WindowDoFnOperator.java  |  47 +-
 .../streaming/io/BoundedSourceWrapper.java  | 219 ++
 .../io/FlinkStreamingCreateFunction.java|  56 --
 .../flink/streaming/DoFnOperatorTest.java   | 328 +
 .../streaming/UnboundedSourceWrapperTest.java   |   2 +-
 .../beam/sdk/transforms/join/RawUnionValue.java |  25 +
 14 files changed, 2270 insertions(+), 331 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/dfbdc6c2/runners/core-java/src/main/java/org/apache/beam/runners/core/SideInputHandler.java
--
diff --git 
a/runners/core-java/src/main/java/org/apache/beam/runners/core/SideInputHandler.java
 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/SideInputHandler.java
new file mode 100644
index 000..6550251
--- /dev/null
+++ 
b/runners/core-java/src/main/java/org/apache/beam/runners/core/SideInputHandler.java
@@ -0,0 +1,240 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.core;
+
+import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.coders.SetCoder;
+import org.apache.beam.sdk.transforms.Combine;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.util.ReadyCheckingSideInputReader;
+import org.apache.beam.sdk.util.WindowedValue;
+import org.apache.beam.sdk.util.state.AccumulatorCombiningState;
+import org.apache.beam.sdk.util.state.StateInternals;
+import org.apache.beam.sdk.util.state.StateNamespaces;
+import org.apache.beam.sdk.util.state.StateTag;
+import org.apache.beam.sdk.util.state.StateTags;
+import org.apache.beam.sdk.util.state.ValueState;
+import org.apache.beam.sdk.values.PCollectionView;
+
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import javax.annotation.Nullable;
+
+/**
+ * Generic side input handler that uses {@link StateInternals} to store all 
data. Both the actual
+ * side-input data and data about the windows for which we have side inputs 
available are stored
+ * using {@code StateInternals}.
+ *
+ * The given {@code StateInternals} must not be scoped to an element key. 
The state
+ * must instead be scoped to one key group for which the side input is being 
managed.
+ *
+ * This is useful for runners that transmit the side-input elements in 
band, as opposed
+ * to how Dataflow has an external service for managing side inputs.
+ *
+ * Note: storing the available windows in an extra state is redundant for 
now but in the
+ * future we might want to know which windows we have available so that we can 
garbage collect
+ * side input data. For now, this will 

[02/17] incubator-beam git commit: Make ParDoLifecycleTest Serializable to Fix Test with TupleTag

2016-08-24 Thread kenn
Make ParDoLifecycleTest Serializable to Fix Test with TupleTag


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/79dcc6b9
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/79dcc6b9
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/79dcc6b9

Branch: refs/heads/master
Commit: 79dcc6b9691c1bc90c3327679b12e333f8089498
Parents: d4b024b
Author: Aljoscha Krettek 
Authored: Sat Aug 20 10:23:06 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../java/org/apache/beam/sdk/transforms/ParDoLifecycleTest.java   | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/79dcc6b9/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoLifecycleTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoLifecycleTest.java
 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoLifecycleTest.java
index 272fea7..c4ba8b7 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoLifecycleTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoLifecycleTest.java
@@ -37,13 +37,14 @@ import org.junit.experimental.categories.Category;
 import org.junit.runner.RunWith;
 import org.junit.runners.JUnit4;
 
+import java.io.Serializable;
 import java.util.concurrent.atomic.AtomicBoolean;
 
 /**
  * Tests that {@link ParDo} exercises {@link DoFn} methods in the appropriate 
sequence.
  */
 @RunWith(JUnit4.class)
-public class ParDoLifecycleTest {
+public class ParDoLifecycleTest implements Serializable {
   @Test
   @Category(RunnableOnService.class)
   public void testOldFnCallSequence() {



[09/17] incubator-beam git commit: Fix Checkstyle Errors in FlinkStreamingTransformTranslators

2016-08-24 Thread kenn
Fix Checkstyle Errors in FlinkStreamingTransformTranslators


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/ff34f9e8
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/ff34f9e8
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/ff34f9e8

Branch: refs/heads/master
Commit: ff34f9e81867a656e4d9dd0987063c58cbb1de88
Parents: 1de76b7
Author: Aljoscha Krettek 
Authored: Sun Jul 10 17:01:38 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../FlinkStreamingTransformTranslators.java | 155 ---
 1 file changed, 102 insertions(+), 53 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ff34f9e8/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
index fff629c..8167623 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
@@ -106,7 +106,9 @@ public class FlinkStreamingTransformTranslators {
   // 

 
   @SuppressWarnings("rawtypes")
-  private static final Map TRANSLATORS = new 
HashMap<>();
+  private static final Map<
+  Class,
+  FlinkStreamingPipelineTranslator.StreamTransformTranslator> TRANSLATORS 
= new HashMap<>();
 
   // here you can find all the available translators.
   static {
@@ -125,7 +127,8 @@ public class FlinkStreamingTransformTranslators {
 TRANSLATORS.put(ParDo.BoundMulti.class, new 
ParDoBoundMultiStreamingTranslator());
   }
 
-  public static FlinkStreamingPipelineTranslator.StreamTransformTranslator 
getTranslator(PTransform transform) {
+  public static FlinkStreamingPipelineTranslator.StreamTransformTranslator 
getTranslator(
+  PTransform transform) {
 return TRANSLATORS.get(transform.getClass());
   }
 
@@ -133,21 +136,24 @@ public class FlinkStreamingTransformTranslators {
   //  Transformation Implementations
   // 

 
-  private static class CreateStreamingTranslator implements
-  
FlinkStreamingPipelineTranslator.StreamTransformTranslator {
+  private static class CreateStreamingTranslator implements
+  
FlinkStreamingPipelineTranslator.StreamTransformTranslator
 {
 
 @Override
-public void translateNode(Create.Values transform, 
FlinkStreamingTranslationContext context) {
-  PCollection output = context.getOutput(transform);
-  Iterable elements = transform.getElements();
+public void translateNode(
+Create.Values transform,
+FlinkStreamingTranslationContext context) {
+
+  PCollection output = context.getOutput(transform);
+  Iterable elements = transform.getElements();
 
   // we need to serialize the elements to byte arrays, since they might 
contain
   // elements that are not serializable by Java serialization. We 
deserialize them
   // in the FlatMap function using the Coder.
 
   List serializedElements = Lists.newArrayList();
-  Coder elementCoder = output.getCoder();
-  for (OUT element: elements) {
+  Coder elementCoder = output.getCoder();
+  for (OutputT element: elements) {
 ByteArrayOutputStream bao = new ByteArrayOutputStream();
 try {
   elementCoder.encode(element, bao, Coder.Context.OUTER);
@@ -160,25 +166,33 @@ public class FlinkStreamingTransformTranslators {
 
   DataStream initDataSet = 
context.getExecutionEnvironment().fromElements(1);
 
-  FlinkStreamingCreateFunction createFunction =
+  FlinkStreamingCreateFunction createFunction =
   new FlinkStreamingCreateFunction<>(serializedElements, elementCoder);
 
-  WindowedValue.ValueOnlyWindowedValueCoder windowCoder = 
WindowedValue.getValueOnlyCoder(elementCoder);
-  TypeInformation outputType = new 
CoderTypeInformation<>(windowCoder);
+  WindowedValue.ValueOnlyWindowedValueCoder windowCoder =
+  WindowedValue.getValueOnlyCoder(elementCoder);
+
+  

[10/17] incubator-beam git commit: [BEAM-102] Add Side Inputs in Flink Streaming Runner

2016-08-24 Thread kenn
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/dfbdc6c2/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
index e273132..092a226 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/DoFnOperator.java
@@ -17,9 +17,13 @@
  */
 package org.apache.beam.runners.flink.translation.wrappers.streaming;
 
+import avro.shaded.com.google.common.base.Preconditions;
+
+import org.apache.beam.runners.core.SideInputHandler;
 import 
org.apache.beam.runners.flink.translation.utils.SerializedPipelineOptions;
 import 
org.apache.beam.runners.flink.translation.wrappers.SerializableFnAggregatorWrapper;
 import org.apache.beam.sdk.coders.Coder;
+import org.apache.beam.sdk.coders.VoidCoder;
 import org.apache.beam.sdk.options.PipelineOptions;
 import org.apache.beam.sdk.transforms.Aggregator;
 import org.apache.beam.sdk.transforms.Combine;
@@ -27,9 +31,13 @@ import org.apache.beam.sdk.transforms.DoFn;
 import org.apache.beam.sdk.transforms.OldDoFn;
 import org.apache.beam.sdk.transforms.join.RawUnionValue;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.util.CoderUtils;
 import org.apache.beam.sdk.util.DoFnRunner;
 import org.apache.beam.sdk.util.DoFnRunners;
 import org.apache.beam.sdk.util.ExecutionContext;
+import org.apache.beam.sdk.util.NullSideInputReader;
+import org.apache.beam.sdk.util.PushbackSideInputDoFnRunner;
+import org.apache.beam.sdk.util.SideInputReader;
 import org.apache.beam.sdk.util.TimerInternals;
 import org.apache.beam.sdk.util.WindowedValue;
 import org.apache.beam.sdk.util.WindowingStrategy;
@@ -37,15 +45,36 @@ import org.apache.beam.sdk.util.state.StateInternals;
 import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.TupleTag;
 
+import com.google.common.collect.Iterables;
+
+import org.apache.flink.api.common.ExecutionConfig;
+import org.apache.flink.api.common.functions.ReduceFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.state.ReducingState;
+import org.apache.flink.api.common.state.ReducingStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeutils.base.LongSerializer;
+import org.apache.flink.api.common.typeutils.base.VoidSerializer;
+import org.apache.flink.api.java.typeutils.GenericTypeInfo;
+import org.apache.flink.runtime.state.AbstractStateBackend;
+import org.apache.flink.runtime.state.KvStateSnapshot;
+import org.apache.flink.runtime.state.StateHandle;
 import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
 import org.apache.flink.streaming.api.operators.ChainingStrategy;
 import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
 import org.apache.flink.streaming.api.operators.Output;
+import org.apache.flink.streaming.api.operators.TwoInputStreamOperator;
 import org.apache.flink.streaming.api.watermark.Watermark;
 import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.streaming.runtime.tasks.StreamTaskState;
 
 import java.io.IOException;
 import java.io.Serializable;
+import java.nio.ByteBuffer;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 
@@ -58,7 +87,8 @@ import java.util.Map;
  */
 public class DoFnOperator
 extends AbstractStreamOperator
-implements OneInputStreamOperator {
+implements OneInputStreamOperator,
+  TwoInputStreamOperator {
 
   protected OldDoFn doFn;
   protected final SerializedPipelineOptions serializedOptions;
@@ -66,7 +96,8 @@ public class DoFnOperator
   protected final TupleTag mainOutputTag;
   protected final List sideOutputTags;
 
-  protected final Map sideInputs;
+  protected final Collection sideInputs;
+  protected final Map sideInputTagMapping;
 
   protected final boolean hasSideInputs;
 
@@ -74,25 +105,36 @@ public class DoFnOperator
 
   protected final OutputManagerFactory outputManagerFactory;
 
-  protected transient DoFnRunner doFnRunner;
+  protected transient 

[08/17] incubator-beam git commit: [BEAM-253] Unify Flink-Streaming Operator Wrappers

2016-08-24 Thread kenn
[BEAM-253] Unify Flink-Streaming Operator Wrappers

This also replaces the custom Flink StateInternals by proper Flink
Partitioned StateInternals.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/1de76b7a
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/1de76b7a
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/1de76b7a

Branch: refs/heads/master
Commit: 1de76b7a5169a46ef9f14406e5a6e1284832f7f9
Parents: d94bffd
Author: Aljoscha Krettek 
Authored: Sat Jun 11 11:42:12 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../FlinkStreamingTransformTranslators.java |  368 +--
 .../wrappers/streaming/DoFnOperator.java|  268 +
 .../streaming/FlinkAbstractParDoWrapper.java|  282 -
 .../FlinkGroupAlsoByWindowWrapper.java  |  644 ---
 .../streaming/FlinkGroupByKeyWrapper.java   |   73 --
 .../streaming/FlinkParDoBoundMultiWrapper.java  |   79 --
 .../streaming/FlinkParDoBoundWrapper.java   |  104 --
 .../wrappers/streaming/FlinkStateInternals.java | 1038 ++
 .../streaming/SingletonKeyedWorkItem.java   |   54 +
 .../streaming/SingletonKeyedWorkItemCoder.java  |  125 +++
 .../wrappers/streaming/WindowDoFnOperator.java  |  326 ++
 .../wrappers/streaming/WorkItemKeySelector.java |   58 +
 .../state/AbstractFlinkTimerInternals.java  |  127 ---
 .../streaming/state/FlinkStateInternals.java|  733 -
 .../streaming/state/StateCheckpointReader.java  |   93 --
 .../streaming/state/StateCheckpointUtils.java   |  155 ---
 .../streaming/state/StateCheckpointWriter.java  |  131 ---
 .../wrappers/streaming/state/StateType.java |   73 --
 .../beam/runners/flink/PipelineOptionsTest.java |  103 +-
 .../streaming/FlinkStateInternalsTest.java  |  391 +++
 .../flink/streaming/GroupAlsoByWindowTest.java  |  523 -
 .../flink/streaming/StateSerializationTest.java |  338 --
 22 files changed, 2572 insertions(+), 3514 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/1de76b7a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
index 5b55d42..fff629c 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/FlinkStreamingTransformTranslators.java
@@ -18,13 +18,15 @@
 
 package org.apache.beam.runners.flink.translation;
 
+import org.apache.beam.runners.core.GroupAlsoByWindowViaWindowSetDoFn;
 import org.apache.beam.runners.flink.translation.types.CoderTypeInformation;
 import org.apache.beam.runners.flink.translation.types.FlinkCoder;
 import org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat;
-import 
org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkGroupAlsoByWindowWrapper;
-import 
org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkGroupByKeyWrapper;
-import 
org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkParDoBoundMultiWrapper;
-import 
org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkParDoBoundWrapper;
+import 
org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator;
+import 
org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItem;
+import 
org.apache.beam.runners.flink.translation.wrappers.streaming.SingletonKeyedWorkItemCoder;
+import 
org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator;
+import 
org.apache.beam.runners.flink.translation.wrappers.streaming.WorkItemKeySelector;
 import 
org.apache.beam.runners.flink.translation.wrappers.streaming.io.FlinkStreamingCreateFunction;
 import 
org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedFlinkSink;
 import 
org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedFlinkSource;
@@ -50,10 +52,14 @@ import 
org.apache.beam.sdk.transforms.windowing.GlobalWindow;
 import org.apache.beam.sdk.transforms.windowing.PaneInfo;
 import org.apache.beam.sdk.transforms.windowing.Window;
 import org.apache.beam.sdk.transforms.windowing.WindowFn;
+import org.apache.beam.sdk.util.AppliedCombineFn;
+import org.apache.beam.sdk.util.KeyedWorkItem;
+import org.apache.beam.sdk.util.SystemReduceFn;
 import 

[16/17] incubator-beam git commit: Fix combine tests with Accumulation Mode

2016-08-24 Thread kenn
Fix combine tests with Accumulation Mode

These tests were not written in such a way as to succeed if the trigger
fired multiple times.


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/e302cab0
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/e302cab0
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/e302cab0

Branch: refs/heads/master
Commit: e302cab06640e016fcb24e03376ac84149f2ed2b
Parents: f9fac64
Author: Thomas Groh 
Authored: Wed Jul 27 09:04:06 2016 -0700
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../java/org/apache/beam/sdk/transforms/CombineTest.java | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/e302cab0/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
index 6421b3b..897d17a 100644
--- 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
+++ 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/CombineTest.java
@@ -21,7 +21,6 @@ import static org.apache.beam.sdk.TestUtils.checkCombineFn;
 import static 
org.apache.beam.sdk.transforms.display.DisplayDataMatchers.hasDisplayItem;
 import static 
org.apache.beam.sdk.transforms.display.DisplayDataMatchers.hasNamespace;
 import static 
org.apache.beam.sdk.transforms.display.DisplayDataMatchers.includesDisplayDataFrom;
-
 import static com.google.common.base.Preconditions.checkArgument;
 import static com.google.common.base.Preconditions.checkNotNull;
 import static com.google.common.base.Preconditions.checkState;
@@ -387,7 +386,7 @@ public class CombineTest implements Serializable {
 
 PCollection output = input
 .apply(Window.into(new GlobalWindows())
-.triggering(AfterPane.elementCountAtLeast(1))
+.triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(1)))
 .accumulatingFiredPanes()
 .withAllowedLateness(new Duration(0)))
 .apply(Sum.integersGlobally())
@@ -583,7 +582,13 @@ public class CombineTest implements Serializable {
 .apply(Sum.integersGlobally().withoutDefaults().withFanout(2))
 .apply(ParDo.of(new GetLast()));
 
-PAssert.that(output).containsInAnyOrder(15);
+PAssert.that(output).satisfies(new SerializableFunction() {
+  @Override
+  public Void apply(Iterable input) {
+assertThat(input, hasItem(15));
+return null;
+  }
+});
 
 pipeline.run();
   }



[14/17] incubator-beam git commit: Use AllPanes as the PaneExtractor in IterableAssert

2016-08-24 Thread kenn
Use AllPanes as the PaneExtractor in IterableAssert

This ensures that tests with triggering in the global window which assert
on the entire PCollection (not a singleton iterable) will succeed over the
entire PCollection


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/d4b024ba
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/d4b024ba
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/d4b024ba

Branch: refs/heads/master
Commit: d4b024baecb99a259a79918c0554b85efc148120
Parents: e302cab
Author: Thomas Groh 
Authored: Thu Jul 28 08:26:43 2016 -0700
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../core/src/main/java/org/apache/beam/sdk/testing/PAssert.java| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d4b024ba/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
index 943ed11..f02bbe0 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
@@ -357,7 +357,7 @@ public class PAssert {
 private final SimpleFunction, Iterable> 
paneExtractor;
 
 public PCollectionContentsAssert(PCollection actual) {
-  this(actual, IntoGlobalWindow.of(), PaneExtractors.onlyPane());
+  this(actual, IntoGlobalWindow.of(), PaneExtractors.allPanes());
 }
 
 public PCollectionContentsAssert(
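To make the intent of that one-line change easier to follow out of context,
here is a conceptual sketch of what "all panes" means for the assertion. Only
the names onlyPane/allPanes come from the patch; the body below is an
illustrative assumption, not the SDK's PaneExtractors implementation.

import org.apache.beam.sdk.util.WindowedValue;

import java.util.ArrayList;
import java.util.List;

final class PaneExtractorSketch {
  private PaneExtractorSketch() {}

  // Flattens the elements of every fired pane, so an assertion over the whole
  // PCollection still holds when a trigger in the global window fires more than
  // once; an "only pane" extractor would instead insist on a single firing.
  static <T> Iterable<T> allPanes(Iterable<WindowedValue<T>> firings) {
    List<T> contents = new ArrayList<>();
    for (WindowedValue<T> value : firings) {
      contents.add(value.getValue());
    }
    return contents;
  }
}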



[GitHub] incubator-beam pull request #737: Add Side-Input Support in Flink Streaming ...

2016-08-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/737




[07/17] incubator-beam git commit: [BEAM-253] Unify Flink-Streaming Operator Wrappers

2016-08-24 Thread kenn
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/1de76b7a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkParDoBoundWrapper.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkParDoBoundWrapper.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkParDoBoundWrapper.java
deleted file mode 100644
index 6be94b2..000
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/FlinkParDoBoundWrapper.java
+++ /dev/null
@@ -1,104 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.beam.runners.flink.translation.wrappers.streaming;
-
-import org.apache.beam.sdk.coders.Coder;
-import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.beam.sdk.transforms.OldDoFn;
-import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
-import org.apache.beam.sdk.transforms.windowing.PaneInfo;
-import org.apache.beam.sdk.util.TimerInternals;
-import org.apache.beam.sdk.util.WindowedValue;
-import org.apache.beam.sdk.util.WindowingInternals;
-import org.apache.beam.sdk.util.WindowingStrategy;
-import org.apache.beam.sdk.util.state.StateInternals;
-import org.apache.beam.sdk.values.PCollectionView;
-import org.apache.beam.sdk.values.TupleTag;
-
-import org.apache.flink.util.Collector;
-import org.joda.time.Instant;
-
-import java.io.IOException;
-import java.util.Collection;
-
-/**
- * A wrapper for the {@link org.apache.beam.sdk.transforms.ParDo.Bound} Beam 
transformation.
- * */
-public class FlinkParDoBoundWrapper extends 
FlinkAbstractParDoWrapper {
-
-  public FlinkParDoBoundWrapper(PipelineOptions options, WindowingStrategy windowingStrategy, OldDoFn doFn) {
-super(options, windowingStrategy, doFn);
-  }
-
-  @Override
-  public void outputWithTimestampHelper(WindowedValue inElement, OUT 
output, Instant timestamp, Collector collector) {
-checkTimestamp(inElement, timestamp);
-collector.collect(makeWindowedValue(
-output,
-timestamp,
-inElement.getWindows(),
-inElement.getPane()));
-  }
-
-  @Override
-  public  void sideOutputWithTimestampHelper(WindowedValue inElement, T 
output, Instant timestamp, Collector outCollector, 
TupleTag tag) {
-// ignore the side output, this can happen when a user does not register
-// side outputs but then outputs using a freshly created TupleTag.
-throw new RuntimeException("sideOutput() not not available in 
ParDo.Bound().");
-  }
-
-  @Override
-  public WindowingInternals windowingInternalsHelper(final 
WindowedValue inElement, final Collector collector) {
-return new WindowingInternals() {
-  @Override
-  public StateInternals stateInternals() {
-throw new NullPointerException("StateInternals are not available for 
ParDo.Bound().");
-  }
-
-  @Override
-  public void outputWindowedValue(OUT output, Instant timestamp, 
Collection windows, PaneInfo pane) {
-collector.collect(makeWindowedValue(output, timestamp, windows, pane));
-  }
-
-  @Override
-  public TimerInternals timerInternals() {
-throw new NullPointerException("TimeInternals are not available for 
ParDo.Bound().");
-  }
-
-  @Override
-  public Collection windows() {
-return inElement.getWindows();
-  }
-
-  @Override
-  public PaneInfo pane() {
-return inElement.getPane();
-  }
-
-  @Override
-  public  void writePCollectionViewData(TupleTag tag, 
Iterable data, Coder elemCoder) throws IOException {
-throw new RuntimeException("writePCollectionViewData() not supported 
in Streaming mode.");
-  }
-
-  @Override
-  public  T sideInput(PCollectionView view, BoundedWindow 
mainInputWindow) {
-throw new RuntimeException("sideInput() not implemented.");
-  }
-};
-  }
-}


[05/17] incubator-beam git commit: [BEAM-253] Unify Flink-Streaming Operator Wrappers

2016-08-24 Thread kenn
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/1de76b7a/runners/flink/runner/src/test/java/org/apache/beam/runners/flink/streaming/GroupAlsoByWindowTest.java
--
diff --git 
a/runners/flink/runner/src/test/java/org/apache/beam/runners/flink/streaming/GroupAlsoByWindowTest.java
 
b/runners/flink/runner/src/test/java/org/apache/beam/runners/flink/streaming/GroupAlsoByWindowTest.java
deleted file mode 100644
index 2d83fb6..000
--- 
a/runners/flink/runner/src/test/java/org/apache/beam/runners/flink/streaming/GroupAlsoByWindowTest.java
+++ /dev/null
@@ -1,523 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.beam.runners.flink.streaming;
-
-import org.apache.beam.runners.flink.FlinkTestPipeline;
-import 
org.apache.beam.runners.flink.translation.wrappers.streaming.FlinkGroupAlsoByWindowWrapper;
-import org.apache.beam.sdk.Pipeline;
-import org.apache.beam.sdk.coders.KvCoder;
-import org.apache.beam.sdk.coders.StringUtf8Coder;
-import org.apache.beam.sdk.coders.VarIntCoder;
-import org.apache.beam.sdk.transforms.Combine;
-import org.apache.beam.sdk.transforms.Sum;
-import org.apache.beam.sdk.transforms.windowing.AfterPane;
-import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
-import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
-import org.apache.beam.sdk.transforms.windowing.FixedWindows;
-import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
-import org.apache.beam.sdk.transforms.windowing.OutputTimeFns;
-import org.apache.beam.sdk.transforms.windowing.PaneInfo;
-import org.apache.beam.sdk.transforms.windowing.Repeatedly;
-import org.apache.beam.sdk.transforms.windowing.Sessions;
-import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
-import org.apache.beam.sdk.transforms.windowing.WindowFn;
-import org.apache.beam.sdk.util.UserCodeException;
-import org.apache.beam.sdk.util.WindowedValue;
-import org.apache.beam.sdk.util.WindowingStrategy;
-import org.apache.beam.sdk.values.KV;
-
-import org.apache.flink.streaming.api.watermark.Watermark;
-import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
-import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness;
-import org.apache.flink.streaming.util.StreamingMultipleProgramsTestBase;
-import org.apache.flink.streaming.util.TestHarnessUtil;
-import org.joda.time.Duration;
-import org.joda.time.Instant;
-import org.junit.Test;
-
-import java.util.Collection;
-import java.util.Comparator;
-import java.util.concurrent.ConcurrentLinkedQueue;
-
-public class GroupAlsoByWindowTest extends StreamingMultipleProgramsTestBase {
-
-  private final Combine.CombineFn combiner = new Sum.SumIntegerFn();
-
-  private final WindowingStrategy 
slidingWindowWithAfterWatermarkTriggerStrategy =
-  
WindowingStrategy.of(SlidingWindows.of(Duration.standardSeconds(10)).every(Duration.standardSeconds(5)))
-  .withOutputTimeFn(OutputTimeFns.outputAtEarliestInputTimestamp())
-  
.withTrigger(AfterWatermark.pastEndOfWindow()).withMode(WindowingStrategy.AccumulationMode.ACCUMULATING_FIRED_PANES);
-
-  private final WindowingStrategy sessionWindowingStrategy =
-  
WindowingStrategy.of(Sessions.withGapDuration(Duration.standardSeconds(2)))
-  .withTrigger(Repeatedly.forever(AfterWatermark.pastEndOfWindow()))
-  
.withMode(WindowingStrategy.AccumulationMode.ACCUMULATING_FIRED_PANES)
-  .withOutputTimeFn(OutputTimeFns.outputAtEarliestInputTimestamp())
-  .withAllowedLateness(Duration.standardSeconds(100));
-
-  private final WindowingStrategy fixedWindowingStrategy =
-  WindowingStrategy.of(FixedWindows.of(Duration.standardSeconds(10)))
-  .withOutputTimeFn(OutputTimeFns.outputAtEarliestInputTimestamp());
-
-  private final WindowingStrategy fixedWindowWithCountTriggerStrategy =
-  fixedWindowingStrategy.withTrigger(AfterPane.elementCountAtLeast(5));
-
-  private final WindowingStrategy fixedWindowWithAfterWatermarkTriggerStrategy 
=
-  fixedWindowingStrategy.withTrigger(AfterWatermark.pastEndOfWindow());
-
-  private final WindowingStrategy fixedWindowWithCompoundTriggerStrategy =
-

[04/17] incubator-beam git commit: Fix Emission in startBundle/finishBundle in Flink Wrappers

2016-08-24 Thread kenn
Fix Emission in startBundle/finishBundle in Flink Wrappers


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/d94bffdd
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/d94bffdd
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/d94bffdd

Branch: refs/heads/master
Commit: d94bffdd20f7bb2f380c807f84c5405552a40f71
Parents: a87015b
Author: Aljoscha Krettek 
Authored: Sat Jun 11 10:55:55 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../flink/translation/functions/FlinkDoFnFunction.java| 4 ++--
 .../translation/functions/FlinkMultiOutputDoFnFunction.java   | 7 +--
 2 files changed, 7 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d94bffdd/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
index fdf1e59..733d3d4 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkDoFnFunction.java
@@ -94,8 +94,8 @@ public class FlinkDoFnFunction
   }
 }
 
-// set the windowed value to null so that the logic
-// or outputting in finishBundle kicks in
+// set the windowed value to null so that the special logic for outputting
+// in startBundle/finishBundle kicks in
 context = context.forWindowedValue(null);
 this.doFn.finishBundle(context);
   }

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/d94bffdd/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputDoFnFunction.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputDoFnFunction.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputDoFnFunction.java
index 5013b90..ef75878 100644
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputDoFnFunction.java
+++ 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/functions/FlinkMultiOutputDoFnFunction.java
@@ -33,8 +33,8 @@ import org.apache.flink.util.Collector;
 import java.util.Map;
 
 /**
- * Encapsulates a {@link OldDoFn} that uses side outputs
- * inside a Flink {@link 
org.apache.flink.api.common.functions.RichMapPartitionFunction}.
+ * Encapsulates a {@link OldDoFn} that can emit to multiple
+ * outputs inside a Flink {@link 
org.apache.flink.api.common.functions.RichMapPartitionFunction}.
  *
  * We get a mapping from {@link org.apache.beam.sdk.values.TupleTag} to output 
index
  * and must tag all outputs with the output number. Afterwards a filter will 
filter out
@@ -106,6 +106,9 @@ public class FlinkMultiOutputDoFnFunction
   }
 }
 
+// set the windowed value to null so that the special logic for outputting
+// in startBundle/finishBundle kicks in
+context = context.forWindowedValue(null);
 this.doFn.finishBundle(context);
   }
 



[06/17] incubator-beam git commit: [BEAM-253] Unify Flink-Streaming Operator Wrappers

2016-08-24 Thread kenn
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/1de76b7a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
--
diff --git 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
 
b/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
deleted file mode 100644
index e6a43dc..000
--- 
a/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/state/FlinkStateInternals.java
+++ /dev/null
@@ -1,733 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.beam.runners.flink.translation.wrappers.streaming.state;
-
-import static com.google.common.base.Preconditions.checkNotNull;
-
-import org.apache.beam.sdk.coders.Coder;
-import org.apache.beam.sdk.options.PipelineOptions;
-import org.apache.beam.sdk.transforms.Combine;
-import org.apache.beam.sdk.transforms.CombineWithContext;
-import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
-import org.apache.beam.sdk.transforms.windowing.OutputTimeFn;
-import org.apache.beam.sdk.util.state.AccumulatorCombiningState;
-import org.apache.beam.sdk.util.state.BagState;
-import org.apache.beam.sdk.util.state.ReadableState;
-import org.apache.beam.sdk.util.state.State;
-import org.apache.beam.sdk.util.state.StateContext;
-import org.apache.beam.sdk.util.state.StateInternals;
-import org.apache.beam.sdk.util.state.StateNamespace;
-import org.apache.beam.sdk.util.state.StateNamespaces;
-import org.apache.beam.sdk.util.state.StateTable;
-import org.apache.beam.sdk.util.state.StateTag;
-import org.apache.beam.sdk.util.state.StateTags;
-import org.apache.beam.sdk.util.state.ValueState;
-import org.apache.beam.sdk.util.state.WatermarkHoldState;
-import org.apache.beam.sdk.values.PCollectionView;
-
-import com.google.protobuf.ByteString;
-
-import org.apache.flink.util.InstantiationUtil;
-import org.joda.time.Instant;
-
-import java.io.ByteArrayInputStream;
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.List;
-import java.util.Objects;
-
-/**
- * An implementation of the Beam {@link StateInternals}. This implementation 
simply keeps elements in memory.
- * This state is periodically checkpointed by Flink, for fault-tolerance.
- *
- * TODO: State should be rewritten to redirect to Flink per-key state so that 
coders and combiners don't need
- * to be serialized along with encoded values when snapshotting.
- */
-public class FlinkStateInternals implements StateInternals {
-
-  private final K key;
-
-  private final Coder keyCoder;
-
-  private final Coder windowCoder;
-
-  private final OutputTimeFn outputTimeFn;
-
-  private Instant watermarkHoldAccessor;
-
-  public FlinkStateInternals(K key,
- Coder keyCoder,
- Coder windowCoder,
- OutputTimeFn outputTimeFn) 
{
-this.key = key;
-this.keyCoder = keyCoder;
-this.windowCoder = windowCoder;
-this.outputTimeFn = outputTimeFn;
-  }
-
-  public Instant getWatermarkHold() {
-return watermarkHoldAccessor;
-  }
-
-  /**
-   * This is the interface state has to implement in order for it to be fault 
tolerant when
-   * executed by the FlinkRunner.
-   */
-  private interface CheckpointableIF {
-
-boolean shouldPersist();
-
-void persistState(StateCheckpointWriter checkpointBuilder) throws 
IOException;
-  }
-
-  protected final StateTable inMemoryState = new StateTable() {
-@Override
-protected StateTag.StateBinder binderForNamespace(final StateNamespace 
namespace, final StateContext c) {
-  return new StateTag.StateBinder() {
-
-@Override
-public  ValueState bindValue(StateTag> 
address, Coder coder) {
-  return new FlinkInMemoryValue<>(encodeKey(namespace, address), 
coder);
-}
-
-@Override
-public  BagState bindBag(StateTag> 
address, Coder elemCoder) {
-  return new 

[15/17] incubator-beam git commit: Don't Suppress Throwable in PAssert in Streaming Mode

2016-08-24 Thread kenn
Don't Suppress Throwable in PAssert in Streaming Mode


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/ea3a46c7
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/ea3a46c7
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/ea3a46c7

Branch: refs/heads/master
Commit: ea3a46c7896c954084fc581a06e1f3ef68a3f3d0
Parents: 4aba224
Author: Aljoscha Krettek 
Authored: Sun Jul 24 12:04:28 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../org/apache/beam/sdk/testing/PAssert.java| 20 +++-
 1 file changed, 3 insertions(+), 17 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ea3a46c7/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
--
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
index 3f1a741..943ed11 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/testing/PAssert.java
@@ -1066,14 +1066,7 @@ public class PAssert {
 
 @ProcessElement
 public void processElement(ProcessContext c) {
-  try {
-doChecks(c.element(), checkerFn, success, failure);
-  } catch (Throwable t) {
-// Suppress exception in streaming
-if (!c.getPipelineOptions().as(StreamingOptions.class).isStreaming()) {
-  throw t;
-}
-  }
+  doChecks(c.element(), checkerFn, success, failure);
 }
   }
 
@@ -1098,15 +1091,8 @@ public class PAssert {
 
 @ProcessElement
 public void processElement(ProcessContext c) {
-  try {
-ActualT actualContents = Iterables.getOnlyElement(c.element());
-doChecks(actualContents, checkerFn, success, failure);
-  } catch (Throwable t) {
-// Suppress exception in streaming
-if (!c.getPipelineOptions().as(StreamingOptions.class).isStreaming()) {
-  throw t;
-}
-  }
+  ActualT actualContents = Iterables.getOnlyElement(c.element());
+  doChecks(actualContents, checkerFn, success, failure);
 }
   }
 



[17/17] incubator-beam git commit: This closes #737

2016-08-24 Thread kenn
This closes #737


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/603f337b
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/603f337b
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/603f337b

Branch: refs/heads/master
Commit: 603f337b130cf9bb3a8e2a810db99ed62211c2ef
Parents: a87015b 695a80a
Author: Kenneth Knowles 
Authored: Wed Aug 24 12:47:54 2016 -0700
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:47:54 2016 -0700

--
 .../beam/runners/core/SideInputHandler.java |  240 
 .../beam/runners/core/SideInputHandlerTest.java |  222 
 runners/flink/runner/pom.xml|   11 +-
 .../apache/beam/runners/flink/FlinkRunner.java  |  386 ++-
 .../beam/runners/flink/TestFlinkRunner.java |   23 +-
 .../FlinkStreamingPipelineTranslator.java   |   59 +-
 .../FlinkStreamingTransformTranslators.java |  952 
 .../functions/FlinkDoFnFunction.java|4 +-
 .../functions/FlinkMultiOutputDoFnFunction.java |7 +-
 .../translation/types/CoderTypeInformation.java |4 +
 .../wrappers/streaming/DoFnOperator.java|  516 +
 .../streaming/FlinkAbstractParDoWrapper.java|  282 -
 .../FlinkGroupAlsoByWindowWrapper.java  |  644 ---
 .../streaming/FlinkGroupByKeyWrapper.java   |   73 --
 .../streaming/FlinkParDoBoundMultiWrapper.java  |   79 --
 .../streaming/FlinkParDoBoundWrapper.java   |  104 --
 .../wrappers/streaming/FlinkStateInternals.java | 1038 ++
 .../streaming/SingletonKeyedWorkItem.java   |   54 +
 .../streaming/SingletonKeyedWorkItemCoder.java  |  125 +++
 .../wrappers/streaming/WindowDoFnOperator.java  |  345 ++
 .../wrappers/streaming/WorkItemKeySelector.java |   58 +
 .../streaming/io/BoundedSourceWrapper.java  |  219 
 .../io/FlinkStreamingCreateFunction.java|   56 -
 .../state/AbstractFlinkTimerInternals.java  |  127 ---
 .../streaming/state/FlinkStateInternals.java|  733 -
 .../streaming/state/StateCheckpointReader.java  |   93 --
 .../streaming/state/StateCheckpointUtils.java   |  155 ---
 .../streaming/state/StateCheckpointWriter.java  |  131 ---
 .../wrappers/streaming/state/StateType.java |   73 --
 .../beam/runners/flink/PipelineOptionsTest.java |  103 +-
 .../flink/streaming/DoFnOperatorTest.java   |  328 ++
 .../streaming/FlinkStateInternalsTest.java  |  391 +++
 .../flink/streaming/GroupAlsoByWindowTest.java  |  523 -
 .../flink/streaming/StateSerializationTest.java |  338 --
 .../streaming/UnboundedSourceWrapperTest.java   |2 +-
 .../org/apache/beam/sdk/testing/PAssert.java|   22 +-
 .../beam/sdk/transforms/join/RawUnionValue.java |   25 +
 .../apache/beam/sdk/transforms/CombineTest.java |   11 +-
 .../beam/sdk/transforms/ParDoLifecycleTest.java |3 +-
 .../apache/beam/sdk/transforms/ParDoTest.java   |4 +-
 40 files changed, 4810 insertions(+), 3753 deletions(-)
--




[12/17] incubator-beam git commit: Allow DoFn Reuse in ParDoTest.TestDoFnWithContext

2016-08-24 Thread kenn
Allow DoFn Reuse in ParDoTest.TestDoFnWithContext


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/4aba224e
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/4aba224e
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/4aba224e

Branch: refs/heads/master
Commit: 4aba224e44f3ef6c96d1836d9434bbc2e3daad0c
Parents: dfbdc6c
Author: Aljoscha Krettek 
Authored: Sat Jul 23 10:48:42 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 .../src/test/java/org/apache/beam/sdk/transforms/ParDoTest.java  | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/4aba224e/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoTest.java
--
diff --git 
a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoTest.java 
b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoTest.java
index c384114..13dec9a 100644
--- a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoTest.java
+++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/ParDoTest.java
@@ -193,7 +193,9 @@ public class ParDoTest implements Serializable {
 
 @StartBundle
 public void startBundle(Context c) {
-  assertEquals(State.UNSTARTED, state);
+  assertThat(state,
+  anyOf(equalTo(State.UNSTARTED), equalTo(State.FINISHED)));
+
   state = State.STARTED;
   outputToAll(c, "started");
 }



[03/17] incubator-beam git commit: Fix Flink Runner Pom for Batch RunnableOnService tests

2016-08-24 Thread kenn
Fix Flink Runner Pom for Batch RunnableOnService tests


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/df2874d0
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/df2874d0
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/df2874d0

Branch: refs/heads/master
Commit: df2874d02c7e3ad6660e488b3f847a2d6d830953
Parents: ea3a46c
Author: Aljoscha Krettek 
Authored: Sat Aug 20 10:42:39 2016 +0200
Committer: Kenneth Knowles 
Committed: Wed Aug 24 12:46:24 2016 -0700

--
 runners/flink/runner/pom.xml | 10 --
 1 file changed, 8 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/df2874d0/runners/flink/runner/pom.xml
--
diff --git a/runners/flink/runner/pom.xml b/runners/flink/runner/pom.xml
index ca9e5e2..e222bfd 100644
--- a/runners/flink/runner/pom.xml
+++ b/runners/flink/runner/pom.xml
@@ -52,11 +52,17 @@
   test
 
 
+  
org.apache.beam.sdk.testing.RunnableOnService
+  none
+  true
+  
+
org.apache.beam:beam-sdks-java-core
+  
   
 
   [
-"--runner=TestFlinkRunner",
-"--streaming=false"
+  "--runner=TestFlinkRunner",
+  "--streaming=false"
   ]
 
   



[jira] [Created] (BEAM-582) Allow usage of the new GCP service account JSON key

2016-08-24 Thread Alex Van Boxel (JIRA)
Alex Van Boxel created BEAM-582:
---

 Summary: Allow usage of the new GCP service account JSON key
 Key: BEAM-582
 URL: https://issues.apache.org/jira/browse/BEAM-582
 Project: Beam
  Issue Type: Improvement
  Components: runner-dataflow
Reporter: Alex Van Boxel
Assignee: Davor Bonaci








Jenkins build is back to stable : beam_PostCommit_RunnableOnService_GoogleCloudDataflow #1011

2016-08-24 Thread Apache Jenkins Server
See 




[GitHub] incubator-beam pull request #878: [Experiment]: update travis yml to install

2016-08-24 Thread vikkyrk
GitHub user vikkyrk opened a pull request:

https://github.com/apache/incubator-beam/pull/878

[Experiment]: update travis yml to install

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/vikkyrk/incubator-beam travis

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/878.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #878


commit 0111a71803ddc994bbc14732ea48c9750ceecec0
Author: Vikas Kedigehalli 
Date:   2016-08-24T18:51:06Z

update travis yml to install






Jenkins build became unstable: beam_PostCommit_RunnableOnService_GoogleCloudDataflow #1010

2016-08-24 Thread Apache Jenkins Server
See 




[jira] [Closed] (BEAM-276) Add PCollections Section

2016-08-24 Thread Frances Perry (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Frances Perry closed BEAM-276.
--
   Resolution: Fixed
Fix Version/s: Not applicable

Done by Devin: 
http://beam.incubator.apache.org/learn/programming-guide/#pcollection

> Add PCollections Section
> 
>
> Key: BEAM-276
> URL: https://issues.apache.org/jira/browse/BEAM-276
> Project: Beam
>  Issue Type: Sub-task
>  Components: website
>Reporter: Devin Donnelly
> Fix For: Not applicable
>
>
> Add section with overview and usage of PCollection class.
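
As a quick illustration of what that section covers, a minimal sketch of creating a PCollection in the Java SDK (using the in-memory Create transform; a real pipeline would normally read from an IO source, and pipeline options are left at their defaults here):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class PCollectionSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // A bounded, in-memory PCollection of strings.
    PCollection<String> words = p.apply(Create.of("hello", "beam", "pcollection"));
    p.run();
  }
}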



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-277) Add Transforms Section

2016-08-24 Thread Frances Perry (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435328#comment-15435328
 ] 

Frances Perry commented on BEAM-277:


Partially completed: 
http://beam.incubator.apache.org/learn/programming-guide/#transforms 

> Add Transforms Section
> --
>
> Key: BEAM-277
> URL: https://issues.apache.org/jira/browse/BEAM-277
> Project: Beam
>  Issue Type: Sub-task
>  Components: website
>Reporter: Devin Donnelly
>
> Document general transforms usage and ParDo usage.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Closed] (BEAM-275) Add Pipelines Section

2016-08-24 Thread Frances Perry (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Frances Perry closed BEAM-275.
--
   Resolution: Fixed
Fix Version/s: Not applicable

Completed by Devin: 
http://beam.incubator.apache.org/learn/programming-guide/#pipeline

> Add Pipelines Section
> -
>
> Key: BEAM-275
> URL: https://issues.apache.org/jira/browse/BEAM-275
> Project: Beam
>  Issue Type: Sub-task
>  Components: website
>Reporter: Devin Donnelly
> Fix For: Not applicable
>
>
> Document overview and usage of Pipeline object, including creation and 
> options assignment.
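
A rough sketch of the "creation and options assignment" part, assuming a hypothetical custom options interface (the getter/setter pair defines one command-line flag):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class PipelineCreationSketch {
  /** Hypothetical options; exposes an --inputFile flag with a default value. */
  public interface MyOptions extends PipelineOptions {
    @Description("Path of the file to read from")
    @Default.String("input.txt")
    String getInputFile();
    void setInputFile(String value);
  }

  public static void main(String[] args) {
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
    Pipeline p = Pipeline.create(options);
    // ... apply transforms here ...
    p.run();
  }
}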



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Closed] (BEAM-274) Add Programming Guide Skeleton

2016-08-24 Thread Frances Perry (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Frances Perry closed BEAM-274.
--
   Resolution: Fixed
Fix Version/s: Not applicable

> Add Programming Guide Skeleton
> --
>
> Key: BEAM-274
> URL: https://issues.apache.org/jira/browse/BEAM-274
> Project: Beam
>  Issue Type: Sub-task
>  Components: website
>Reporter: Devin Donnelly
> Fix For: Not applicable
>
>
> Creating headings, front matter, and table of contents.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Build failed in Jenkins: beam_PostCommit_RunnableOnService_GearpumpLocal #135

2016-08-24 Thread Apache Jenkins Server
See 


--
[...truncated 8204 lines...]
[WARNING]   - javax.annotation.RegEx
[WARNING]   - javax.annotation.concurrent.Immutable
[WARNING]   - javax.annotation.meta.TypeQualifierDefault
[WARNING]   - javax.annotation.meta.TypeQualifier
[WARNING]   - javax.annotation.Syntax
[WARNING]   - javax.annotation.Nonnull
[WARNING]   - javax.annotation.CheckReturnValue
[WARNING]   - javax.annotation.CheckForNull
[WARNING]   - javax.annotation.meta.TypeQualifierNickname
[WARNING]   - javax.annotation.MatchesPattern
[WARNING]   - 25 more...
[WARNING] grpc-all-0.13.1.jar, grpc-okhttp-0.13.1.jar define 75 overlapping 
classes: 
[WARNING]   - io.grpc.okhttp.OkHttpSettingsUtil
[WARNING]   - io.grpc.okhttp.NegotiationType
[WARNING]   - io.grpc.okhttp.AsyncFrameWriter$12
[WARNING]   - io.grpc.okhttp.OkHttpTlsUpgrader
[WARNING]   - io.grpc.okhttp.AsyncFrameWriter$WriteRunnable
[WARNING]   - io.grpc.okhttp.Utils
[WARNING]   - io.grpc.okhttp.OkHttpProtocolNegotiator$AndroidNegotiator
[WARNING]   - io.grpc.okhttp.OkHttpChannelBuilder$2
[WARNING]   - io.grpc.okhttp.internal.framed.Huffman$Node
[WARNING]   - io.grpc.okhttp.AsyncFrameWriter$7
[WARNING]   - 65 more...
[WARNING] grpc-all-0.13.1.jar, grpc-auth-0.13.1.jar define 2 overlapping 
classes: 
[WARNING]   - io.grpc.auth.ClientAuthInterceptor$1
[WARNING]   - io.grpc.auth.ClientAuthInterceptor
[WARNING] grpc-all-0.13.1.jar, grpc-protobuf-nano-0.13.1.jar define 4 
overlapping classes: 
[WARNING]   - io.grpc.protobuf.nano.NanoProtoInputStream
[WARNING]   - io.grpc.protobuf.nano.NanoUtils
[WARNING]   - io.grpc.protobuf.nano.NanoUtils$1
[WARNING]   - io.grpc.protobuf.nano.MessageNanoFactory
[WARNING] annotations-3.0.1.jar, jcip-annotations-1.0.jar define 4 overlapping 
classes: 
[WARNING]   - net.jcip.annotations.GuardedBy
[WARNING]   - net.jcip.annotations.NotThreadSafe
[WARNING]   - net.jcip.annotations.ThreadSafe
[WARNING]   - net.jcip.annotations.Immutable
[WARNING] grpc-all-0.13.1.jar, grpc-netty-0.13.1.jar define 78 overlapping 
classes: 
[WARNING]   - io.grpc.netty.AbstractNettyHandler
[WARNING]   - io.grpc.netty.NettyClientTransport
[WARNING]   - io.grpc.netty.NettyClientStream$1
[WARNING]   - io.grpc.netty.SendResponseHeadersCommand
[WARNING]   - io.grpc.netty.NettyServer$2
[WARNING]   - io.grpc.netty.NettyClientHandler$4
[WARNING]   - io.grpc.netty.NettyClientTransport$3
[WARNING]   - io.grpc.netty.NettyClientHandler$FrameListener
[WARNING]   - io.grpc.netty.ProtocolNegotiators$1
[WARNING]   - io.grpc.netty.JettyTlsUtil
[WARNING]   - 68 more...
[WARNING] beam-runners-core-java-0.2.0-incubating-SNAPSHOT.jar, 
beam-sdks-java-core-0.2.0-incubating-SNAPSHOT.jar define 1717 overlapping 
classes: 
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.collect.TreeRangeSet$ComplementRangesByLowerBound$2
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.collect.WellBehavedMap$EntrySet$1$1
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.util.concurrent.CycleDetectingLockFactory$Policies$1
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.collect.Maps$6
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.primitives.UnsignedBytes$LexicographicalComparatorHolder$UnsafeComparator$1
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.collect.Collections2$OrderedPermutationCollection
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.collect.Range$1
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.base.Splitter$2
[WARNING]   - 
org.apache.beam.sdk.repackaged.com.google.common.base.Equivalence$Identity
[WARNING]   - org.apache.beam.sdk.repackaged.com.google.common.collect.Lists$1
[WARNING]   - 1707 more...
[WARNING] grpc-all-0.13.1.jar, grpc-protobuf-0.13.1.jar define 4 overlapping 
classes: 
[WARNING]   - io.grpc.protobuf.ProtoUtils$2
[WARNING]   - io.grpc.protobuf.ProtoUtils
[WARNING]   - io.grpc.protobuf.ProtoUtils$1
[WARNING]   - io.grpc.protobuf.ProtoInputStream
[WARNING] grpc-all-0.13.1.jar, grpc-stub-0.13.1.jar define 30 overlapping 
classes: 
[WARNING]   - io.grpc.stub.ServerCalls$UnaryRequestMethod
[WARNING]   - io.grpc.stub.StreamObserver
[WARNING]   - io.grpc.stub.ClientCalls$CallToStreamObserverAdapter
[WARNING]   - io.grpc.stub.ServerCalls$EmptyServerCallListener
[WARNING]   - io.grpc.stub.MetadataUtils$2$1
[WARNING]   - io.grpc.stub.MetadataUtils$1$1
[WARNING]   - io.grpc.stub.MetadataUtils$2$1$1
[WARNING]   - io.grpc.stub.MetadataUtils
[WARNING]   - io.grpc.stub.ServerCalls$1
[WARNING]   - io.grpc.stub.ServerCalls$ClientStreamingMethod
[WARNING]   - 20 more...
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try 

[jira] [Commented] (BEAM-274) Add Programming Guide Skeleton

2016-08-24 Thread Frances Perry (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-274?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435322#comment-15435322
 ] 

Frances Perry commented on BEAM-274:


Looks like this was already completed: 
http://beam.incubator.apache.org/learn/programming-guide/

Sorry for the miscommunication. I'll do a pass over Devin's issues and close 
the ones he finished.

> Add Programming Guide Skeleton
> --
>
> Key: BEAM-274
> URL: https://issues.apache.org/jira/browse/BEAM-274
> Project: Beam
>  Issue Type: Sub-task
>  Components: website
>Reporter: Devin Donnelly
> Fix For: Not applicable
>
>
> Creating headings, front matter, and table of contents.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[2/2] incubator-beam git commit: Closes #876

2016-08-24 Thread dhalperi
Closes #876


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/a87015bf
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/a87015bf
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/a87015bf

Branch: refs/heads/master
Commit: a87015bfe18138a5302a45712d3a6c581408c316
Parents: f17e730 ddbfcdb
Author: Dan Halperin 
Authored: Wed Aug 24 09:58:47 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 09:58:47 2016 -0700

--
 runners/google-cloud-dataflow-java/pom.xml   | 11 ---
 .../org/apache/beam/runners/dataflow/DataflowRunner.java |  4 ++--
 2 files changed, 2 insertions(+), 13 deletions(-)
--




[GitHub] incubator-beam pull request #876: Remove ParDoTest Suppression in Google Clo...

2016-08-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/incubator-beam/pull/876


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[1/2] incubator-beam git commit: Remove ParDoTest Suppression in Google Cloud Dataflow

2016-08-24 Thread dhalperi
Repository: incubator-beam
Updated Branches:
  refs/heads/master f17e7303a -> a87015bfe


Remove ParDoTest Suppression in Google Cloud Dataflow

This reenables the lifecycle tests now that they are properly supported.

Update the container image.
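
For readers following along, ParDoLifecycleTest exercises the DoFn setup/teardown hooks. A minimal, illustrative sketch of a DoFn using those hooks (annotation-based DoFn as in the Java SDK; the class name and logic are made up, and the @StartBundle/@FinishBundle hooks are omitted for brevity):

import org.apache.beam.sdk.transforms.DoFn;

public class LifecycleAwareFn extends DoFn<String, String> {
  private transient StringBuilder scratch;

  @Setup
  public void setup() {
    // Called once per DoFn instance before any bundle is processed,
    // e.g. to open clients or allocate buffers.
    scratch = new StringBuilder();
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    scratch.setLength(0);
    scratch.append(c.element());
    c.output(scratch.toString().toUpperCase());
  }

  @Teardown
  public void teardown() {
    // Called when the instance is discarded; release resources acquired in @Setup.
    scratch = null;
  }
}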


Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/ddbfcdb6
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/ddbfcdb6
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/ddbfcdb6

Branch: refs/heads/master
Commit: ddbfcdb6d36815e00c266be11d727135d75913f0
Parents: f17e730
Author: Thomas Groh 
Authored: Tue Aug 23 14:44:53 2016 -0700
Committer: Dan Halperin 
Committed: Wed Aug 24 09:58:46 2016 -0700

--
 runners/google-cloud-dataflow-java/pom.xml   | 11 ---
 .../org/apache/beam/runners/dataflow/DataflowRunner.java |  4 ++--
 2 files changed, 2 insertions(+), 13 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/ddbfcdb6/runners/google-cloud-dataflow-java/pom.xml
--
diff --git a/runners/google-cloud-dataflow-java/pom.xml 
b/runners/google-cloud-dataflow-java/pom.xml
index 0044823..bf66f38 100644
--- a/runners/google-cloud-dataflow-java/pom.xml
+++ b/runners/google-cloud-dataflow-java/pom.xml
@@ -60,17 +60,6 @@
 true
   
 
-
-  
-runnable-on-service-tests
-
-  
-
org/apache/beam/sdk/transforms/ParDoLifecycleTest.java
-
org/apache/beam/sdk/transforms/ParDoTest.java
-  
-
-  
-
   
 
   

[jira] [Commented] (BEAM-383) BigQueryIO: update sink to shard into multiple write jobs

2016-08-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-383?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435276#comment-15435276
 ] 

ASF GitHub Bot commented on BEAM-383:
-

GitHub user dhalperi opened a pull request:

https://github.com/apache/incubator-beam/pull/877

[BEAM-383] BigQueryIO.Write: raise size limit to 11 TiB

BigQuery has changed their total size quota to 12 TiB.
https://cloud.google.com/bigquery/quota-policy#import

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dhalperi/incubator-beam bigquery-limits

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/877.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #877


commit 02648a968992617ba77393d889c8df9d0191b9ea
Author: Dan Halperin 
Date:   2016-08-24T16:49:46Z

BigQueryIO.Write: raise size limit to 11 TiB

BigQuery has changed their total size quota to 12 TiB.
https://cloud.google.com/bigquery/quota-policy#import




> BigQueryIO: update sink to shard into multiple write jobs
> -
>
> Key: BEAM-383
> URL: https://issues.apache.org/jira/browse/BEAM-383
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Daniel Halperin
>Assignee: Ian Zhou
> Fix For: 0.3.0-incubating
>
>
> BigQuery has global limits on both the number of files that can be written in a
> single job and the total bytes in those files. We should be able to modify 
> BigQueryIO.Write to chunk into multiple smaller jobs that meet these limits, 
> write to temp tables, and atomically copy into the destination table.
> This functionality will let us safely stay within BQ's load job limits.
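
Conceptually, the chunking described above is a greedy bin-packing of the staged files into load jobs. A rough sketch with illustrative limits (the real BigQueryIO.Write constants and bookkeeping differ):

import java.util.ArrayList;
import java.util.List;

public class LoadJobPartitioner {
  // Illustrative limits only; not the actual BigQuery quota values used by the sink.
  static final int MAX_FILES_PER_JOB = 10_000;
  static final long MAX_BYTES_PER_JOB = 11L * 1024 * 1024 * 1024 * 1024; // 11 TiB

  /** Each element of fileSizes is the size in bytes of one staged file. */
  public static List<List<Long>> partition(List<Long> fileSizes) {
    List<List<Long>> jobs = new ArrayList<>();
    List<Long> current = new ArrayList<>();
    long currentBytes = 0;
    for (long size : fileSizes) {
      boolean tooManyFiles = current.size() + 1 > MAX_FILES_PER_JOB;
      boolean tooManyBytes = currentBytes + size > MAX_BYTES_PER_JOB;
      if (!current.isEmpty() && (tooManyFiles || tooManyBytes)) {
        jobs.add(current); // close the current load job and start a new one
        current = new ArrayList<>();
        currentBytes = 0;
      }
      current.add(size);
      currentBytes += size;
    }
    if (!current.isEmpty()) {
      jobs.add(current);
    }
    return jobs;
  }
}

Each inner list then becomes one load job into a temp table, and the temp tables are copied atomically into the destination table.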



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #877: [BEAM-383] BigQueryIO.Write: raise size li...

2016-08-24 Thread dhalperi
GitHub user dhalperi opened a pull request:

https://github.com/apache/incubator-beam/pull/877

[BEAM-383] BigQueryIO.Write: raise size limit to 11 TiB

BigQuery has changed their total size quota to 12 TiB.
https://cloud.google.com/bigquery/quota-policy#import

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dhalperi/incubator-beam bigquery-limits

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/877.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #877


commit 02648a968992617ba77393d889c8df9d0191b9ea
Author: Dan Halperin 
Date:   2016-08-24T16:49:46Z

BigQueryIO.Write: raise size limit to 11 TiB

BigQuery has changed their total size quota to 12 TiB.
https://cloud.google.com/bigquery/quota-policy#import




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] incubator-beam pull request #876: Remove ParDoTest Suppression in Google Clo...

2016-08-24 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/incubator-beam/pull/876

Remove ParDoTest Suppression in Google Cloud Dataflow

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---

This reenables the lifecycle tests now that the DoFn lifecycle is properly 
supported.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/incubator-beam reenable_df_pardo_tests

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/876.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #876


commit 4abc44ea95475a9076e29382ac941d509b215381
Author: Thomas Groh 
Date:   2016-08-23T21:44:53Z

Remove ParDoTest Suppression in Google Cloud Dataflow

This reenables the lifecycle tests now that they are properly supported.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-388) Update Beam Incubation Status page on main Apache site

2016-08-24 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-388?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435068#comment-15435068
 ] 

Jean-Baptiste Onofré commented on BEAM-388:
---

Sorry, I forgot to publish. I'm updating the content now and will publish it.
Sorry for the delay.

> Update Beam Incubation Status page on main Apache site
> --
>
> Key: BEAM-388
> URL: https://issues.apache.org/jira/browse/BEAM-388
> Project: Beam
>  Issue Type: Improvement
>  Components: project-management
>Affects Versions: Not applicable
>Reporter: Daniel Halperin
>Assignee: Jean-Baptiste Onofré
>
> Looking at http://incubator.apache.org/projects/beam.html , it has not been 
> updated since 2/1. Some proposed changes, from top to bottom.
> News
> * Add Release 0.1.0-incubating
> Project info
> * Add Apache IDs for all committers now that we have them.
> Incubation work items
> * Add dates for all those that have been completed



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-242) Enable Checkstyle check and Javadoc build for the Flink Runner

2016-08-24 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-242?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435060#comment-15435060
 ] 

Jean-Baptiste Onofré commented on BEAM-242:
---

[~aljoscha] [~mxm] hey guys, I propose PR #874 to enable checkstyle on the
Flink runner and correct the checkstyle errors.

> Enable Checkstyle check and Javadoc build for the Flink Runner 
> ---
>
> Key: BEAM-242
> URL: https://issues.apache.org/jira/browse/BEAM-242
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Maximilian Michels
>Assignee: Jean-Baptiste Onofré
>Priority: Minor
>
> We don't have a Checkstyle check in place for the Flink Runner. I would like 
> to use the SDK's checkstyle rules.
> We could also think about a unified Checkstyle for all Runners.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-242) Enable Checkstyle check and Javadoc build for the Flink Runner

2016-08-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-242?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15435049#comment-15435049
 ] 

ASF GitHub Bot commented on BEAM-242:
-

GitHub user jbonofre opened a pull request:

https://github.com/apache/incubator-beam/pull/874

[BEAM-242] Enable checkstyle and fix checkstyle errors in Flink runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [X] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [X] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [X] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [X] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jbonofre/incubator-beam BEAM-242

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/874.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #874






> Enable Checkstyle check and Javadoc build for the Flink Runner 
> ---
>
> Key: BEAM-242
> URL: https://issues.apache.org/jira/browse/BEAM-242
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Maximilian Michels
>Assignee: Jean-Baptiste Onofré
>Priority: Minor
>
> We don't have a Checkstyle check in place for the Flink Runner. I would like 
> to use the SDK's checkstyle rules.
> We could also think about a unified Checkstyle for all Runners.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] incubator-beam pull request #874: [BEAM-242] Enable checkstyle and fix check...

2016-08-24 Thread jbonofre
GitHub user jbonofre opened a pull request:

https://github.com/apache/incubator-beam/pull/874

[BEAM-242] Enable checkstyle and fix checkstyle errors in Flink runner

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [X] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [X] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [X] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [X] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jbonofre/incubator-beam BEAM-242

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/incubator-beam/pull/874.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #874






---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Created] (BEAM-581) Support Verifiers in TestFlinkRunner

2016-08-24 Thread Aljoscha Krettek (JIRA)
Aljoscha Krettek created BEAM-581:
-

 Summary: Support Verifiers in TestFlinkRunner
 Key: BEAM-581
 URL: https://issues.apache.org/jira/browse/BEAM-581
 Project: Beam
  Issue Type: Improvement
Reporter: Aljoscha Krettek


[~jasonkuster] suggested that we add verifier support to the TestFlinkRunner to
better support end-to-end (E2E) tests.

See 
https://github.com/apache/incubator-beam/blob/master/examples/java/src/test/java/org/apache/beam/examples/WordCountIT.java
 for an example of how they're used and 
https://github.com/apache/incubator-beam/blob/master/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/testing/TestDataflowRunner.java
 for how they are implemented in the TestDataflowRunner.
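
For readers unfamiliar with the pattern: a verifier is essentially a serializable Hamcrest matcher attached to the test pipeline options and evaluated against the PipelineResult by the test runner. A rough sketch of the usage side follows, with a trivial made-up matcher (real verifiers inspect job output or metrics rather than just the final state):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.testing.SerializableMatcher;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.testing.TestPipelineOptions;
import org.hamcrest.BaseMatcher;
import org.hamcrest.Description;

public class VerifierSketch {
  /** Trivial example verifier: passes when the pipeline reports DONE. */
  static class DoneVerifier extends BaseMatcher<PipelineResult>
      implements SerializableMatcher<PipelineResult> {
    @Override
    public boolean matches(Object item) {
      return item instanceof PipelineResult
          && ((PipelineResult) item).getState() == PipelineResult.State.DONE;
    }

    @Override
    public void describeTo(Description description) {
      description.appendText("pipeline finished in state DONE");
    }
  }

  public static void main(String[] args) {
    TestPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(TestPipelineOptions.class);
    // The test runner is expected to evaluate this matcher against the
    // PipelineResult after the job finishes.
    options.setOnSuccessMatcher(new DoneVerifier());
    Pipeline p = TestPipeline.fromOptions(options);
    // ... build the pipeline under test ...
    p.run();
  }
}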



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-274) Add Programming Guide Skeleton

2016-08-24 Thread Chinnasamy (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-274?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15434529#comment-15434529
 ] 

Chinnasamy commented on BEAM-274:
-

Can you assign this to me? I can take it.

> Add Programming Guide Skeleton
> --
>
> Key: BEAM-274
> URL: https://issues.apache.org/jira/browse/BEAM-274
> Project: Beam
>  Issue Type: Sub-task
>  Components: website
>Reporter: Devin Donnelly
>
> Creating headings, front matter, and table of contents.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (BEAM-294) Change properties named dataflow.* to beam.*

2016-08-24 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-294?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jean-Baptiste Onofré updated BEAM-294:
--
Fix Version/s: 0.3.0-incubating

> Change properties named dataflow.* to beam.*
> 
>
> Key: BEAM-294
> URL: https://issues.apache.org/jira/browse/BEAM-294
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-spark
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
> Fix For: 0.3.0-incubating
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (BEAM-294) Change properties named dataflow.* to beam.*

2016-08-24 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-294?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15434314#comment-15434314
 ] 

Jean-Baptiste Onofré commented on BEAM-294:
---

The properties have been renamed with commit 
e62bfdb16dddfd19fc881e8a78df956bd0ef0ff1.

However, there are some references to dataflow (in the comments and javadoc). 
I'm fixing that.

> Change properties named dataflow.* to beam.*
> 
>
> Key: BEAM-294
> URL: https://issues.apache.org/jira/browse/BEAM-294
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-spark
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)