[jira] [Assigned] (BEAM-372) CoderProperties: Test that the coder doesn't consume more bytes than it produces
[ https://issues.apache.org/jira/browse/BEAM-372?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Daniel Halperin reassigned BEAM-372:
    Assignee: Daniel Halperin

> CoderProperties: Test that the coder doesn't consume more bytes than it
> produces
>
>     Key: BEAM-372
>     URL: https://issues.apache.org/jira/browse/BEAM-372
>     Project: Beam
>     Issue Type: Bug
>     Components: sdk-java-core
>     Reporter: Ben Chambers
>     Assignee: Daniel Halperin
>     Priority: Minor
>     Labels: beginner, newbie, starter
>
> Add a test to CoderProperties that does the following:
> 1. Encode a value using the Coder in the nested context
> 2. Decode the value using the Coder in the nested context
> 3. Verify that the input stream wasn't requested to consume any extra bytes
> (This could possibly just be an enhancement to the existing round-trip
> encode/decode test)
> When this fails, it can lead to very difficult-to-debug situations in a coder
> wrapped around the problematic coder. This would be an easy test that would
> clearly fail *for the coder which was problematic*.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
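The proposed check can be illustrated with a small self-contained sketch. Note this is not Beam's actual CoderProperties API: the `CountingInputStream` wrapper and the toy length-prefixed String coder below are hypothetical stand-ins, used only to show the byte-accounting idea (trailing garbage is appended so over-consumption is detectable).

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

public class CoderConsumptionCheck {

  /** Counts how many bytes have actually been read from the wrapped stream. */
  static class CountingInputStream extends FilterInputStream {
    long count = 0;

    CountingInputStream(InputStream in) { super(in); }

    @Override public int read() throws IOException {
      int b = super.read();
      if (b != -1) count++;
      return b;
    }

    @Override public int read(byte[] b, int off, int len) throws IOException {
      int n = super.read(b, off, len);
      if (n > 0) count += n;
      return n;
    }
  }

  /** Toy length-prefixed String coder standing in for a real Beam Coder. */
  static byte[] encode(String value) {
    try {
      ByteArrayOutputStream bos = new ByteArrayOutputStream();
      DataOutputStream dos = new DataOutputStream(bos);
      byte[] payload = value.getBytes(StandardCharsets.UTF_8);
      dos.writeInt(payload.length);   // 4-byte length prefix
      dos.write(payload);
      return bos.toByteArray();
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  static String decode(InputStream in) {
    try {
      DataInputStream dis = new DataInputStream(in);
      byte[] payload = new byte[dis.readInt()];
      dis.readFully(payload);
      return new String(payload, StandardCharsets.UTF_8);
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  /** Round-trips a value; fails if decode consumed more bytes than encode produced. */
  static void checkConsumesExactlyWhatItProduced(String value) {
    byte[] encoded = encode(value);
    // Append trailing garbage so over-consumption is actually detectable.
    byte[] withGarbage = new byte[encoded.length + 4];
    System.arraycopy(encoded, 0, withGarbage, 0, encoded.length);
    CountingInputStream counting =
        new CountingInputStream(new ByteArrayInputStream(withGarbage));
    String decoded = decode(counting);
    if (!value.equals(decoded)) {
      throw new AssertionError("round trip failed: " + decoded);
    }
    if (counting.count != encoded.length) {
      throw new AssertionError(
          "decode consumed " + counting.count + " bytes but encode produced " + encoded.length);
    }
  }

  public static void main(String[] args) {
    checkConsumesExactlyWhatItProduced("hello beam");
    System.out.println("ok");
  }
}
```

A coder that reads past its own encoding would inflate `count` beyond `encoded.length` and fail here directly, rather than corrupting a wrapping coder downstream.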
[jira] [Updated] (BEAM-372) CoderProperties: Test that the coder doesn't consume more bytes than it produces
[ https://issues.apache.org/jira/browse/BEAM-372?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Daniel Halperin updated BEAM-372:
    Assignee: Chandni Singh (was: Daniel Halperin)
[jira] [Commented] (BEAM-372) CoderProperties: Test that the coder doesn't consume more bytes than it produces
[ https://issues.apache.org/jira/browse/BEAM-372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15352469#comment-15352469 ]

Chandni Singh commented on BEAM-372:
    Can you please assign it to me?
[jira] [Commented] (BEAM-372) CoderProperties: Test that the coder doesn't consume more bytes than it produces
[ https://issues.apache.org/jira/browse/BEAM-372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15352464#comment-15352464 ]

Daniel Halperin commented on BEAM-372:
    Sounds good to me!
[jira] [Commented] (BEAM-372) CoderProperties: Test that the coder doesn't consume more bytes than it produces
[ https://issues.apache.org/jira/browse/BEAM-372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15352461#comment-15352461 ]

Chandni Singh commented on BEAM-372:
    Can I work on this JIRA?
[jira] [Created] (BEAM-382) Audit BoundedReaders for BEAM-381
Daniel Halperin created BEAM-382:

    Summary: Audit BoundedReaders for BEAM-381
    Key: BEAM-382
    URL: https://issues.apache.org/jira/browse/BEAM-382
    Project: Beam
    Issue Type: Bug
    Components: sdk-java-core
    Reporter: Daniel Halperin
    Assignee: Daniel Halperin

Prevent similar issues.
[jira] [Created] (BEAM-381) OffsetBasedReader should construct sources before updating the range tracker
Daniel Halperin created BEAM-381:

    Summary: OffsetBasedReader should construct sources before updating the range tracker
    Key: BEAM-381
    URL: https://issues.apache.org/jira/browse/BEAM-381
    Project: Beam
    Issue Type: Bug
    Components: sdk-java-core
    Affects Versions: 0.1.0-incubating, 0.2.0-incubating
    Reporter: Daniel Halperin
    Assignee: Daniel Halperin

OffsetBasedReader has the following code:

{code}
if (!rangeTracker.trySplitAtPosition(splitOffset)) {
  return null;
}
long start = source.getStartOffset();
long end = source.getEndOffset();
OffsetBasedSource primary = source.createSourceForSubrange(start, splitOffset);
OffsetBasedSource residual = source.createSourceForSubrange(splitOffset, end);
this.source = primary;
return residual;
{code}

The first line is the one that updates the range of this source. However, subsequent lines might throw (specifically, in source.createSourceForSubrange). We should construct the sources first, and then catch exceptions and return null if they fail. This way, the splitAtFraction call will not throw (so work is not wasted), and the range tracker will not be updated if either the primary or (more likely) the residual could not be created.
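A minimal sketch of the proposed reordering, using hypothetical stand-in types (`Tracker`, `SourceFactory`, `splitAtOffset`) rather than the real OffsetRangeTracker/OffsetBasedSource classes:

```java
public class SplitOrderingSketch {

  /** Minimal stand-in for Beam's range tracker; illustrative only. */
  static class Tracker {
    long stopOffset;

    Tracker(long stopOffset) { this.stopOffset = stopOffset; }

    boolean trySplitAtPosition(long pos) {
      if (pos >= stopOffset) {
        return false;
      }
      stopOffset = pos;  // the irreversible state change the bug is about
      return true;
    }
  }

  /** Stand-in for createSourceForSubrange, which may throw. */
  interface SourceFactory {
    String createSourceForSubrange(long start, long end);
  }

  static final long START = 0;
  static final long END = 100;

  /**
   * Fixed ordering: construct both sub-sources first; only mutate the
   * tracker once construction has succeeded. A failure leaves the range
   * tracker untouched, so the split simply returns null.
   */
  static String splitAtOffset(Tracker tracker, SourceFactory factory, long splitOffset) {
    String primary;
    String residual;
    try {
      primary = factory.createSourceForSubrange(START, splitOffset);
      residual = factory.createSourceForSubrange(splitOffset, END);
    } catch (RuntimeException e) {
      return null;  // nothing was modified; the split did not happen
    }
    if (!tracker.trySplitAtPosition(splitOffset)) {
      return null;
    }
    // In the real reader: this.source = primary;
    return residual;
  }

  public static void main(String[] args) {
    Tracker tracker = new Tracker(END);
    // A factory that always throws: the tracker must keep its old stop offset.
    String result = splitAtOffset(
        tracker, (s, e) -> { throw new IllegalArgumentException("bad subrange"); }, 50);
    if (result != null || tracker.stopOffset != END) {
      throw new AssertionError("tracker mutated despite failed split");
    }
    System.out.println("tracker unchanged after failed split: " + tracker.stopOffset);
  }
}
```

The point of the ordering is atomicity: either both sub-sources exist and the tracker is narrowed, or neither happens.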
[GitHub] incubator-beam pull request #546: Fix NPE in UnboundedReadFromBoundedSource
GitHub user peihe opened a pull request:

    https://github.com/apache/incubator-beam/pull/546

    Fix NPE in UnboundedReadFromBoundedSource

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/peihe/incubator-beam fix-unbounded-read

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-beam/pull/546.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #546

commit bb59187d1ec63f0e016286d926245a0ef440eec9
Author: Pei He
Date: 2016-06-28T01:21:37Z

    Fix NPE in UnboundedReadFromBoundedSource

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.
---
[GitHub] incubator-beam pull request #545: Enables more lint rules for Python SDK
GitHub user aaltay opened a pull request:

    https://github.com/apache/incubator-beam/pull/545

    Enables more lint rules for Python SDK

    - Fixes some import-related warnings and enables the related pylint rules.
    - Replaces (or removes) invalid pylint directives in the g-* form.
    - Uses pep8 for blank-line style checks. pylint does not cover these pep8 checks, and pep8 is not configurable for indentation, so we need both tools.
    - Fixes existing lint errors related to blank lines.

    Notes:

    * I excluded 3 files because of the following usage:

    ```
    T = TypeVariable('T')
    with_input_types(T)
    @with_output_types(List[T])
    class TopCombineFn(core.CombineFn):
    ```

    Here pep8 style requires 2 blank lines between the first and second lines, even though they are logically connected, and there is no option to suppress this error locally.

    * These tools verify that imports follow the style defined in https://www.python.org/dev/peps/pep-0008/#imports . This guideline requires imports to be grouped as standard library / third party / local imports, which does not necessarily result in an alphabetic import order.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/aaltay/incubator-beam morelint

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-beam/pull/545.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #545

commit 520d2c9f9c5c6a5771879b9b70a226242ef8cfff
Author: Ahmet Altay
Date: 2016-06-28T00:59:35Z

    Enables more linting rules.
[jira] [Commented] (BEAM-193) Port existing Dataflow SDK documentation to Beam Programming Guide
[ https://issues.apache.org/jira/browse/BEAM-193?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15352058#comment-15352058 ]

ASF GitHub Bot commented on BEAM-193:
    Github user devin-donnelly closed the pull request at:
    https://github.com/apache/incubator-beam-site/pull/17

> Port existing Dataflow SDK documentation to Beam Programming Guide
>
>     Key: BEAM-193
>     URL: https://issues.apache.org/jira/browse/BEAM-193
>     Project: Beam
>     Issue Type: Task
>     Components: website
>     Reporter: Devin Donnelly
>     Assignee: Devin Donnelly
>
> There is an extensive amount of documentation on the Dataflow SDK programming
> model and classes. Port this documentation over as a new Beam Programming
> Guide covering the following major topics:
> - Programming model overview
> - Pipeline structure
> - PCollections
> - Transforms
> - I/O
[GitHub] incubator-beam-site pull request #17: BEAM-193 In-progress draft of Beam Pro...
Github user devin-donnelly closed the pull request at:

    https://github.com/apache/incubator-beam-site/pull/17
[GitHub] incubator-beam pull request #544: Explicitly set UseDummyRunner in IO, Exten...
GitHub user tgroh opened a pull request:

    https://github.com/apache/incubator-beam/pull/544

    Explicitly set UseDummyRunner in IO, Extensions

    Be sure to do all of the following to help us incorporate your contribution quickly and easily:

    - [ ] Make sure the PR title is formatted like: `[BEAM-] Description of pull request`
    - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes).
    - [ ] Replace `` in the title with the actual Jira issue number, if there is one.
    - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).

    ---

    This mitigates test failures due to forkCount=0 in travis.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tgroh/incubator-beam dummy-runner-false

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-beam/pull/544.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #544

commit ef6ee7cf32ec9b4d91d278ce175c100c4d3bb5bf
Author: Thomas Groh
Date: 2016-06-27T22:11:58Z

    Explicitly set UseDummyRunner in IO, Extensions

    This mitigates test failures due to forkCount=0 in travis.
[jira] [Commented] (BEAM-354) Modify DatastoreIO to use Datastore v1beta3 API
[ https://issues.apache.org/jira/browse/BEAM-354?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351789#comment-15351789 ]

ASF GitHub Bot commented on BEAM-354:
    GitHub user vikkyrk opened a pull request:
    https://github.com/apache/incubator-beam/pull/543
    BEAM-354: Add Read/Write PTransforms to DatastoreIO

    Add Read/Write PTransforms for DatastoreIO and hide the source/sink implementations from being used directly.

> Modify DatastoreIO to use Datastore v1beta3 API
>
>     Key: BEAM-354
>     URL: https://issues.apache.org/jira/browse/BEAM-354
>     Project: Beam
>     Issue Type: Improvement
>     Components: sdk-java-gcp
>     Reporter: Vikas Kedigehalli
>     Assignee: Vikas Kedigehalli
>
> The Datastore v1beta2 API is getting deprecated in favor of v1beta3, so
> DatastoreIO needs to be migrated to use the new version. This is also a good
> time to add a level of indirection via a PTransform, so that future changes
> in the Datastore API will not require changing user/pipeline code.
[GitHub] incubator-beam pull request #543: BEAM-354: Add Read/Write PTransforms to Da...
GitHub user vikkyrk opened a pull request:

    https://github.com/apache/incubator-beam/pull/543

    BEAM-354: Add Read/Write PTransforms to DatastoreIO

    Add Read/Write PTransforms for DatastoreIO and hide the source/sink implementations from being used directly.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vikkyrk/incubator-beam vikasrk/datastore-ptransform

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-beam/pull/543.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #543
[GitHub] incubator-beam pull request #538: Pylint integration for Python SDK
Github user aaltay closed the pull request at:

    https://github.com/apache/incubator-beam/pull/538
[jira] [Commented] (BEAM-59) IOChannelFactory rethinking/redesign
[ https://issues.apache.org/jira/browse/BEAM-59?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351692#comment-15351692 ]

Amit Sela commented on BEAM-59:

I've recently tried to work with IOChannelFactory, and while I don't have anything smart enough to say towards a solution, I hope that providing my point of view will contribute somehow.

As mentioned in https://github.com/apache/incubator-beam/pull/539, the default validation behaviour prevents me from using TextIO with HDFS without explicitly stating withoutValidation() - it only supports File or GS - and HDFS is extremely important when talking about Apache's Hadoop ecosystem.

When trying to help people run the Spark runner with GS, I found out I need to register it with IOChannelUtils.

Those problems exist because the Spark runner still doesn't implement the primitive Read.from, but even once it does, there is still room for higher-level abstractions such as TextIO, JsonIO, ParquetIO, etc., and I think they should be as "translatable" as possible, with minimal constraints. Not all runner authors can easily influence the runner, and sometimes a runner will work better with its own implementation.

I guess what I'm trying to say is that there is a delicate balance between having a robust IO abstraction and making runner authors' lives easier ;-)

Hope this helps, and hope to go into this deeper sometime soon.

> IOChannelFactory rethinking/redesign
>
>     Key: BEAM-59
>     URL: https://issues.apache.org/jira/browse/BEAM-59
>     Project: Beam
>     Issue Type: New Feature
>     Components: sdk-java-core, sdk-java-gcp
>     Reporter: Daniel Halperin
>
> Right now, FileBasedSource and FileBasedSink communication is mediated by
> IOChannelFactory. There are a number of issues:
> * Global configuration - e.g., all 'gs://' URIs use the same credentials.
>   This should be per-source/per-sink/etc.
> * Supported APIs - currently IOChannelFactory is in the "non-public API"
>   util package and subject to change. We need users to be able to add new
>   backends ('s3://', 'hdfs://', etc.) directly, without fear that they will
>   be broken.
> * Per-backend features: e.g., creating buckets in GCS/s3, setting expiration
>   time, etc.
[GitHub] incubator-beam pull request #536: Static import Preconditions.checkX everywh...
Github user asfgit closed the pull request at:

    https://github.com/apache/incubator-beam/pull/536
[1/4] incubator-beam git commit: Closes #536
Repository: incubator-beam
Updated Branches:
  refs/heads/master fc803f60e -> 7d767056a

Closes #536

Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/7d767056
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/7d767056
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/7d767056
Branch: refs/heads/master
Commit: 7d767056a90e769eff68d4347e1b3a7bc43f415c
Parents: fc803f6 02133b6
Author: Dan Halperin
Authored: Mon Jun 27 12:38:02 2016 -0700
Committer: Dan Halperin
Committed: Mon Jun 27 12:38:02 2016 -0700
--
 .../beam/examples/complete/AutoComplete.java       |  4 +-
 .../complete/game/injector/InjectorUtils.java      |  6 +--
 .../injector/RetryHttpInitializerWrapper.java      |  5 +-
 .../beam/sdk/util/BatchTimerInternals.java         |  7 +--
 .../apache/beam/sdk/util/DoFnRunnerBase.java       | 15 +++---
 .../apache/beam/sdk/util/PaneInfoTracker.java      | 11 +++--
 .../beam/sdk/util/ReduceFnContextFactory.java      |  6 +--
 .../apache/beam/sdk/util/ReduceFnRunner.java       | 26 +-
 .../org/apache/beam/sdk/util/TriggerRunner.java    |  5 +-
 .../org/apache/beam/sdk/util/WatermarkHold.java    | 19
 .../beam/sdk/util/ReduceFnRunnerTest.java          | 18 +++
 .../apache/beam/sdk/util/ReduceFnTester.java       | 19
 .../beam/runners/direct/WatermarkManager.java      |  5 +-
 .../FlinkPipelineExecutionEnvironment.java         |  6 +--
 .../FlinkStreamingTranslationContext.java          |  8 +--
 .../functions/FlinkProcessContext.java             | 25 +-
 .../translation/types/CoderTypeInformation.java    |  6 +--
 .../utils/SerializedPipelineOptions.java           |  8 +--
 .../streaming/FlinkAbstractParDoWrapper.java       | 10 ++--
 .../FlinkGroupAlsoByWindowWrapper.java             | 23 -
 .../streaming/FlinkParDoBoundMultiWrapper.java     |  8 +--
 .../streaming/io/UnboundedFlinkSink.java           |  2 +-
 .../streaming/io/UnboundedFlinkSource.java         |  7 ++-
 .../streaming/state/FlinkStateInternals.java       |  7 +--
 .../dataflow/DataflowPipelineTranslator.java       | 15 +++---
 .../beam/runners/dataflow/DataflowRunner.java      | 19
 .../runners/dataflow/internal/IsmFormat.java       |  8 +--
 .../options/DataflowWorkerLoggingOptions.java      | 14 +++---
 .../dataflow/util/DataflowPathValidator.java       | 14 +++---
 .../beam/runners/dataflow/util/GcsStager.java      |  5 +-
 .../beam/runners/spark/io/CreateStream.java        |  7 +--
 .../apache/beam/runners/spark/io/KafkaIO.java      | 23 -
 .../beam/runners/spark/io/hadoop/HadoopIO.java     | 38 +++
 .../main/java/org/apache/beam/sdk/Pipeline.java    |  7 +--
 .../java/org/apache/beam/sdk/coders/Coder.java     |  6 +--
 .../apache/beam/sdk/coders/CoderRegistry.java      |  5 +-
 .../apache/beam/sdk/coders/CollectionCoder.java    |  7 ++-
 .../apache/beam/sdk/coders/IterableCoder.java      |  7 ++-
 .../beam/sdk/coders/IterableLikeCoder.java         | 10 ++--
 .../org/apache/beam/sdk/coders/KvCoder.java        |  7 ++-
 .../org/apache/beam/sdk/coders/ListCoder.java      |  7 ++-
 .../org/apache/beam/sdk/coders/MapCoder.java       |  6 +--
 .../apache/beam/sdk/coders/NullableCoder.java      |  6 +--
 .../org/apache/beam/sdk/coders/SetCoder.java       |  7 ++-
 .../java/org/apache/beam/sdk/io/AvroIO.java        |  6 +--
 .../java/org/apache/beam/sdk/io/AvroSource.java    | 12 ++---
 .../apache/beam/sdk/io/CompressedSource.java       | 10 ++--
 .../org/apache/beam/sdk/io/DatastoreIO.java        |  7 ++-
 .../org/apache/beam/sdk/io/FileBasedSink.java      | 13 +++--
 .../apache/beam/sdk/io/OffsetBasedSource.java      | 12 ++---
 .../java/org/apache/beam/sdk/io/TextIO.java        |  6 +--
 .../java/org/apache/beam/sdk/io/XmlSink.java       | 10 ++--
 .../java/org/apache/beam/sdk/io/XmlSource.java     | 11 +++--
 .../sdk/options/PipelineOptionsFactory.java        | 28 +--
 .../sdk/options/PipelineOptionsValidator.java      | 18 ---
 .../sdk/options/ProxyInvocationHandler.java        |  9 ++--
 .../apache/beam/sdk/runners/PipelineRunner.java    |  6 +--
 .../beam/sdk/runners/TransformHierarchy.java       | 10 ++--
 .../beam/sdk/runners/TransformTreeNode.java        | 13 ++---
 .../sdk/transforms/ApproximateQuantiles.java       |  9 ++--
 .../org/apache/beam/sdk/transforms/Combine.java    |  8 +--
 .../org/apache/beam/sdk/transforms/Create.java     |  5 +-
 .../transforms/IntraBundleParallelization.java     |  9 ++--
 .../org/apache/beam/sdk/transforms/Sample.java     |  6 +--
 .../org/apache/beam/sdk/transforms/Top.java        |  7 ++-
 .../sdk/transforms/display/DisplayData.java        |  4 +-
 .../beam/sdk/transforms/join/CoGbkResult.java      |  6 +--
 .../beam/sdk/transforms/windowing/AfterAll.java    |  6 +--
 .../sdk/transforms/windowing/AfterFirst.java       |  9 ++--
 .../beam/sdk/transforms/windowing/PaneInfo.java    | 11 +++--
 ...AttemptAndTimeBoundedExponentialBackOff.java    | 11 +++--
 .../util/AttemptBoundedExponentialBackOff.java     |  9 ++--
 .../beam/sdk/util/BigQu
[3/4] incubator-beam git commit: Static import Preconditions.checkX everywhere
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/02133b65/runners/spark/src/main/java/org/apache/beam/runners/spark/io/hadoop/HadoopIO.java
--
diff --git a/runners/spark/src/main/java/org/apache/beam/runners/spark/io/hadoop/HadoopIO.java b/runners/spark/src/main/java/org/apache/beam/runners/spark/io/hadoop/HadoopIO.java
index 00c10d4..1177a57 100644
--- a/runners/spark/src/main/java/org/apache/beam/runners/spark/io/hadoop/HadoopIO.java
+++ b/runners/spark/src/main/java/org/apache/beam/runners/spark/io/hadoop/HadoopIO.java
@@ -17,6 +17,9 @@
  */
 package org.apache.beam.runners.spark.io.hadoop;

+import static com.google.common.base.Preconditions.checkArgument;
+import static com.google.common.base.Preconditions.checkNotNull;
+
 import org.apache.beam.sdk.io.ShardNameTemplate;
 import org.apache.beam.sdk.transforms.PTransform;
 import org.apache.beam.sdk.util.WindowingStrategy;
@@ -25,8 +28,6 @@
 import org.apache.beam.sdk.values.PCollection;
 import org.apache.beam.sdk.values.PDone;
 import org.apache.beam.sdk.values.PInput;

-import com.google.common.base.Preconditions;
-
 import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
 import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
@@ -68,14 +69,10 @@
     Bound(String filepattern, Class> format, Class key, Class value) {
-      Preconditions.checkNotNull(filepattern,
-          "need to set the filepattern of an HadoopIO.Read transform");
-      Preconditions.checkNotNull(format,
-          "need to set the format class of an HadoopIO.Read transform");
-      Preconditions.checkNotNull(key,
-          "need to set the key class of an HadoopIO.Read transform");
-      Preconditions.checkNotNull(value,
-          "need to set the value class of an HadoopIO.Read transform");
+      checkNotNull(filepattern, "need to set the filepattern of an HadoopIO.Read transform");
+      checkNotNull(format, "need to set the format class of an HadoopIO.Read transform");
+      checkNotNull(key, "need to set the key class of an HadoopIO.Read transform");
+      checkNotNull(value, "need to set the value class of an HadoopIO.Read transform");
       this.filepattern = filepattern;
       this.formatClass = format;
       this.keyClass = key;
@@ -203,17 +200,16 @@
     @Override
     public PDone apply(PCollection> input) {
-      Preconditions.checkNotNull(filenamePrefix,
-          "need to set the filename prefix of an HadoopIO.Write transform");
-      Preconditions.checkNotNull(formatClass,
-          "need to set the format class of an HadoopIO.Write transform");
-      Preconditions.checkNotNull(keyClass,
-          "need to set the key class of an HadoopIO.Write transform");
-      Preconditions.checkNotNull(valueClass,
-          "need to set the value class of an HadoopIO.Write transform");
-
-      Preconditions.checkArgument(ShardNameTemplateAware.class.isAssignableFrom(formatClass),
-          "Format class must implement " + ShardNameTemplateAware.class.getName());
+      checkNotNull(
+          filenamePrefix, "need to set the filename prefix of an HadoopIO.Write transform");
+      checkNotNull(formatClass, "need to set the format class of an HadoopIO.Write transform");
+      checkNotNull(keyClass, "need to set the key class of an HadoopIO.Write transform");
+      checkNotNull(valueClass, "need to set the value class of an HadoopIO.Write transform");
+
+      checkArgument(
+          ShardNameTemplateAware.class.isAssignableFrom(formatClass),
+          "Format class must implement %s",
+          ShardNameTemplateAware.class.getName());
       return PDone.in(input.getPipeline());
     }

http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/02133b65/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
--
diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
index e264bc6..31ae2dc 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/Pipeline.java
@@ -17,6 +17,8 @@
  */
 package org.apache.beam.sdk;

+import static com.google.common.base.Preconditions.checkState;
+
 import org.apache.beam.sdk.coders.CoderRegistry;
 import org.apache.beam.sdk.io.Read;
 import org.apache.beam.sdk.options.PipelineOptions;
@@ -34,7 +36,6 @@
 import org.apache.beam.sdk.values.PInput;
 import org.apache.beam.sdk.values.POutput;
 import org.apache.beam.sdk.values.PValue;

-import com.google.common.base.Preconditions;
 import com.google.common.collect.HashMultimap;
 import co
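Beyond the static imports, the diff above switches checkArgument from eager string concatenation to a %s message template, which is only formatted when the check actually fails. A self-contained sketch of the pattern, using toy stand-ins for the Guava Preconditions methods (so no Guava dependency is needed; `describe` and its checks are illustrative only, not code from the PR):

```java
public class PreconditionsSketch {

  /** Toy stand-in for Guava's Preconditions.checkNotNull. */
  static <T> T checkNotNull(T ref, String message) {
    if (ref == null) {
      throw new NullPointerException(message);
    }
    return ref;
  }

  /** Toy stand-in for Preconditions.checkArgument with a lazy %s template. */
  static void checkArgument(boolean condition, String template, Object arg) {
    if (!condition) {
      // The message is built only when the check fails, unlike
      // "literal " + value concatenation, which is evaluated on every call.
      throw new IllegalArgumentException(String.format(template, arg));
    }
  }

  static String describe(String filepattern, Class<?> formatClass) {
    // With static imports, call sites read as bare checkNotNull/checkArgument.
    checkNotNull(filepattern, "need to set the filepattern of an HadoopIO.Read transform");
    checkArgument(
        CharSequence.class.isAssignableFrom(formatClass),
        "Format class must implement %s",
        CharSequence.class.getName());
    return filepattern + " via " + formatClass.getSimpleName();
  }

  public static void main(String[] args) {
    System.out.println(describe("hdfs://input/*", String.class));
    try {
      describe("hdfs://input/*", Integer.class);
    } catch (IllegalArgumentException e) {
      System.out.println("rejected: " + e.getMessage());
    }
  }
}
```

The same call-site shape is what the PR applies across the codebase: shorter one-line checks, with the formatting cost deferred to the failure path.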
[4/4] incubator-beam git commit: Static import Preconditions.checkX everywhere
Static import Preconditions.checkX everywhere

Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/02133b65
Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/02133b65
Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/02133b65
Branch: refs/heads/master
Commit: 02133b6558e5180777123e22f4d41dee17aa67af
Parents: fc803f6
Author: Dan Halperin
Authored: Sun Jun 26 02:05:08 2016 -0700
Committer: Dan Halperin
Committed: Mon Jun 27 12:38:02 2016 -0700
[2/4] incubator-beam git commit: Static import Preconditions.checkX everywhere
http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/02133b65/sdks/java/core/src/main/java/org/apache/beam/sdk/util/InstanceBuilder.java -- diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/InstanceBuilder.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/InstanceBuilder.java index e5d9916..08e07ce 100644 --- a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/InstanceBuilder.java +++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/InstanceBuilder.java @@ -17,10 +17,12 @@ */ package org.apache.beam.sdk.util; +import static com.google.common.base.Preconditions.checkArgument; +import static com.google.common.base.Preconditions.checkState; + import org.apache.beam.sdk.values.TypeDescriptor; import com.google.common.base.Joiner; -import com.google.common.base.Preconditions; import java.lang.reflect.Constructor; import java.lang.reflect.InvocationTargetException; @@ -80,8 +82,7 @@ public class InstanceBuilder { */ public InstanceBuilder fromClassName(String name) throws ClassNotFoundException { -Preconditions.checkArgument(factoryClass == null, -"Class name may only be specified once"); +checkArgument(factoryClass == null, "Class name may only be specified once"); if (name.indexOf('.') == -1) { name = type.getPackage().getName() + "." + name; } @@ -114,7 +115,7 @@ public class InstanceBuilder { * Modifies and returns the {@code InstanceBuilder} for chaining. 
*/ public InstanceBuilder fromFactoryMethod(String methodName) { -Preconditions.checkArgument(this.methodName == null, +checkArgument(this.methodName == null, "Factory method name may only be specified once"); this.methodName = methodName; return this; @@ -201,18 +202,18 @@ public class InstanceBuilder { } private T buildFromMethod(Class[] types) { -Preconditions.checkState(factoryClass != null); -Preconditions.checkState(methodName != null); +checkState(factoryClass != null); +checkState(methodName != null); try { Method method = factoryClass.getDeclaredMethod(methodName, types); - Preconditions.checkState(Modifier.isStatic(method.getModifiers()), + checkState(Modifier.isStatic(method.getModifiers()), "Factory method must be a static method for " + factoryClass.getName() + "#" + method.getName() ); - Preconditions.checkState(type.isAssignableFrom(method.getReturnType()), + checkState(type.isAssignableFrom(method.getReturnType()), "Return type for " + factoryClass.getName() + "#" + method.getName() + " must be assignable to " + type.getSimpleName()); @@ -241,12 +242,12 @@ public class InstanceBuilder { } private T buildFromConstructor(Class[] types) { -Preconditions.checkState(factoryClass != null); +checkState(factoryClass != null); try { Constructor constructor = factoryClass.getDeclaredConstructor(types); - Preconditions.checkState(type.isAssignableFrom(factoryClass), + checkState(type.isAssignableFrom(factoryClass), "Instance type " + factoryClass.getName() + " must be assignable to " + type.getSimpleName()); http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/02133b65/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IntervalBoundedExponentialBackOff.java -- diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IntervalBoundedExponentialBackOff.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IntervalBoundedExponentialBackOff.java index 92ff2f0..519776a 100644 --- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IntervalBoundedExponentialBackOff.java +++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/util/IntervalBoundedExponentialBackOff.java @@ -17,8 +17,10 @@ */ package org.apache.beam.sdk.util; +import static com.google.common.base.Preconditions.checkArgument; + import com.google.api.client.util.BackOff; -import com.google.common.base.Preconditions; + /** * Implementation of {@link BackOff} that increases the back off period for each retry attempt @@ -55,10 +57,8 @@ public class IntervalBoundedExponentialBackOff implements BackOff { private int currentAttempt; public IntervalBoundedExponentialBackOff(long maximumIntervalMillis, long initialIntervalMillis) { -Preconditions.checkArgument( -maximumIntervalMillis > 0, "Maximum interval must be greater than zero."); -Preconditions.checkArgument( -initialIntervalMillis > 0, "Initial interval must be greater than zero."); +checkArgument(maximumIntervalMillis > 0, "Maximum interval must be greater than zero."); +checkArgument(initialIntervalMillis > 0, "Initial interval must be greater than zero."); this.maximumIntervalMillis
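The refactor above is purely mechanical — the argument checks behave identically, only the call sites shorten. As a self-contained sketch of the class being edited (not Beam's actual implementation; the doubling growth factor here is illustrative, and the local `checkArgument` stands in for Guava's statically imported `Preconditions.checkArgument`):

```java
import static java.lang.Math.min;

/**
 * Minimal sketch of a capped exponential back-off in the spirit of
 * IntervalBoundedExponentialBackOff. The growth factor is illustrative.
 */
class CappedBackOff {
  private final long maximumIntervalMillis;
  private final long initialIntervalMillis;
  private int currentAttempt = 0;

  CappedBackOff(long maximumIntervalMillis, long initialIntervalMillis) {
    // Same argument checks as the commit above; in Beam these are Guava's
    // statically imported Preconditions.checkArgument calls.
    checkArgument(maximumIntervalMillis > 0, "Maximum interval must be greater than zero.");
    checkArgument(initialIntervalMillis > 0, "Initial interval must be greater than zero.");
    this.maximumIntervalMillis = maximumIntervalMillis;
    this.initialIntervalMillis = initialIntervalMillis;
  }

  /** Doubles the interval on each attempt, never exceeding the maximum. */
  long nextBackOffMillis() {
    long interval = initialIntervalMillis << currentAttempt;
    currentAttempt++;
    return min(interval, maximumIntervalMillis);
  }

  // Local stand-in for com.google.common.base.Preconditions.checkArgument.
  private static void checkArgument(boolean condition, String message) {
    if (!condition) {
      throw new IllegalArgumentException(message);
    }
  }
}
```

The whole point of the commit is visible in the constructor: `checkArgument(...)` reads as a sentence once the `Preconditions.` qualifier is statically imported away.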
[GitHub] incubator-beam pull request #505: [BEAM-364] End to end integration tests fo...
Github user asfgit closed the pull request at: https://github.com/apache/incubator-beam/pull/505 --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA. ---
[1/2] incubator-beam git commit: Closes #505
Repository: incubator-beam Updated Branches: refs/heads/master 05a1d20b2 -> fc803f60e Closes #505 Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/fc803f60 Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/fc803f60 Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/fc803f60 Branch: refs/heads/master Commit: fc803f60ef2d703adfa80ebef60108a92ad9cc12 Parents: 05a1d20 76173da Author: Dan Halperin Authored: Mon Jun 27 12:36:22 2016 -0700 Committer: Dan Halperin Committed: Mon Jun 27 12:36:22 2016 -0700 -- sdks/java/io/google-cloud-platform/pom.xml | 47 + .../io/gcp/bigtable/BigtableTestOptions.java| 42 .../sdk/io/gcp/bigtable/BigtableReadIT.java | 61 ++ .../sdk/io/gcp/bigtable/BigtableWriteIT.java| 197 +++ 4 files changed, 347 insertions(+) --
[jira] [Commented] (BEAM-364) Integration Tests for Bigtable Read and Write
[ https://issues.apache.org/jira/browse/BEAM-364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351665#comment-15351665 ] ASF GitHub Bot commented on BEAM-364: - Github user asfgit closed the pull request at: https://github.com/apache/incubator-beam/pull/505 > Integration Tests for Bigtable Read and Write > - > > Key: BEAM-364 > URL: https://issues.apache.org/jira/browse/BEAM-364 > Project: Beam > Issue Type: Test > Components: sdk-java-gcp >Reporter: Ian Zhou >Assignee: Ian Zhou >Priority: Minor > > Integration tests should be added for BigtableIO.read and BigtableIO.write. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[2/2] incubator-beam git commit: Added integration tests for BigtableRead and BigtableWrite
Added integration tests for BigtableRead and BigtableWrite Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/76173da2 Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/76173da2 Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/76173da2 Branch: refs/heads/master Commit: 76173da215dc9878b9fa49cfe9a77541e2adf4be Parents: 05a1d20 Author: Ian Zhou Authored: Fri Jun 17 14:21:05 2016 -0700 Committer: Dan Halperin Committed: Mon Jun 27 12:36:22 2016 -0700 -- sdks/java/io/google-cloud-platform/pom.xml | 47 + .../io/gcp/bigtable/BigtableTestOptions.java| 42 .../sdk/io/gcp/bigtable/BigtableReadIT.java | 61 ++ .../sdk/io/gcp/bigtable/BigtableWriteIT.java| 197 +++ 4 files changed, 347 insertions(+) -- http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/76173da2/sdks/java/io/google-cloud-platform/pom.xml -- diff --git a/sdks/java/io/google-cloud-platform/pom.xml b/sdks/java/io/google-cloud-platform/pom.xml index 5786e84..bb5fd11 100644 --- a/sdks/java/io/google-cloud-platform/pom.xml +++ b/sdks/java/io/google-cloud-platform/pom.xml @@ -53,7 +53,39 @@ org.apache.maven.plugins maven-checkstyle-plugin + + + +org.apache.maven.plugins +maven-failsafe-plugin + + false + true + + + + + integration-test + verify + + + + ${integrationTestPipelineOptions} + + + + + + + + + +kr.motd.maven +os-maven-plugin +1.4.0.Final + + @@ -99,6 +131,14 @@ jsr305 + + io.netty + netty-tcnative-boringssl-static + 1.1.33.Fork13 + ${os.detected.classifier} + runtime + + org.apache.beam @@ -115,6 +155,13 @@ + org.apache.beam + beam-runners-google-cloud-dataflow-java + ${project.version} + test + + + org.hamcrest hamcrest-all test http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/76173da2/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableTestOptions.java -- diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableTestOptions.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableTestOptions.java new file mode 100644 index 000..0cd4f57 --- /dev/null +++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableTestOptions.java @@ -0,0 +1,42 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.beam.sdk.io.gcp.bigtable; + +import org.apache.beam.sdk.options.Default; +import org.apache.beam.sdk.options.Description; +import org.apache.beam.sdk.testing.TestPipelineOptions; + +/** + * Properties needed when using Bigtable with the Beam SDK. 
+ */ +public interface BigtableTestOptions extends TestPipelineOptions { + @Description("Project ID for Bigtable") + @Default.String("apache-beam-testing") + String getProjectId(); + void setProjectId(String value); + + @Description("Cluster ID for Bigtable") + @Default.String("beam-test") + String getClusterId(); + void setClusterId(String value); + + @Description("Zone ID for Bigtable") + @Default.String("us-central1-c") + String getZoneId(); + void setZoneId(String value); +} http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/76173da2/sdks/java/io/google-cloud-platform/src/test/java/org/apache/beam/sdk/io/gcp/bigtable/BigtableReadIT.java -- diff --git a/sdks/java/io/google-cloud-
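Options interfaces like `BigtableTestOptions` never get a handwritten implementation: Beam materializes them at runtime through a dynamic proxy (the `ProxyInvocationHandler` listed in the diffstat earlier), with `@Default` annotations supplying values for getters that were never set. A minimal, stdlib-only sketch of that mechanism — the annotation and lookup here are simplified stand-ins, not Beam's actual API:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

class OptionsProxyDemo {
  /** Simplified stand-in for Beam's @Default.String annotation. */
  @Retention(RetentionPolicy.RUNTIME)
  @interface DefaultString {
    String value();
  }

  /** Toy options interface mirroring the shape of BigtableTestOptions above. */
  interface DemoOptions {
    @DefaultString("apache-beam-testing")
    String getProjectId();
  }

  /** Builds an instance of the interface whose getters yield their annotated defaults. */
  static <T> T as(Class<T> iface) {
    InvocationHandler handler = (proxy, method, args) -> {
      DefaultString def = method.getAnnotation(DefaultString.class);
      return def != null ? def.value() : null;
    };
    return iface.cast(
        Proxy.newProxyInstance(iface.getClassLoader(), new Class<?>[] {iface}, handler));
  }
}
```

This is why the integration tests can declare only getters and setters: the `--projectId=...` flags in `${integrationTestPipelineOptions}` (or the annotated defaults) are resolved reflectively at run time.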
[GitHub] incubator-beam pull request #542: BEAM-379: Add support for source Ptransfor...
Github user asfgit closed the pull request at: https://github.com/apache/incubator-beam/pull/542
[jira] [Commented] (BEAM-379) DisplayDataEvaluator does not support source transforms of the form PTransform<PBegin, PCollection<T>>
[ https://issues.apache.org/jira/browse/BEAM-379?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351657#comment-15351657 ] ASF GitHub Bot commented on BEAM-379: - Github user asfgit closed the pull request at: https://github.com/apache/incubator-beam/pull/542 > DisplayDataEvaluator does not support source transforms of the form > PTransform<PBegin, PCollection<T>> > - > > Key: BEAM-379 > URL: https://issues.apache.org/jira/browse/BEAM-379 > Project: Beam > Issue Type: Bug > Components: sdk-java-core >Reporter: Vikas Kedigehalli >Assignee: Vikas Kedigehalli > > DisplayDataEvaluator > (https://github.com/apache/incubator-beam/blob/c0efe568e5291298c1394016a12e7979b37afc44/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java#L81) > takes PTransform<? super PCollection<T>, ? extends POutput>, but this > doesn't work for source transforms of the form PTransform<PBegin, > PCollection<T>>. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[1/2] incubator-beam git commit: Closes #542
Repository: incubator-beam Updated Branches: refs/heads/master 9abd0926a -> 05a1d20b2 Closes #542 Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/05a1d20b Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/05a1d20b Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/05a1d20b Branch: refs/heads/master Commit: 05a1d20b2b167d0a6fbfd6abe48863ef10ca5e3c Parents: 9abd092 8fdf434 Author: Dan Halperin Authored: Mon Jun 27 12:32:23 2016 -0700 Committer: Dan Halperin Committed: Mon Jun 27 12:32:23 2016 -0700 -- .../display/DisplayDataEvaluator.java | 29 +--- .../display/DisplayDataEvaluatorTest.java | 14 ++ 2 files changed, 39 insertions(+), 4 deletions(-) --
[2/2] incubator-beam git commit: DisplayDataEvaluator: Add support for source transforms
DisplayDataEvaluator: Add support for source transforms Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/8fdf434f Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/8fdf434f Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/8fdf434f Branch: refs/heads/master Commit: 8fdf434f18f00f5aa76c8cc9993e29f7fee3c33a Parents: 9abd092 Author: Vikas Kedigehalli Authored: Mon Jun 27 11:14:11 2016 -0700 Committer: Dan Halperin Committed: Mon Jun 27 12:32:23 2016 -0700 -- .../display/DisplayDataEvaluator.java | 29 +--- .../display/DisplayDataEvaluatorTest.java | 14 ++ 2 files changed, 39 insertions(+), 4 deletions(-) -- http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/8fdf434f/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java -- diff --git a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java index a17e06f..a78a4ad 100644 --- a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java +++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java @@ -25,6 +25,7 @@ import org.apache.beam.sdk.runners.TransformTreeNode; import org.apache.beam.sdk.testing.TestPipeline; import org.apache.beam.sdk.transforms.Create; import org.apache.beam.sdk.transforms.PTransform; +import org.apache.beam.sdk.values.PBegin; import org.apache.beam.sdk.values.PCollection; import org.apache.beam.sdk.values.POutput; @@ -79,8 +80,8 @@ public class DisplayDataEvaluator { * @return the set of {@link DisplayData} for primitive {@link PTransform PTransforms}. */ public Set displayDataForPrimitiveTransforms( -final PTransform, ? extends POutput> root, -Coder inputCoder) { + final PTransform, ? 
extends POutput> root, + Coder inputCoder) { Create.Values input = Create.of(); if (inputCoder != null) { @@ -89,9 +90,29 @@ public class DisplayDataEvaluator { Pipeline pipeline = Pipeline.create(options); pipeline - .apply(input) - .apply(root); +.apply(input) +.apply(root); +return displayDataForPipeline(pipeline, root); + } + + /** + * Traverse the specified source {@link PTransform}, collecting {@link DisplayData} registered + * on the inner primitive {@link PTransform PTransforms}. + * + * @param root The source root {@link PTransform} to traverse + * @return the set of {@link DisplayData} for primitive source {@link PTransform PTransforms}. + */ + public Set displayDataForPrimitiveSourceTransforms( + final PTransform root) { +Pipeline pipeline = Pipeline.create(options); +pipeline +.apply(root); + +return displayDataForPipeline(pipeline, root); + } + + private static Set displayDataForPipeline(Pipeline pipeline, PTransform root) { PrimitiveDisplayDataPTransformVisitor visitor = new PrimitiveDisplayDataPTransformVisitor(root); pipeline.traverseTopologically(visitor); return visitor.getPrimitivesDisplayData(); http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/8fdf434f/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluatorTest.java -- diff --git a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluatorTest.java b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluatorTest.java index 318c116..ce32b7d 100644 --- a/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluatorTest.java +++ b/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluatorTest.java @@ -23,9 +23,11 @@ import static org.hamcrest.Matchers.hasItem; import static org.hamcrest.Matchers.not; import static org.junit.Assert.assertThat; +import org.apache.beam.sdk.io.TextIO; import org.apache.beam.sdk.transforms.DoFn; import 
org.apache.beam.sdk.transforms.PTransform; import org.apache.beam.sdk.transforms.ParDo; +import org.apache.beam.sdk.values.PBegin; import org.apache.beam.sdk.values.PCollection; import org.apache.beam.sdk.values.POutput; @@ -92,4 +94,16 @@ public class DisplayDataEvaluatorTest implements Serializable { assertThat(displayData, hasItem(hasDisplayItem("foo"))); } + + @Test + public void testSourceTransform() { +PTransform myTransform = TextIO.Read +.from("foo.*") +.withoutValidation(); + +DisplayDataEvaluator evaluat
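The reason the patch needs a second entry point is purely a generics one: a parameter typed `PTransform<? super PCollection<T>, ?>` can never accept a transform whose input is `PBegin`, so root/source transforms require their own overload. A stripped-down sketch of the shape of the fix, using toy types rather than Beam's:

```java
class SourceTransformDemo {
  static class PBegin {}
  static class PCollection<T> {}

  /** Toy stand-in for Beam's PTransform&lt;InputT, OutputT&gt;. */
  interface PTransform<InputT, OutputT> {
    OutputT apply(InputT input);
  }

  /** Pre-fix shape: only transforms that consume a PCollection are accepted. */
  static <T> Object evaluate(PTransform<? super PCollection<T>, ?> root, PCollection<T> input) {
    return root.apply(input);
  }

  /** Shape of the added overload: source transforms start from PBegin. */
  static Object evaluateSource(PTransform<? super PBegin, ?> root) {
    return root.apply(new PBegin());
  }
}
```

A `PTransform<PBegin, PCollection<String>>` such as the `TextIO.Read` in the test above type-checks against `evaluateSource` but not against `evaluate`, which is exactly the gap BEAM-379 describes.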
[2/2] incubator-beam git commit: Pylint integration for Python SDK
Pylint integration for Python SDK Runs pylint on modified files under sdks/python/**/*.py. Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/f71c1ddd Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/f71c1ddd Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/f71c1ddd Branch: refs/heads/python-sdk Commit: f71c1dddb10d4a8b0f8c8ed8722e6662781d19f9 Parents: 21c819f Author: Ahmet Altay Authored: Fri Jun 24 14:04:27 2016 -0700 Committer: Dan Halperin Committed: Mon Jun 27 12:30:02 2016 -0700 -- sdks/python/.pylintrc | 174 +++ sdks/python/apache_beam/coders/coders.py| 2 +- sdks/python/apache_beam/coders/stream_test.py | 2 +- .../apache_beam/examples/streaming_wordcount.py | 2 +- sdks/python/apache_beam/examples/wordcount.py | 4 +- .../apache_beam/examples/wordcount_debugging.py | 6 +- sdks/python/apache_beam/io/bigquery.py | 2 +- .../apache_beam/transforms/combiners_test.py| 7 +- sdks/python/apache_beam/transforms/core.py | 17 +- .../apache_beam/transforms/ptransform_test.py | 43 +++-- .../apache_beam/transforms/trigger_test.py | 25 +-- .../apache_beam/typehints/trivial_inference.py | 3 +- sdks/python/apache_beam/typehints/typehints.py | 5 +- sdks/python/run_pylint.sh | 48 + sdks/python/tox.ini | 2 + 15 files changed, 292 insertions(+), 50 deletions(-) -- http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/f71c1ddd/sdks/python/.pylintrc -- diff --git a/sdks/python/.pylintrc b/sdks/python/.pylintrc new file mode 100644 index 000..edf13ee --- /dev/null +++ b/sdks/python/.pylintrc @@ -0,0 +1,174 @@ +# +#Licensed to the Apache Software Foundation (ASF) under one or more +#contributor license agreements. See the NOTICE file distributed with +#this work for additional information regarding copyright ownership. 
+#The ASF licenses this file to You under the Apache License, Version 2.0 +#(the "License"); you may not use this file except in compliance with +#the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +#Unless required by applicable law or agreed to in writing, software +#distributed under the License is distributed on an "AS IS" BASIS, +#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +#See the License for the specific language governing permissions and +#limitations under the License. +# + +[MASTER] +# Ignore auto-generated files. +ignore=clients,windmill_pb2.py,windmill_service_pb2.py + +[BASIC] +# Regular expression which should only match the name +# of functions or classes which do not require a docstring. +no-docstring-rgx=(__.*__|main) + +# Min length in lines of a function that requires a docstring. +docstring-min-length=10 + +# Regular expression which should only match correct module names. The +# leading underscore is sanctioned for private modules by Google's style +# guide. +# +# There are exceptions to the basic rule (_?[a-z][a-z0-9_]*) to cover +# requirements of Python's module system and of the presubmit framework. +module-rgx=^(_?[a-z][a-z0-9_]*)|__init__|PRESUBMIT|PRESUBMIT_unittest$ + +# Regular expression which should only match correct module level names +const-rgx=^(_?[A-Z][A-Z0-9_]*|__[a-z0-9_]+__|_?[a-z][a-z0-9_]*)$ + +# Regular expression which should only match correct class attribute +class-attribute-rgx=^(_?[A-Z][A-Z0-9_]*|__[a-z0-9_]+__|_?[a-z][a-z0-9_]*)$ + +# Regular expression which should only match correct class names +class-rgx=^_?[A-Z][a-zA-Z0-9]*$ + +# Regular expression which should only match correct function names. +# 'camel_case' and 'snake_case' group names are used for consistency of naming +# styles across functions and methods. 
+function-rgx=^(?:(?P_?[A-Z][a-zA-Z0-9]*)|(?P_?[a-z][a-z0-9_]*))$ + +# Regular expression which should only match correct method names. +# 'camel_case' and 'snake_case' group names are used for consistency of naming +# styles across functions and methods. 'exempt' indicates a name which is +# consistent with all naming styles. +method-rgx=^(?:(?P__[a-z0-9_]+__|next)|(?P_{0,2}[A-Z][a-zA-Z0-9]*)|(?P_{0,2}[a-z][a-z0-9_]*))$ + +# Regular expression which should only match correct instance attribute names +attr-rgx=^_{0,2}[a-z][a-z0-9_]*$ + +# Regular expression which should only match correct argument names +argument-rgx=^[a-z][a-z0-9_]*$ + +# Regular expression which should only match correct variable names +variable-rgx=^[a-z][a-z0-9_]*$ + +# Regular expression which should only match correct list comprehension /
[1/2] incubator-beam git commit: Closes #538
Repository: incubator-beam Updated Branches: refs/heads/python-sdk 21c819f81 -> 3464a909b Closes #538 Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/3464a909 Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/3464a909 Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/3464a909 Branch: refs/heads/python-sdk Commit: 3464a909befafb74146bfc49492b267b1da25234 Parents: 21c819f f71c1dd Author: Dan Halperin Authored: Mon Jun 27 12:30:02 2016 -0700 Committer: Dan Halperin Committed: Mon Jun 27 12:30:02 2016 -0700 -- sdks/python/.pylintrc | 174 +++ sdks/python/apache_beam/coders/coders.py| 2 +- sdks/python/apache_beam/coders/stream_test.py | 2 +- .../apache_beam/examples/streaming_wordcount.py | 2 +- sdks/python/apache_beam/examples/wordcount.py | 4 +- .../apache_beam/examples/wordcount_debugging.py | 6 +- sdks/python/apache_beam/io/bigquery.py | 2 +- .../apache_beam/transforms/combiners_test.py| 7 +- sdks/python/apache_beam/transforms/core.py | 17 +- .../apache_beam/transforms/ptransform_test.py | 43 +++-- .../apache_beam/transforms/trigger_test.py | 25 +-- .../apache_beam/typehints/trivial_inference.py | 3 +- sdks/python/apache_beam/typehints/typehints.py | 5 +- sdks/python/run_pylint.sh | 48 + sdks/python/tox.ini | 2 + 15 files changed, 292 insertions(+), 50 deletions(-) --
[jira] [Commented] (BEAM-379) DisplayDataEvaluator does not support source transforms of the form PTransform<PBegin, PCollection<T>>
[ https://issues.apache.org/jira/browse/BEAM-379?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351570#comment-15351570 ] ASF GitHub Bot commented on BEAM-379: - GitHub user vikkyrk opened a pull request: https://github.com/apache/incubator-beam/pull/542 BEAM-379: Add support for source Ptransform in DisplayDataEvaluator Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [ ] Make sure the PR title is formatted like: `[BEAM-<Jira issue #>] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue number, if there is one. - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- You can merge this pull request into a Git repository by running: $ git pull https://github.com/vikkyrk/incubator-beam vikasrk/display_evaluator Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/542.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #542 > DisplayDataEvaluator does not support source transforms of the form > PTransform<PBegin, PCollection<T>> > - > > Key: BEAM-379 > URL: https://issues.apache.org/jira/browse/BEAM-379 > Project: Beam > Issue Type: Bug > Components: sdk-java-core >Reporter: Vikas Kedigehalli >Assignee: Vikas Kedigehalli > > DisplayDataEvaluator > (https://github.com/apache/incubator-beam/blob/c0efe568e5291298c1394016a12e7979b37afc44/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java#L81) > takes PTransform<? super PCollection<T>, ? extends POutput>, but this > doesn't work for source transforms of the form PTransform<PBegin, > PCollection<T>>. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[GitHub] incubator-beam pull request #542: BEAM-379: Add support for source Ptransfor...
GitHub user vikkyrk opened a pull request: https://github.com/apache/incubator-beam/pull/542 BEAM-379: Add support for source Ptransform in DisplayDataEvaluator Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [ ] Make sure the PR title is formatted like: `[BEAM-<Jira issue #>] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue number, if there is one. - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- You can merge this pull request into a Git repository by running: $ git pull https://github.com/vikkyrk/incubator-beam vikasrk/display_evaluator Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/542.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #542
[GitHub] incubator-beam pull request #541: Add direct runner package-info
GitHub user tgroh opened a pull request: https://github.com/apache/incubator-beam/pull/541 Add direct runner package-info Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [ ] Make sure the PR title is formatted like: `[BEAM-<Jira issue #>] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue number, if there is one. - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- You can merge this pull request into a Git repository by running: $ git pull https://github.com/tgroh/incubator-beam direct_package_info Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/541.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #541 commit 2f4e27475a17d51f5a6c8914607685bfb600254a Author: Thomas Groh Date: 2016-06-14T23:16:39Z Add direct runner package-info
[jira] [Commented] (BEAM-261) Apache Apex Runner
[ https://issues.apache.org/jira/browse/BEAM-261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351558#comment-15351558 ] ASF GitHub Bot commented on BEAM-261: - GitHub user tweise opened a pull request: https://github.com/apache/incubator-beam/pull/540 [BEAM-261] Apex runner PoC Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [x] Make sure the PR title is formatted like: `[BEAM-<Jira issue #>] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [x] Replace `<Jira issue #>` in the title with the actual Jira issue number, if there is one. - [x] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- R: @kennknowles Please note that integration tests are enabled and many are still expected to fail. This is WIP. You can merge this pull request into a Git repository by running: $ git pull https://github.com/tweise/incubator-beam BEAM-261 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/540.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #540 commit 3f0de449ea7811eb9020be544a0dec0fa1794f88 Author: Thomas Weise Date: 2016-06-27T18:24:13Z BEAM-261 Apex runner PoC > Apache Apex Runner > -- > > Key: BEAM-261 > URL: https://issues.apache.org/jira/browse/BEAM-261 > Project: Beam > Issue Type: Wish > Components: runner-ideas >Reporter: Suminda Dharmasena >Assignee: Thomas Weise > > Like Spark, Flink and GearPump, Apache Apex also does have advantages. Is it > possible to have a runner for Apache Apex? -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[GitHub] incubator-beam pull request #540: [BEAM-261] Apex runner PoC
GitHub user tweise opened a pull request: https://github.com/apache/incubator-beam/pull/540 [BEAM-261] Apex runner PoC Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [x] Make sure the PR title is formatted like: `[BEAM-] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [x] Replace `` in the title with the actual Jira issue number, if there is one. - [x] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- R: @kennknowles Please note that integration tests are enabled and many are still expected to fail. This is WIP. You can merge this pull request into a Git repository by running: $ git pull https://github.com/tweise/incubator-beam BEAM-261 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/540.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #540 commit 3f0de449ea7811eb9020be544a0dec0fa1794f88 Author: Thomas Weise Date: 2016-06-27T18:24:13Z BEAM-261 Apex runner PoC
[jira] [Updated] (BEAM-329) Fix Spark runner README to use its own WordCount example
[ https://issues.apache.org/jira/browse/BEAM-329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Amit Sela updated BEAM-329: --- Assignee: Amit Sela Affects Version/s: (was: 0.1.0-incubating) 0.2.0-incubating Description: The Spark runner should use its own WordCount because it can't support TextIO.Read without ignoring validation. (was: In the Spark runner README - https://github.com/apache/incubator-beam/tree/master/runners/spark The "Running On Cluster" example doesn't work because it looks for the input on the local FS instead of HDFS. Need to check what's up with TextIO there.. If this behaviour is wanted we should fix the README, otherwise it should read from HDFS. ) Summary: Fix Spark runner README to use its own WordCount example (was: README cluster example not working with HDFS input) > Fix Spark runner README to use its own WordCount example > -- > > Key: BEAM-329 > URL: https://issues.apache.org/jira/browse/BEAM-329 > Project: Beam > Issue Type: Bug > Components: runner-spark >Affects Versions: 0.2.0-incubating >Reporter: Amit Sela >Assignee: Amit Sela >Priority: Minor > > The Spark runner should use its own WordCount because it can't support > TextIO.Read without ignoring validation. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[GitHub] incubator-beam pull request #539: [BEAM-380] Remove Spark runner dependency ...
GitHub user amitsela opened a pull request: https://github.com/apache/incubator-beam/pull/539 [BEAM-380] Remove Spark runner dependency on beam-examples-java Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [ ] Make sure the PR title is formatted like: `[BEAM-] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [ ] Replace `` in the title with the actual Jira issue number, if there is one. - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- Duplicate WordCount into spark examples package. Duplicate parts of TfIdf from beam examples. Better reuse of WordCount and its parts. Remove dependency on beam-examples-java You can merge this pull request into a Git repository by running: $ git pull https://github.com/amitsela/incubator-beam BEAM-380 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/539.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #539 commit 2b6bbc70bb1c0dd5ccedca0c6c7dc3b4b8d9fdac Author: Sela Date: 2016-06-27T17:59:07Z Remove dependency on beam-examples-java. Duplicate WordCount into spark examples package. Duplicate parts of TfIdf from beam examples. Better reuse of WordCount and its parts. Remove dependency on beam-examples-java
[jira] [Commented] (BEAM-380) Remove Spark runner dependency on beam-examples-java
[ https://issues.apache.org/jira/browse/BEAM-380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351499#comment-15351499 ] ASF GitHub Bot commented on BEAM-380: - GitHub user amitsela opened a pull request: https://github.com/apache/incubator-beam/pull/539 [BEAM-380] Remove Spark runner dependency on beam-examples-java Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [ ] Make sure the PR title is formatted like: `[BEAM-] Description of pull request` - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [ ] Replace `` in the title with the actual Jira issue number, if there is one. - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). --- Duplicate WordCount into spark examples package. Duplicate parts of TfIdf from beam examples. Better reuse of WordCount and its parts. Remove dependency on beam-examples-java You can merge this pull request into a Git repository by running: $ git pull https://github.com/amitsela/incubator-beam BEAM-380 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/539.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #539 commit 2b6bbc70bb1c0dd5ccedca0c6c7dc3b4b8d9fdac Author: Sela Date: 2016-06-27T17:59:07Z Remove dependency on beam-examples-java. Duplicate WordCount into spark examples package. Duplicate parts of TfIdf from beam examples. Better reuse of WordCount and its parts. 
Remove dependency on beam-examples-java > Remove Spark runner dependency on beam-examples-java > > > Key: BEAM-380 > URL: https://issues.apache.org/jira/browse/BEAM-380 > Project: Beam > Issue Type: Bug > Components: runner-spark >Reporter: Amit Sela >Assignee: Amit Sela >Priority: Minor > > Remove runner dependency to allow beam-examples to have runtime > dependency on runners-spark. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Created] (BEAM-380) Remove Spark runner dependency on beam-examples-java
Amit Sela created BEAM-380: -- Summary: Remove Spark runner dependency on beam-examples-java Key: BEAM-380 URL: https://issues.apache.org/jira/browse/BEAM-380 Project: Beam Issue Type: Bug Components: runner-spark Reporter: Amit Sela Assignee: Amit Sela Priority: Minor Remove runner dependency to allow beam-examples to have runtime dependency on runners-spark. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (BEAM-370) Remove the .named() methods from PTransforms and sub-classes
[ https://issues.apache.org/jira/browse/BEAM-370?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351447#comment-15351447 ] ASF GitHub Bot commented on BEAM-370: - Github user asfgit closed the pull request at: https://github.com/apache/incubator-beam/pull/529 > Remove the .named() methods from PTransforms and sub-classes > > > Key: BEAM-370 > URL: https://issues.apache.org/jira/browse/BEAM-370 > Project: Beam > Issue Type: Bug > Components: sdk-java-core >Reporter: Ben Chambers >Assignee: Ben Chambers >Priority: Minor > > 1. Update examples/tests/etc. to use named application instead of `.named()` > 2. Remove the `.named()` methods from composite PTransforms > 3. Where appropriate, use the PTransform constructor which takes a string > to use as the default name. > See further discussion in the related thread > (http://mail-archives.apache.org/mod_mbox/incubator-beam-dev/201606.mbox/%3ccan-7fgzuz1f_szzd2orfyd2pk2_prymhgwjepjpefp01h7s...@mail.gmail.com%3E). -- This message was sent by Atlassian JIRA (v6.3.4#6332)
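[Editor's note] The `.named()` removal tracked by BEAM-370 amounts to a small mechanical rewrite at every call site. A minimal before/after sketch, adapted from the WordCount changes in the accompanying commits (Beam Java SDK fragment, not runnable on its own; `p` and `options` are the usual Pipeline and PipelineOptions objects):

```java
// Before: step name attached via the deprecated .named() factory method
p.apply(TextIO.Read.named("ReadLines").from(options.getInputFile()))

// After (BEAM-370): step name passed as the first argument to apply()
p.apply("ReadLines", TextIO.Read.from(options.getInputFile()))
```

The same pattern applies to the other transforms touched by the change (Window, AvroIO, PubsubIO, BigQueryIO, Read).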
[GitHub] incubator-beam pull request #529: [BEAM-370] Remove many additional definiti...
Github user asfgit closed the pull request at: https://github.com/apache/incubator-beam/pull/529
[1/3] incubator-beam git commit: Remove many definitions of named methods
Repository: incubator-beam Updated Branches: refs/heads/master 4f580f5f1 -> 9abd0926a Remove many definitions of named methods Specifically, remove the occurrences in: - Window - AvroIO - PubsubIO - TextIO - BigQueryIO - Read Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/fc52a102 Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/fc52a102 Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/fc52a102 Branch: refs/heads/master Commit: fc52a10259cd045f4b55ec59b2ae87c02c926ed4 Parents: 5719535 Author: Ben Chambers Authored: Thu Jun 23 17:55:24 2016 -0700 Committer: Ben Chambers Committed: Sun Jun 26 10:06:35 2016 -0700 -- .../java/org/apache/beam/sdk/io/AvroIO.java | 53 --- .../java/org/apache/beam/sdk/io/BigQueryIO.java | 42 +-- .../java/org/apache/beam/sdk/io/PubsubIO.java | 35 + .../main/java/org/apache/beam/sdk/io/Read.java | 29 +-- .../java/org/apache/beam/sdk/io/TextIO.java | 55 .../org/apache/beam/sdk/io/package-info.java| 6 +-- .../beam/sdk/transforms/windowing/Window.java | 42 --- 7 files changed, 25 insertions(+), 237 deletions(-) -- http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/fc52a102/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java -- diff --git a/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java b/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java index 4b40c01..604051b 100644 --- a/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java +++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/io/AvroIO.java @@ -55,9 +55,7 @@ import javax.annotation.Nullable; * {@link AvroIO.Read}, specifying {@link AvroIO.Read#from} to specify * the path of the file(s) to read from (e.g., a local filename or * filename pattern if running locally, or a Google Cloud Storage - * filename or filename pattern of the form - * {@code "gs:///"}), and optionally - * {@link AvroIO.Read#named} to specify 
the name of the pipeline step. + * filename or filename pattern of the form {@code "gs:///"}). * * It is required to specify {@link AvroIO.Read#withSchema}. To * read specific records, such as Avro-generated classes, provide an @@ -73,15 +71,15 @@ import javax.annotation.Nullable; * // A simple Read of a local file (only runs locally): * PCollection records = * p.apply(AvroIO.Read.from("/path/to/file.avro") - *.withSchema(AvroAutoGenClass.class)); + * .withSchema(AvroAutoGenClass.class)); * * // A Read from a GCS file (runs locally and via the Google Cloud * // Dataflow service): * Schema schema = new Schema.Parser().parse(new File("schema.avsc")); * PCollection records = - * p.apply(AvroIO.Read.named("ReadFromAvro") - *.from("gs://my_bucket/path/to/records-*.avro") - *.withSchema(schema)); + * p.apply(AvroIO.Read + *.from("gs://my_bucket/path/to/records-*.avro") + *.withSchema(schema)); * } * * To write a {@link PCollection} to one or more Avro files, use @@ -110,10 +108,10 @@ import javax.annotation.Nullable; * // Dataflow service): * Schema schema = new Schema.Parser().parse(new File("schema.avsc")); * PCollection records = ...; - * records.apply(AvroIO.Write.named("WriteToAvro") - * .to("gs://my_bucket/path/to/numbers") - * .withSchema(schema) - * .withSuffix(".avro")); + * records.apply("WriteToAvro", AvroIO.Write + * .to("gs://my_bucket/path/to/numbers") + * .withSchema(schema) + * .withSuffix(".avro")); * } * * Permissions @@ -128,12 +126,6 @@ public class AvroIO { * the decoding of each record. */ public static class Read { -/** - * Returns a {@link PTransform} with the given step name. - */ -public static Bound named(String name) { - return new Bound<>(GenericRecord.class).named(name); -} /** * Returns a {@link PTransform} that reads from the file(s) @@ -223,16 +215,6 @@ public class AvroIO { /** * Returns a new {@link PTransform} that's like this one but - * with the given step name. - * - * Does not modify this object. 
- */ - public Bound named(String name) { -return new Bound<>(name, filepattern, type, schema, validate); - } - - /** - * Returns a new {@link PTransform} that's like this one but * that reads from the file(s) with the given name or pattern. * (See {@link AvroIO.Read#from} for a description of
[3/3] incubator-beam git commit: This closes #529
This closes #529 Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/9abd0926 Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/9abd0926 Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/9abd0926 Branch: refs/heads/master Commit: 9abd0926ac471c178eb30eeb01c964f9ecbd9cce Parents: 4f580f5 fc52a10 Author: bchambers Authored: Mon Jun 27 10:29:31 2016 -0700 Committer: bchambers Committed: Mon Jun 27 10:29:31 2016 -0700 -- .../beam/examples/DebuggingWordCount.java | 2 +- .../org/apache/beam/examples/WordCount.java | 4 +- .../apache/beam/examples/complete/TfIdf.java| 3 +- .../examples/complete/TopWikipediaSessions.java | 2 +- .../examples/cookbook/DatastoreWordCount.java | 2 +- .../beam/examples/cookbook/DeDupExample.java| 5 +- .../beam/examples/cookbook/TriggerExample.java | 4 +- .../beam/examples/complete/game/GameStats.java | 28 ++- .../examples/complete/game/HourlyTeamScore.java | 5 +- .../examples/complete/game/LeaderBoard.java | 8 +- .../beam/runners/flink/examples/TFIDF.java | 3 +- .../beam/runners/flink/examples/WordCount.java | 4 +- .../flink/examples/streaming/AutoComplete.java | 9 +- .../flink/examples/streaming/JoinExamples.java | 13 +- .../KafkaWindowedWordCountExample.java | 2 +- .../examples/streaming/WindowedWordCount.java | 3 +- .../beam/runners/dataflow/DataflowRunner.java | 2 +- .../DataflowPipelineTranslatorTest.java | 6 +- .../runners/dataflow/DataflowRunnerTest.java| 18 +- .../beam/runners/spark/SimpleWordCountTest.java | 3 +- .../java/org/apache/beam/sdk/io/AvroIO.java | 53 + .../java/org/apache/beam/sdk/io/BigQueryIO.java | 43 + .../java/org/apache/beam/sdk/io/PubsubIO.java | 35 +--- .../main/java/org/apache/beam/sdk/io/Read.java | 29 +-- .../java/org/apache/beam/sdk/io/TextIO.java | 55 +- .../org/apache/beam/sdk/io/package-info.java| 6 +- .../beam/sdk/transforms/windowing/Window.java | 42 .../beam/sdk/io/AvroIOGeneratedClassTest.java 
| 192 +-- .../java/org/apache/beam/sdk/io/AvroIOTest.java | 5 +- .../org/apache/beam/sdk/io/BigQueryIOTest.java | 82 +++- .../apache/beam/sdk/io/FileBasedSourceTest.java | 5 +- .../org/apache/beam/sdk/io/PubsubIOTest.java| 4 - .../java/org/apache/beam/sdk/io/TextIOTest.java | 37 +--- .../org/apache/beam/sdk/io/XmlSourceTest.java | 19 +- .../beam/sdk/runners/TransformTreeTest.java | 4 +- .../sdk/transforms/windowing/WindowTest.java| 6 +- .../sdk/transforms/windowing/WindowingTest.java | 2 +- .../src/main/java/DebuggingWordCount.java | 2 +- .../src/main/java/WordCount.java| 4 +- 39 files changed, 183 insertions(+), 568 deletions(-) --
[2/3] incubator-beam git commit: Remove many uses of .named methods
Remove many uses of .named methods Specifically, remove uses of: - Window.named - AvroIO.named - PubSubIO.named - TextIO.named - BigQueryIO.named - Read.named Project: http://git-wip-us.apache.org/repos/asf/incubator-beam/repo Commit: http://git-wip-us.apache.org/repos/asf/incubator-beam/commit/57195358 Tree: http://git-wip-us.apache.org/repos/asf/incubator-beam/tree/57195358 Diff: http://git-wip-us.apache.org/repos/asf/incubator-beam/diff/57195358 Branch: refs/heads/master Commit: 57195358592548e6f7e05bc8e4e292b126a726c5 Parents: 4f580f5 Author: Ben Chambers Authored: Thu Jun 23 22:27:05 2016 -0700 Committer: Ben Chambers Committed: Sun Jun 26 10:06:35 2016 -0700 -- .../beam/examples/DebuggingWordCount.java | 2 +- .../org/apache/beam/examples/WordCount.java | 4 +- .../apache/beam/examples/complete/TfIdf.java| 3 +- .../examples/complete/TopWikipediaSessions.java | 2 +- .../examples/cookbook/DatastoreWordCount.java | 2 +- .../beam/examples/cookbook/DeDupExample.java| 5 +- .../beam/examples/cookbook/TriggerExample.java | 4 +- .../beam/examples/complete/game/GameStats.java | 28 ++- .../examples/complete/game/HourlyTeamScore.java | 5 +- .../examples/complete/game/LeaderBoard.java | 8 +- .../beam/runners/flink/examples/TFIDF.java | 3 +- .../beam/runners/flink/examples/WordCount.java | 4 +- .../flink/examples/streaming/AutoComplete.java | 9 +- .../flink/examples/streaming/JoinExamples.java | 13 +- .../KafkaWindowedWordCountExample.java | 2 +- .../examples/streaming/WindowedWordCount.java | 3 +- .../beam/runners/dataflow/DataflowRunner.java | 2 +- .../DataflowPipelineTranslatorTest.java | 6 +- .../runners/dataflow/DataflowRunnerTest.java| 18 +- .../beam/runners/spark/SimpleWordCountTest.java | 3 +- .../java/org/apache/beam/sdk/io/BigQueryIO.java | 1 - .../beam/sdk/io/AvroIOGeneratedClassTest.java | 192 +-- .../java/org/apache/beam/sdk/io/AvroIOTest.java | 5 +- .../org/apache/beam/sdk/io/BigQueryIOTest.java | 82 +++- .../apache/beam/sdk/io/FileBasedSourceTest.java | 5 +- 
.../org/apache/beam/sdk/io/PubsubIOTest.java| 4 - .../java/org/apache/beam/sdk/io/TextIOTest.java | 37 +--- .../org/apache/beam/sdk/io/XmlSourceTest.java | 19 +- .../beam/sdk/runners/TransformTreeTest.java | 4 +- .../sdk/transforms/windowing/WindowTest.java| 6 +- .../sdk/transforms/windowing/WindowingTest.java | 2 +- .../src/main/java/DebuggingWordCount.java | 2 +- .../src/main/java/WordCount.java| 4 +- 33 files changed, 158 insertions(+), 331 deletions(-) -- http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/57195358/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java -- diff --git a/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java b/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java index 85823c2..8d85d44 100644 --- a/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java +++ b/examples/java/src/main/java/org/apache/beam/examples/DebuggingWordCount.java @@ -173,7 +173,7 @@ public class DebuggingWordCount { Pipeline p = Pipeline.create(options); PCollection> filteredWords = -p.apply(TextIO.Read.named("ReadLines").from(options.getInputFile())) +p.apply("ReadLines", TextIO.Read.from(options.getInputFile())) .apply(new WordCount.CountWords()) .apply(ParDo.of(new FilterTextFn(options.getFilterPattern(; http://git-wip-us.apache.org/repos/asf/incubator-beam/blob/57195358/examples/java/src/main/java/org/apache/beam/examples/WordCount.java -- diff --git a/examples/java/src/main/java/org/apache/beam/examples/WordCount.java b/examples/java/src/main/java/org/apache/beam/examples/WordCount.java index cf6c45a..af16c44 100644 --- a/examples/java/src/main/java/org/apache/beam/examples/WordCount.java +++ b/examples/java/src/main/java/org/apache/beam/examples/WordCount.java @@ -205,10 +205,10 @@ public class WordCount { // Concepts #2 and #3: Our pipeline applies the composite CountWords transform, and passes the // static FormatAsTextFn() to the ParDo transform. 
-p.apply(TextIO.Read.named("ReadLines").from(options.getInputFile())) +p.apply("ReadLines", TextIO.Read.from(options.getInputFile())) .apply(new CountWords()) .apply(MapElements.via(new FormatAsTextFn())) - .apply(TextIO.Write.named("WriteCounts").to(options.getOutput())); + .apply("WriteCounts", TextIO.Write.to(options.getOutput())); p.run(); } http://git-wip-us.apache.org/repos/asf/incubator
[GitHub] incubator-beam pull request #538: Pylint integration for Python SDK
GitHub user aaltay opened a pull request: https://github.com/apache/incubator-beam/pull/538 Pylint integration for Python SDK This change introduces a pylint rule to keep the code base in a consistent state. Pylint configuration is heavily influenced by the existing standards in the code. Even with this there were multiple pylint issues. In order to make this change submit blocking and also affect the others minimally I made the following choices: - Enabled line length (80 chars), and indentation rules (2 spaces, and 4 spaces after parentheses.). - Disabled all other rules. I will work on fixing violations of those and enabling them one by one. - Run pylint only on those files under sdks/python/apache_beam/**/*.py and that changed compared to python-sdk branch HEAD. Note that this is files changed, not lines changed. - Made it submit blocking. Travis will fail if there are lint errors. You can install pylint (pip install pylint) and run it locally: $ cd sdks/python To run on all files: $ pylint apache_beam To run on all changed files (Diff of last commit to python-sdk branch HEAD): $ ./run_pylint.sh You can merge this pull request into a Git repository by running: $ git pull https://github.com/aaltay/incubator-beam lint Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/538.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #538 commit 5918d7e94239de08aa7d6f50484f33c39c208ba2 Author: Ahmet Altay Date: 2016-06-24T21:04:27Z Pylint integration for Python SDK Runs pylint on modified files under sdks/python/**/*.py. commit 06380d6c5a6a9585acf1f3104992899743477c22 Author: Ahmet Altay Date: 2016-06-24T23:00:01Z Fix line-too-long pylint warnings. commit 7e4d01f48baef9e2c734704f446bd3708198280f Author: Ahmet Altay Date: 2016-06-24T23:32:46Z Fix bad-continuation lint warnings. 
commit 6d55a33903ab19f2d97908b0003758d16dc119da Author: Ahmet Altay Date: 2016-06-27T16:57:18Z Clean up pylintrc file
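[Editor's note] A rough sketch of what a `run_pylint.sh`-style helper could look like under the approach described in this PR. The branch name and file filter follow the PR description; the changed-file list is stubbed here for illustration instead of being computed from `git diff --name-only python-sdk...HEAD`, and the actual script in the PR may differ:

```shell
#!/usr/bin/env bash
set -e

# Stub of the changed-file list; a real run would use:
#   CHANGED=$(git diff --name-only python-sdk...HEAD)
CHANGED="sdks/python/apache_beam/pipeline.py
README.md
sdks/python/apache_beam/transforms/core.py"

# Lint only Python files under the SDK tree (files changed, not lines changed).
LINT_TARGETS=$(echo "$CHANGED" | grep '^sdks/python/apache_beam/.*\.py$' || true)
echo "Would run: pylint $LINT_TARGETS"
```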
[jira] [Created] (BEAM-379) DisplayDataEvaluator does not support source transforms of the form PTransform
Vikas Kedigehalli created BEAM-379: -- Summary: DisplayDataEvaluator does not support source transforms of the form PTransform Key: BEAM-379 URL: https://issues.apache.org/jira/browse/BEAM-379 Project: Beam Issue Type: Bug Components: sdk-java-core Reporter: Vikas Kedigehalli Assignee: Vikas Kedigehalli DisplayDataEvaluator (https://github.com/apache/incubator-beam/blob/c0efe568e5291298c1394016a12e7979b37afc44/sdks/java/core/src/test/java/org/apache/beam/sdk/transforms/display/DisplayDataEvaluator.java#L81) takes PTransform, ? extends POutput>, but this doesn't work for source transforms of the form PTransform>. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (BEAM-378) Integrate Python SDK in the Maven build
[ https://issues.apache.org/jira/browse/BEAM-378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15350687#comment-15350687 ] ASF GitHub Bot commented on BEAM-378: - GitHub user wikier opened a pull request: https://github.com/apache/incubator-beam/pull/537 BEAM-378: integrate setuptools in Maven build This PR provides an initial integration of the Python SDK in the Maven build, relying on the `exec-maven-plugin` to call `setuptools`. Notice I keep out `install` on purpose, because I guess it may need discussion. --- Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [x] Make sure the PR title is formatted like: `[BEAM-] Description of pull request` - [x] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [x] Replace `` in the title with the actual Jira issue number, if there is one. - [x] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). 
You can merge this pull request into a Git repository by running: $ git pull https://github.com/wikier/incubator-beam python-sdk Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/537.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #537 commit 427be241cd8981ad426c20ea0c7d1fd36d436aa1 Author: Sergio Fernández Date: 2016-06-27T08:00:39Z used exec-maven-plugin to map maven's lifecycle phases with python's setuptools commit 23f42c31029b70214e62b50d341de65a08a96eae Author: Sergio Fernández Date: 2016-06-27T08:22:15Z synced version from maven commit 061ef0a5914f658c0754a49ccfd2089f38008df7 Author: Sergio Fernández Date: 2016-06-27T08:32:12Z synced version from maven commit 98ca6f813904621957bd38b68e2a9d19202431e7 Author: Sergio Fernández Date: 2016-06-27T08:33:34Z merged commit 2acaed2b87d6a5b0e3e2416c73b32e6eb18b9dd6 Author: Sergio Fernández Date: 2016-06-27T08:59:00Z mapped package phase commit 43bcc7a899049a54bff2612a9607a61f4ddbec31 Author: Sergio Fernández Date: 2016-06-27T09:00:41Z fixed author metadata > Integrate Python SDK in the Maven build > --- > > Key: BEAM-378 > URL: https://issues.apache.org/jira/browse/BEAM-378 > Project: Beam > Issue Type: Wish > Components: sdk-py >Affects Versions: 0.2.0-incubating >Reporter: Sergio Fernández >Assignee: Frances Perry >Priority: Minor > > It'd be great to have the Python SDK integrated in the Maven build. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
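[Editor's note] The integration strategy this PR describes, binding `setuptools` commands to Maven lifecycle phases through `exec-maven-plugin`, could look roughly like the following `pom.xml` fragment. This is a hedged sketch only: the phase binding, execution id, and arguments are assumptions for illustration, not the PR's actual configuration:

```xml
<!-- Sketch: run "python setup.py build" during Maven's compile phase.
     Further <execution> entries could map test and package the same way. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>setuptools-build</id>
      <phase>compile</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>python</executable>
        <workingDirectory>${project.basedir}</workingDirectory>
        <arguments>
          <argument>setup.py</argument>
          <argument>build</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
```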
[GitHub] incubator-beam pull request #537: BEAM-378: integrate setuptools in Maven bu...
GitHub user wikier opened a pull request: https://github.com/apache/incubator-beam/pull/537 BEAM-378: integrate setuptools in Maven build This PR provides an initial integration of the Python SDK in the Maven build, relying on the `exec-maven-plugin` to call `setuptools`. Notice I keep out `install` on purpose, because I guess it may need discussion. --- Be sure to do all of the following to help us incorporate your contribution quickly and easily: - [x] Make sure the PR title is formatted like: `[BEAM-] Description of pull request` - [x] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes). - [x] Replace `` in the title with the actual Jira issue number, if there is one. - [x] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt). You can merge this pull request into a Git repository by running: $ git pull https://github.com/wikier/incubator-beam python-sdk Alternatively you can review and apply these changes as the patch at: https://github.com/apache/incubator-beam/pull/537.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #537 commit 427be241cd8981ad426c20ea0c7d1fd36d436aa1 Author: Sergio Fernández Date: 2016-06-27T08:00:39Z used exec-maven-plugin to map maven's lifecycle phases with python's setuptools commit 23f42c31029b70214e62b50d341de65a08a96eae Author: Sergio Fernández Date: 2016-06-27T08:22:15Z synced version from maven commit 061ef0a5914f658c0754a49ccfd2089f38008df7 Author: Sergio Fernández Date: 2016-06-27T08:32:12Z synced version from maven commit 98ca6f813904621957bd38b68e2a9d19202431e7 Author: Sergio Fernández Date: 2016-06-27T08:33:34Z merged commit 2acaed2b87d6a5b0e3e2416c73b32e6eb18b9dd6 Author: Sergio Fernández Date: 2016-06-27T08:59:00Z mapped package phase commit 
43bcc7a899049a54bff2612a9607a61f4ddbec31 Author: Sergio Fernández Date: 2016-06-27T09:00:41Z fixed author metadata
[jira] [Created] (BEAM-378) Integrate Python SDK in the Maven build
Sergio Fernández created BEAM-378: - Summary: Integrate Python SDK in the Maven build Key: BEAM-378 URL: https://issues.apache.org/jira/browse/BEAM-378 Project: Beam Issue Type: Wish Components: sdk-py Affects Versions: 0.2.0-incubating Reporter: Sergio Fernández Assignee: Frances Perry Priority: Minor It'd be great to have the Python SDK integrated in the Maven build. -- This message was sent by Atlassian JIRA (v6.3.4#6332)