Re: New contributor to BEAM SQL

2019-09-16 Thread Lukasz Cwik
Welcome, Kirill. I have granted you the JIRA permissions you requested.

On Mon, Sep 16, 2019 at 10:59 AM Kirill Kozlov 
wrote:

> Hello everyone!
>
> My name is Kirill Kozlov, I recently joined a Dataflow team at Google and
> will be working on SQL filter pushdown.
> Can I get permission to work on issues in Jira? My username is: kirillkozlov
> Looking forward to developing Beam together!
>
> Thank you,
> Kirill Kozlov
>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Mark Liu
Thank you Hannah!

BTW, the fix is https://github.com/apache/beam/pull/9588.
Since this affects the release, https://github.com/apache/beam/pull/9595 will
be cherry-picked to the release branch.

On Mon, Sep 16, 2019 at 9:47 PM Hannah Jiang  wrote:

> The fix is merged. I tested with PRs which used to fail and the failures
> are fixed now.
> Please rerun the test if your PR is affected.
>
> On Mon, Sep 16, 2019 at 6:16 PM Mark Liu  wrote:
>
>> Thanks for letting me know. I'll keep tracking on this issue since it's a
>> release blocker. Please update here/jira if you have any progress.
>>
>> Mark
>>
>> On Mon, Sep 16, 2019 at 5:04 PM Hannah Jiang 
>> wrote:
>>
>>> For issue with flink image, I re-opened a ticket which is currently
>>> blocking release.(BEAM-8165)
>>>
>>> On Mon, Sep 16, 2019 at 5:00 PM Ahmet Altay  wrote:
>>>


 On Mon, Sep 16, 2019 at 2:07 PM Kyle Weaver 
 wrote:

> The original issue ("GetJobMetrics is unimplemented") is still
> probably hiding under the docker issue, so I filed
> https://issues.apache.org/jira/browse/BEAM-8245 for it
>

 Thank you. Who should be assigned to this issue?

 If metrics are not supported in all runners, should we update tests to
 make this metrics checks optional?


>
> Kyle Weaver | Software Engineer | github.com/ibzib |
> kcwea...@google.com
>
>
> On Mon, Sep 16, 2019 at 1:14 PM Hannah Jiang 
> wrote:
>
>> If we try to create a flink image from master branch, it will create
>> *apachebeam/flink-job-server*, while the code is expecting 
>> *jenkins-docker-apache.bintray.io/beam/flink-job-server:latest
>> *
>> .
>> This should be introduced when we cut 2.16.0 branch, I will
>> investigate more to see how to fix it.
>>
>
 Do we need a release blocking JIRA for this?

 /cc +Mark Liu 


>
>> On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:
>>
>>> I'm also being impacted by this on my PR[1]. I found BEAM-6316[2]
>>> that has a similar error but it was resolved Dec 2018.
>>>
>>> 1: https://github.com/apache/beam/pull/9583
>>> 2: https://issues.apache.org/jira/browse/BEAM-6316
>>>
>>> On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:
>>>
 A new check renders clearer message:

 Unable to find image '
 jenkins-docker-apache.bintray.io/beam/flink-job-server:latest'
 locally
 docker: Error response from daemon: unknown: Repo 'apache' was not
 found.
 See 'docker run --help'.
 ERROR:root:Starting job service with ['docker', 'run', '-v',
 u'/usr/bin/docker:/bin/docker', '-v',
 '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
 jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
 '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
 '39407', '--expansion-port', '43893']
 ERROR:root:Error bringing up job service



 On Mon, Sep 16, 2019 at 12:39 PM Ning Kang 
 wrote:

> To Ahmet, these are warnings, I'm not able to identify the errors
> yet.
>
> Thanks everyone! I'm watching the Jira now.
>
> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
> wrote:
>
>> Ning, if you're having trouble making sense of the preCommit
>> errors, you may be interested in this Jira:
>> https://issues.apache.org/jira/browse/BEAM-8213#
>>
>> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
>> wrote:
>>
>>> Python 2 isn't the reason the test is failing, that's just a
>>> warning. The actual error is at the very end of the log (it looks 
>>> familiar
>>> to me, though I don't see a JIRA for it):
>>>
>>> <_Rendezvous of RPC that terminated with:
>>> status = StatusCode.UNIMPLEMENTED
>>> details = "Method
>>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>>> unimplemented"
>>> debug_error_string =
>>> "{"created":"@1568424715.449291418","description":"Error received 
>>> from peer
>>> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
>>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>>> unimplemented","grpc_status":12}"
>>> >
>>>
>>> Kyle Weaver | Software Engineer | github.com/ibzib |
>>> kcwea...@google.com
>>>
>>>
>>> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang 
>>> wrote:
>>>
 Hi! I've been seeing some errors during "Python PreCommit".
 I'm seeing "UserWarning: You are using Apache Beam with Python

Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Hannah Jiang
The fix is merged. I tested with PRs that used to fail, and the failures
are fixed now.
Please rerun the test if your PR is affected.

On Mon, Sep 16, 2019 at 6:16 PM Mark Liu  wrote:

> Thanks for letting me know. I'll keep tracking on this issue since it's a
> release blocker. Please update here/jira if you have any progress.
>
> Mark
>
> On Mon, Sep 16, 2019 at 5:04 PM Hannah Jiang 
> wrote:
>
>> For issue with flink image, I re-opened a ticket which is currently
>> blocking release.(BEAM-8165)
>>
>> On Mon, Sep 16, 2019 at 5:00 PM Ahmet Altay  wrote:
>>
>>>
>>>
>>> On Mon, Sep 16, 2019 at 2:07 PM Kyle Weaver  wrote:
>>>
 The original issue ("GetJobMetrics is unimplemented") is still
 probably hiding under the docker issue, so I filed
 https://issues.apache.org/jira/browse/BEAM-8245 for it

>>>
>>> Thank you. Who should be assigned to this issue?
>>>
>>> If metrics are not supported in all runners, should we update tests to
>>> make this metrics checks optional?
>>>
>>>

 Kyle Weaver | Software Engineer | github.com/ibzib |
 kcwea...@google.com


 On Mon, Sep 16, 2019 at 1:14 PM Hannah Jiang 
 wrote:

> If we try to create a flink image from master branch, it will create
> *apachebeam/flink-job-server*, while the code is expecting 
> *jenkins-docker-apache.bintray.io/beam/flink-job-server:latest
> *
> .
> This should be introduced when we cut 2.16.0 branch, I will
> investigate more to see how to fix it.
>

>>> Do we need a release blocking JIRA for this?
>>>
>>> /cc +Mark Liu 
>>>
>>>

> On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:
>
>> I'm also being impacted by this on my PR[1]. I found BEAM-6316[2]
>> that has a similar error but it was resolved Dec 2018.
>>
>> 1: https://github.com/apache/beam/pull/9583
>> 2: https://issues.apache.org/jira/browse/BEAM-6316
>>
>> On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:
>>
>>> A new check renders clearer message:
>>>
>>> Unable to find image '
>>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest'
>>> locally
>>> docker: Error response from daemon: unknown: Repo 'apache' was not
>>> found.
>>> See 'docker run --help'.
>>> ERROR:root:Starting job service with ['docker', 'run', '-v',
>>> u'/usr/bin/docker:/bin/docker', '-v',
>>> '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
>>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
>>> '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
>>> '39407', '--expansion-port', '43893']
>>> ERROR:root:Error bringing up job service
>>>
>>>
>>>
>>> On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:
>>>
 To Ahmet, these are warnings, I'm not able to identify the errors
 yet.

 Thanks everyone! I'm watching the Jira now.

 On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
 wrote:

> Ning, if you're having trouble making sense of the preCommit
> errors, you may be interested in this Jira:
> https://issues.apache.org/jira/browse/BEAM-8213#
>
> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
> wrote:
>
>> Python 2 isn't the reason the test is failing, that's just a
>> warning. The actual error is at the very end of the log (it looks 
>> familiar
>> to me, though I don't see a JIRA for it):
>>
>> <_Rendezvous of RPC that terminated with:
>> status = StatusCode.UNIMPLEMENTED
>> details = "Method
>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>> unimplemented"
>> debug_error_string =
>> "{"created":"@1568424715.449291418","description":"Error received 
>> from peer
>> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>> unimplemented","grpc_status":12}"
>> >
>>
>> Kyle Weaver | Software Engineer | github.com/ibzib |
>> kcwea...@google.com
>>
>>
>> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang 
>> wrote:
>>
>>> Hi! I've been seeing some errors during "Python PreCommit".
>>> I'm seeing "UserWarning: You are using Apache Beam with Python
>>> 2. New releases of Apache Beam will soon support Python 3 only. 
>>> 'You are
>>> using Apache Beam with Python 2. '"
>>> Is there any plan to remove py2 tests from the pre-commit check
>>> once we stop supporting Python2?
>>> The scan link is:
>>> 

Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Mark Liu
Thanks for letting me know. I'll keep tracking this issue since it's a
release blocker. Please update here/Jira if you have any progress.

Mark

On Mon, Sep 16, 2019 at 5:04 PM Hannah Jiang  wrote:

> For issue with flink image, I re-opened a ticket which is currently
> blocking release.(BEAM-8165)
>
> On Mon, Sep 16, 2019 at 5:00 PM Ahmet Altay  wrote:
>
>>
>>
>> On Mon, Sep 16, 2019 at 2:07 PM Kyle Weaver  wrote:
>>
>>> The original issue ("GetJobMetrics is unimplemented") is still probably
>>> hiding under the docker issue, so I filed
>>> https://issues.apache.org/jira/browse/BEAM-8245 for it
>>>
>>
>> Thank you. Who should be assigned to this issue?
>>
>> If metrics are not supported in all runners, should we update tests to
>> make this metrics checks optional?
>>
>>
>>>
>>> Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com
>>>
>>>
>>> On Mon, Sep 16, 2019 at 1:14 PM Hannah Jiang 
>>> wrote:
>>>
 If we try to create a flink image from master branch, it will create
 *apachebeam/flink-job-server*, while the code is expecting 
 *jenkins-docker-apache.bintray.io/beam/flink-job-server:latest
 *
 .
 This should be introduced when we cut 2.16.0 branch, I will investigate
 more to see how to fix it.

>>>
>> Do we need a release blocking JIRA for this?
>>
>> /cc +Mark Liu 
>>
>>
>>>
 On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:

> I'm also being impacted by this on my PR[1]. I found BEAM-6316[2] that
> has a similar error but it was resolved Dec 2018.
>
> 1: https://github.com/apache/beam/pull/9583
> 2: https://issues.apache.org/jira/browse/BEAM-6316
>
> On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:
>
>> A new check renders clearer message:
>>
>> Unable to find image '
>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest'
>> locally
>> docker: Error response from daemon: unknown: Repo 'apache' was not
>> found.
>> See 'docker run --help'.
>> ERROR:root:Starting job service with ['docker', 'run', '-v',
>> u'/usr/bin/docker:/bin/docker', '-v',
>> '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
>> '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
>> '39407', '--expansion-port', '43893']
>> ERROR:root:Error bringing up job service
>>
>>
>>
>> On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:
>>
>>> To Ahmet, these are warnings, I'm not able to identify the errors
>>> yet.
>>>
>>> Thanks everyone! I'm watching the Jira now.
>>>
>>> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
>>> wrote:
>>>
 Ning, if you're having trouble making sense of the preCommit
 errors, you may be interested in this Jira:
 https://issues.apache.org/jira/browse/BEAM-8213#

 On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
 wrote:

> Python 2 isn't the reason the test is failing, that's just a
> warning. The actual error is at the very end of the log (it looks 
> familiar
> to me, though I don't see a JIRA for it):
>
> <_Rendezvous of RPC that terminated with:
> status = StatusCode.UNIMPLEMENTED
> details = "Method
> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
> unimplemented"
> debug_error_string =
> "{"created":"@1568424715.449291418","description":"Error received 
> from peer
> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
> unimplemented","grpc_status":12}"
> >
>
> Kyle Weaver | Software Engineer | github.com/ibzib |
> kcwea...@google.com
>
>
> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang 
> wrote:
>
>> Hi! I've been seeing some errors during "Python PreCommit".
>> I'm seeing "UserWarning: You are using Apache Beam with Python 2.
>> New releases of Apache Beam will soon support Python 3 only. 'You 
>> are using
>> Apache Beam with Python 2. '"
>> Is there any plan to remove py2 tests from the pre-commit check
>> once we stop supporting Python2?
>> The scan link is:
>> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>>
>> Thanks!
>>
>> Ning.
>>
>>


Re: The state of external transforms in Beam

2019-09-16 Thread Chamikara Jayalath
Thanks for the nice write-up, Chad.

On Mon, Sep 16, 2019 at 12:17 PM Robert Bradshaw 
wrote:

> Thanks for bringing this up again. My thoughts on the open questions below.
>
> On Mon, Sep 16, 2019 at 11:51 AM Chad Dombrova  wrote:
> > That commit solves 2 problems:
> >
> > Adds the pubsub Java deps so that they’re available in our portable
> pipeline
> > Makes the coder for the PubsubIO message-holder type, PubsubMessage,
> available as a standard coder. This is required because both PubsubIO.Read
> and PubsubIO.Write expand to ParDos which pass along these PubsubMessage
> objects, but only “standard” (i.e. portable) coders can be used, so we have
> to hack it to make PubsubMessage appear as a standard coder.
> >
> > More details:
> >
> > There’s a similar magic commit required for Kafka external transforms
> > The Jira issue for this problem is here:
> https://jira.apache.org/jira/browse/BEAM-7870
> > For problem #2 above there seems to be some consensus forming around
> using Avro or schema/row coders to send compound types in a portable way.
> Here’s the PR for making row coders portable
> > https://github.com/apache/beam/pull/9188
>
> +1. Note that this doesn't mean that the IO itself must produce rows;
> part of the Schema work in Java is to make it easy to automatically
> convert from various Java classes to schemas transparently, so this
> same logic that would allow one to apply an SQL filter directly to a
> Kafka/PubSub read would allow cross-language. Even if that doesn't
> work, we need not uglify the Java API; we can have an
> option/alternative transform that appends the convert-to-Row DoFn for
> easier use by external (though the goal of the former work is to make
> this step unnecessary).
>

Updating all IO connectors / transforms to have a version that
produces/consumes a PCollection is infeasible, so I agree that we need
an automatic conversion to/from PCollection, possibly by injecting
PTransforms during ExternalTransform expansion.

>
> > I don’t really have any ideas for problem #1
>
> The crux of the issue here is that the jobs API was not designed with
> cross-language in mind, and so the artifact API ties artifacts to jobs
> rather than to environments. To solve this we need to augment the
> notion of environment to allow the specification of additional
> dependencies (e.g. jar files in this specific case, or better as
> maven/pypi/... dependencies (with version ranges) such that
> environment merging and dependency resolution can be sanely done), and
> a way for the expansion service to provide such dependencies.
>
> Max wrote up a summary of the prior discussions at
>
> https://docs.google.com/document/d/1XaiNekAY2sptuQRIXpjGAyaYdSc-wlJ-VKjl04c8N48/edit#heading=h.900gc947qrw8
>
> In the short term, one can build a custom docker image that has all
> the requisite dependencies installed.
>
> This touches on a related but separable issue that one may want to run
> some of these transforms "natively" in the same process as the runner
> (e.g. a Java IO in the Flink Java Runner) rather than via docker.
> (Similarly with subprocess.) Exactly how that works with environment
> specifications is also a bit TBD, but my proposal has been that these
> are best viewed as runner-specific substitutions of standard
> environments.
>

We need a permanent solution for this, but for now we have a temporary
solution where additional jar files can be specified through an experiment
when running a Python pipeline:
https://github.com/apache/beam/blob/9678149872de2799ea1643f834f2bec88d346af8/sdks/python/apache_beam/io/external/xlang_parquetio_test.py#L55
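For reference, a minimal sketch of what that looks like from the Python side;
the experiment name and the jar path below are modeled on the linked test and
are assumptions to verify against your Beam version:

from apache_beam.options.pipeline_options import PipelineOptions

# Hedged sketch modeled on the linked xlang_parquetio_test.py: stage an extra
# jar needed by an external (Java) transform via an experiment. Verify the
# exact experiment name against the linked test; the jar path is a placeholder.
options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:8099',
    '--experiments=jar_packages=/path/to/extra-io-dependencies.jar',
])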

Thanks,
Cham


>
> > So the portability expansion system works, and now it’s time to sand off
> some of the rough corners. I’d love to hear others’ thoughts on how to
> resolve some of these remaining issues.
>
> +1
>
>
> On Mon, Sep 16, 2019 at 11:51 AM Chad Dombrova  wrote:
> >
> > Hi all,
> > There was some interest in this topic at the Beam Summit this week (btw,
> great job to everyone involved!), so I thought I’d try to summarize the
> current state of things.
> > First, let me explain the idea behind an external transforms for the
> uninitiated.
> >
> > Problem:
> >
> > there’s a transform that you want to use, but it’s not available in your
> desired language. IO connectors are a good example: there are many
> available in the Java SDK, but not so much in Python or Go.
> >
> > Solution:
> >
> > Create a stub transform in your desired language (e.g. Python) whose
> primary role is to serialize the parameters passed to that transform
> > When you run your portable pipeline, just prior to it being sent to the
> Job Service for execution, your stub transform’s payload is first sent to
> the “Expansion Service” that’s running in the native language (Java), where
> the payload is used to construct an instance of the native transform, which
> is then expanded and converted to a protobuf and sent back to the calling
> process (Python).
> > The protobuf 

Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Hannah Jiang
For the issue with the Flink image, I re-opened a ticket (BEAM-8165), which
is currently blocking the release.

On Mon, Sep 16, 2019 at 5:00 PM Ahmet Altay  wrote:

>
>
> On Mon, Sep 16, 2019 at 2:07 PM Kyle Weaver  wrote:
>
>> The original issue ("GetJobMetrics is unimplemented") is still probably
>> hiding under the docker issue, so I filed
>> https://issues.apache.org/jira/browse/BEAM-8245 for it
>>
>
> Thank you. Who should be assigned to this issue?
>
> If metrics are not supported in all runners, should we update tests to
> make this metrics checks optional?
>
>
>>
>> Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com
>>
>>
>> On Mon, Sep 16, 2019 at 1:14 PM Hannah Jiang 
>> wrote:
>>
>>> If we try to create a flink image from master branch, it will create
>>> *apachebeam/flink-job-server*, while the code is expecting 
>>> *jenkins-docker-apache.bintray.io/beam/flink-job-server:latest
>>> *.
>>> This should be introduced when we cut 2.16.0 branch, I will investigate
>>> more to see how to fix it.
>>>
>>
> Do we need a release blocking JIRA for this?
>
> /cc +Mark Liu 
>
>
>>
>>> On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:
>>>
 I'm also being impacted by this on my PR[1]. I found BEAM-6316[2] that
 has a similar error but it was resolved Dec 2018.

 1: https://github.com/apache/beam/pull/9583
 2: https://issues.apache.org/jira/browse/BEAM-6316

 On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:

> A new check renders clearer message:
>
> Unable to find image '
> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
> docker: Error response from daemon: unknown: Repo 'apache' was not
> found.
> See 'docker run --help'.
> ERROR:root:Starting job service with ['docker', 'run', '-v',
> u'/usr/bin/docker:/bin/docker', '-v',
> '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
> '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
> '39407', '--expansion-port', '43893']
> ERROR:root:Error bringing up job service
>
>
>
> On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:
>
>> To Ahmet, these are warnings, I'm not able to identify the errors yet.
>>
>> Thanks everyone! I'm watching the Jira now.
>>
>> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
>> wrote:
>>
>>> Ning, if you're having trouble making sense of the preCommit errors,
>>> you may be interested in this Jira:
>>> https://issues.apache.org/jira/browse/BEAM-8213#
>>>
>>> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
>>> wrote:
>>>
 Python 2 isn't the reason the test is failing, that's just a
 warning. The actual error is at the very end of the log (it looks 
 familiar
 to me, though I don't see a JIRA for it):

 <_Rendezvous of RPC that terminated with:
 status = StatusCode.UNIMPLEMENTED
 details = "Method
 org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
 unimplemented"
 debug_error_string =
 "{"created":"@1568424715.449291418","description":"Error received from 
 peer
 ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
 org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
 unimplemented","grpc_status":12}"
 >

 Kyle Weaver | Software Engineer | github.com/ibzib |
 kcwea...@google.com


 On Mon, Sep 16, 2019 at 11:34 AM Ning Kang 
 wrote:

> Hi! I've been seeing some errors during "Python PreCommit".
> I'm seeing "UserWarning: You are using Apache Beam with Python 2.
> New releases of Apache Beam will soon support Python 3 only. 'You are 
> using
> Apache Beam with Python 2. '"
> Is there any plan to remove py2 tests from the pre-commit check
> once we stop supporting Python2?
> The scan link is:
> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>
> Thanks!
>
> Ning.
>
>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Ahmet Altay
On Mon, Sep 16, 2019 at 2:07 PM Kyle Weaver  wrote:

> The original issue ("GetJobMetrics is unimplemented") is still probably
> hiding under the docker issue, so I filed
> https://issues.apache.org/jira/browse/BEAM-8245 for it
>

Thank you. Who should be assigned to this issue?

If metrics are not supported in all runners, should we update the tests to
make these metrics checks optional?
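As a rough illustration of what "optional" could look like (not existing Beam
test code; the helper name is hypothetical, and it assumes the failure
surfaces as a grpc.RpcError like the one in the log below):

import grpc

# Hypothetical helper: run the metrics assertions only when the runner's
# JobService implements GetJobMetrics; treat UNIMPLEMENTED as "no metrics".
def check_metrics_if_supported(pipeline_result, assert_fn):
  try:
    results = pipeline_result.metrics().query()
  except grpc.RpcError as e:
    if e.code() == grpc.StatusCode.UNIMPLEMENTED:
      return  # runner has no GetJobMetrics support; skip the check
    raise
  assert_fn(results)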


>
> Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com
>
>
> On Mon, Sep 16, 2019 at 1:14 PM Hannah Jiang 
> wrote:
>
>> If we try to create a flink image from master branch, it will create
>> *apachebeam/flink-job-server*, while the code is expecting 
>> *jenkins-docker-apache.bintray.io/beam/flink-job-server:latest
>> *.
>> This should be introduced when we cut 2.16.0 branch, I will investigate
>> more to see how to fix it.
>>
>
Do we need a release blocking JIRA for this?

/cc +Mark Liu 


>
>> On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:
>>
>>> I'm also being impacted by this on my PR[1]. I found BEAM-6316[2] that
>>> has a similar error but it was resolved Dec 2018.
>>>
>>> 1: https://github.com/apache/beam/pull/9583
>>> 2: https://issues.apache.org/jira/browse/BEAM-6316
>>>
>>> On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:
>>>
 A new check renders clearer message:

 Unable to find image '
 jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
 docker: Error response from daemon: unknown: Repo 'apache' was not
 found.
 See 'docker run --help'.
 ERROR:root:Starting job service with ['docker', 'run', '-v',
 u'/usr/bin/docker:/bin/docker', '-v',
 '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
 jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
 '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
 '39407', '--expansion-port', '43893']
 ERROR:root:Error bringing up job service



 On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:

> To Ahmet, these are warnings, I'm not able to identify the errors yet.
>
> Thanks everyone! I'm watching the Jira now.
>
> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
> wrote:
>
>> Ning, if you're having trouble making sense of the preCommit errors,
>> you may be interested in this Jira:
>> https://issues.apache.org/jira/browse/BEAM-8213#
>>
>> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
>> wrote:
>>
>>> Python 2 isn't the reason the test is failing, that's just a
>>> warning. The actual error is at the very end of the log (it looks 
>>> familiar
>>> to me, though I don't see a JIRA for it):
>>>
>>> <_Rendezvous of RPC that terminated with:
>>> status = StatusCode.UNIMPLEMENTED
>>> details = "Method
>>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>>> unimplemented"
>>> debug_error_string =
>>> "{"created":"@1568424715.449291418","description":"Error received from 
>>> peer
>>> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
>>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>>> unimplemented","grpc_status":12}"
>>> >
>>>
>>> Kyle Weaver | Software Engineer | github.com/ibzib |
>>> kcwea...@google.com
>>>
>>>
>>> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang 
>>> wrote:
>>>
 Hi! I've been seeing some errors during "Python PreCommit".
 I'm seeing "UserWarning: You are using Apache Beam with Python 2.
 New releases of Apache Beam will soon support Python 3 only. 'You are 
 using
 Apache Beam with Python 2. '"
 Is there any plan to remove py2 tests from the pre-commit check
 once we stop supporting Python2?
 The scan link is:
 https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch

 Thanks!

 Ning.




Re: MQTT to Python SDK

2019-09-16 Thread Brian Hulette
If you go down the cross-language route you may find Chad Dombrova's
experience developing a cross-language PubSubIO helpful. It's being
discussed in another thread:
https://lists.apache.org/thread.html/6e2e3b8c2becdf22303ed231ebeda73550a3ce9acbf2f73ccf1982f2@%3Cdev.beam.apache.org%3E

On Mon, Sep 16, 2019 at 1:07 PM Chamikara Jayalath 
wrote:

> Regarding cross-language transforms support, documentation is in flux at
> the moment since API is still being updated and runner support is in
> development.
>
> If you want to try out, I'd say cross-language wordcount example is a good
> starting point:
> https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/wordcount_xlang.py
>
> Currently, we have basic support for Flink runner and support for Dataflow
> is in development.
>
> Another possible avenue is to add a MQTT unbounded source directly to
> Python SDK once we have Splittable DoFn source framework (which is also in
> development) if someone has cycles for that.
>
> Thanks,
> Cham
>
>
>
> On Mon, Sep 16, 2019 at 12:12 PM Jean-Baptiste Onofré 
> wrote:
>
>> Regarding Java SDK, you have MqttIO available.
>>
>> Regards
>> JB
>>
>> On 16/09/2019 21:07, Lucas Magalhães wrote:
>> > Thanks Altay.. Do you know where I could find more about cross language
>> > transforms? Documentation and examples as well.
>> >
>> > thanks again
>> >
>> > On Mon, Sep 16, 2019 at 4:00 PM Ahmet Altay > > > wrote:
>> >
>> > A framework for python sdk to use a native unbounded connector does
>> > not exist yet. You might be able to use the same connector from Java
>> > using cross language transforms.
>> >
>> > /cc +Chamikara Jayalath 
>> >
>> > On Mon, Sep 16, 2019 at 11:00 AM Lucas Magalhães
>> > > > > wrote:
>> >
>> > Hello dears!
>> >
>> > I'm starding a new project here and the mainly source is a MQTT.
>> >
>> > I could´n find any documentantion about to How to develeop a
>> > unbounded connector.
>> >
>> > Could anyone send me some instructions or guide line?
>> >
>> > Thanks a lot
>> >
>> > --
>> > Lucas Magalhães,
>> > CTO
>> >
>> > Paralelo CS - Consultoria e Serviços
>> > Tel: +55 (11) 3090-5557 <+55%2011%203090-5557>
>> 
>> > Cel: +55 (11) 99420-4667 <+55%2011%2099420-4667>
>> 
>> > lucas.magalh...@paralelocs.com.br
>> > 
>> >
>> > www.paralelocs.com.br
>> > 
>> >
>> >
>> >
>> > --
>> > Lucas Magalhães,
>> > CTO
>> >
>> > Paralelo CS - Consultoria e Serviços
>> > Tel: +55 (11) 3090-5557 <+55%2011%203090-5557>
>> > Cel: +55 (11) 99420-4667 <+55%2011%2099420-4667>
>> > lucas.magalh...@paralelocs.com.br > lucas.magalh...@paralelocs.com.br>
>> >
>> > www.paralelocs.com.br
>> > 
>>
>> --
>> Jean-Baptiste Onofré
>> jbono...@apache.org
>> http://blog.nanthrax.net
>> Talend - http://www.talend.com
>>
>


Re: Next LTS?

2019-09-16 Thread Valentyn Tymofieiev
I support nominating 2.16.0 as the LTS release since it has robust Python 3
support compared with prior releases, and also because of the pending
Python 2 deprecation. This has been discussed before [1]. As Robert pointed
out in that thread, LTS nomination in Beam is currently retroactive. If we
keep the retroactive policy, the question is how long we should wait for a
release to be considered "safe" for nomination. Looks like in the case of
2.7.0 we waited a month, see [2,3].

Thanks,
Valentyn

[1]
https://lists.apache.org/thread.html/eba6caa58ea79a7ecbc8560d1c680a366b44c531d96ce5c699d41535@%3Cdev.beam.apache.org%3E
[2] https://beam.apache.org/blog/2018/10/03/beam-2.7.0.html
[3]
https://lists.apache.org/thread.html/896cbc9fef2e60f19b466d6b1e12ce1aeda49ce5065a0b1156233f01@%3Cdev.beam.apache.org%3E

On Mon, Sep 16, 2019 at 2:46 PM Austin Bennett 
wrote:

> Hi All,
>
> According to our policies page [1]: "There will be at least one new LTS
> release in a 12 month period, and LTS releases are considered deprecated
> after 12 months"
>
> The last LTS was released 2018-10-02 [2].
>
> Does that mean the next release (2.16) should be the next LTS?  It looks
> like we are in danger of not living up to that promise.
>
> Cheers,
> Austin
>
>
>
> [1] https://beam.apache.org/community/policies/
>
> [2]  https://beam.apache.org/get-started/downloads/
>


Next LTS?

2019-09-16 Thread Austin Bennett
Hi All,

According to our policies page [1]: "There will be at least one new LTS
release in a 12 month period, and LTS releases are considered deprecated
after 12 months"

The last LTS was released 2018-10-02 [2].

Does that mean the next release (2.16) should be the next LTS?  It looks
like we are in danger of not living up to that promise.

Cheers,
Austin



[1] https://beam.apache.org/community/policies/

[2]  https://beam.apache.org/get-started/downloads/


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Kyle Weaver
The original issue ("GetJobMetrics is unimplemented") is still probably
hiding under the docker issue, so I filed
https://issues.apache.org/jira/browse/BEAM-8245 for it

Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com


On Mon, Sep 16, 2019 at 1:14 PM Hannah Jiang  wrote:

> If we try to create a flink image from master branch, it will create
> *apachebeam/flink-job-server*, while the code is expecting 
> *jenkins-docker-apache.bintray.io/beam/flink-job-server:latest
> *.
> This should be introduced when we cut 2.16.0 branch, I will investigate
> more to see how to fix it.
>
> On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:
>
>> I'm also being impacted by this on my PR[1]. I found BEAM-6316[2] that
>> has a similar error but it was resolved Dec 2018.
>>
>> 1: https://github.com/apache/beam/pull/9583
>> 2: https://issues.apache.org/jira/browse/BEAM-6316
>>
>> On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:
>>
>>> A new check renders clearer message:
>>>
>>> Unable to find image '
>>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
>>> docker: Error response from daemon: unknown: Repo 'apache' was not found.
>>> See 'docker run --help'.
>>> ERROR:root:Starting job service with ['docker', 'run', '-v',
>>> u'/usr/bin/docker:/bin/docker', '-v',
>>> '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
>>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
>>> '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
>>> '39407', '--expansion-port', '43893']
>>> ERROR:root:Error bringing up job service
>>>
>>>
>>>
>>> On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:
>>>
 To Ahmet, these are warnings, I'm not able to identify the errors yet.

 Thanks everyone! I'm watching the Jira now.

 On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
 wrote:

> Ning, if you're having trouble making sense of the preCommit errors,
> you may be interested in this Jira:
> https://issues.apache.org/jira/browse/BEAM-8213#
>
> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
> wrote:
>
>> Python 2 isn't the reason the test is failing, that's just a warning.
>> The actual error is at the very end of the log (it looks familiar to me,
>> though I don't see a JIRA for it):
>>
>> <_Rendezvous of RPC that terminated with:
>> status = StatusCode.UNIMPLEMENTED
>> details = "Method
>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>> unimplemented"
>> debug_error_string =
>> "{"created":"@1568424715.449291418","description":"Error received from 
>> peer
>> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>> unimplemented","grpc_status":12}"
>> >
>>
>> Kyle Weaver | Software Engineer | github.com/ibzib |
>> kcwea...@google.com
>>
>>
>> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang 
>> wrote:
>>
>>> Hi! I've been seeing some errors during "Python PreCommit".
>>> I'm seeing "UserWarning: You are using Apache Beam with Python 2.
>>> New releases of Apache Beam will soon support Python 3 only. 'You are 
>>> using
>>> Apache Beam with Python 2. '"
>>> Is there any plan to remove py2 tests from the pre-commit check once
>>> we stop supporting Python2?
>>> The scan link is:
>>> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>>>
>>> Thanks!
>>>
>>> Ning.
>>>
>>>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Hannah Jiang
If we try to create a Flink image from the master branch, it will create
*apachebeam/flink-job-server*, while the code is expecting
*jenkins-docker-apache.bintray.io/beam/flink-job-server:latest*.
This was likely introduced when we cut the 2.16.0 branch; I will investigate
more to see how to fix it.
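In the meantime, a possible local workaround (an assumption on my part, not
something verified in this thread) is to skip the docker-based job server and
point the pipeline at a Flink job server you start yourself:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hedged sketch: run against a locally started Flink job server instead of
# letting the SDK pull the flink-job-server docker image. The port is an
# assumption; use whatever your local job server listens on.
options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:8099',
])

with beam.Pipeline(options=options) as p:
  (p
   | beam.Create(['to', 'be', 'or', 'not', 'to', 'be'])
   | beam.combiners.Count.PerElement()
   | beam.Map(print))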

On Mon, Sep 16, 2019 at 1:00 PM Lukasz Cwik  wrote:

> I'm also being impacted by this on my PR[1]. I found BEAM-6316[2] that has
> a similar error but it was resolved Dec 2018.
>
> 1: https://github.com/apache/beam/pull/9583
> 2: https://issues.apache.org/jira/browse/BEAM-6316
>
> On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:
>
>> A new check renders clearer message:
>>
>> Unable to find image '
>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
>> docker: Error response from daemon: unknown: Repo 'apache' was not found.
>> See 'docker run --help'.
>> ERROR:root:Starting job service with ['docker', 'run', '-v',
>> u'/usr/bin/docker:/bin/docker', '-v',
>> '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
>> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
>> '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
>> '39407', '--expansion-port', '43893']
>> ERROR:root:Error bringing up job service
>>
>>
>>
>> On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:
>>
>>> To Ahmet, these are warnings, I'm not able to identify the errors yet.
>>>
>>> Thanks everyone! I'm watching the Jira now.
>>>
>>> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova 
>>> wrote:
>>>
 Ning, if you're having trouble making sense of the preCommit errors,
 you may be interested in this Jira:
 https://issues.apache.org/jira/browse/BEAM-8213#

 On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
 wrote:

> Python 2 isn't the reason the test is failing, that's just a warning.
> The actual error is at the very end of the log (it looks familiar to me,
> though I don't see a JIRA for it):
>
> <_Rendezvous of RPC that terminated with:
> status = StatusCode.UNIMPLEMENTED
> details = "Method
> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
> unimplemented"
> debug_error_string =
> "{"created":"@1568424715.449291418","description":"Error received from 
> peer
> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
> unimplemented","grpc_status":12}"
> >
>
> Kyle Weaver | Software Engineer | github.com/ibzib |
> kcwea...@google.com
>
>
> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:
>
>> Hi! I've been seeing some errors during "Python PreCommit".
>> I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
>> releases of Apache Beam will soon support Python 3 only. 'You are using
>> Apache Beam with Python 2. '"
>> Is there any plan to remove py2 tests from the pre-commit check once
>> we stop supporting Python2?
>> The scan link is:
>> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>>
>> Thanks!
>>
>> Ning.
>>
>>


Re: MQTT to Python SDK

2019-09-16 Thread Chamikara Jayalath
Regarding cross-language transform support, documentation is in flux at
the moment since the API is still being updated and runner support is in
development.

If you want to try out, I'd say cross-language wordcount example is a good
starting point:
https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/wordcount_xlang.py

Currently, we have basic support for Flink runner and support for Dataflow
is in development.

Another possible avenue is to add an MQTT unbounded source directly to the
Python SDK once we have the Splittable DoFn source framework (which is also
in development), if someone has cycles for that.
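For anyone curious what the Python side of such a cross-language transform
looks like, here is a rough sketch of the pattern only; the URN, configuration
keys, and expansion service address are placeholders, and no MQTT wrapper
exists today:

import apache_beam as beam
from apache_beam.transforms.external import (ExternalTransform,
                                             ImplicitSchemaPayloadBuilder)

# Hypothetical sketch of the cross-language pattern; nothing here is an
# existing MQTT API. A Java expansion service must already be running at the
# given address and understand the given URN.
with beam.Pipeline() as p:
  messages = (
      p
      | 'ReadViaJava' >> ExternalTransform(
          'beam:external:java:mqtt:read:v1',                   # placeholder URN
          ImplicitSchemaPayloadBuilder({'topic': 'sensors'}),  # placeholder config
          'localhost:8097'))                                   # expansion service address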

Thanks,
Cham



On Mon, Sep 16, 2019 at 12:12 PM Jean-Baptiste Onofré 
wrote:

> Regarding Java SDK, you have MqttIO available.
>
> Regards
> JB
>
> On 16/09/2019 21:07, Lucas Magalhães wrote:
> > Thanks Altay.. Do you know where I could find more about cross language
> > transforms? Documentation and examples as well.
> >
> > thanks again
> >
> > On Mon, Sep 16, 2019 at 4:00 PM Ahmet Altay  > > wrote:
> >
> > A framework for python sdk to use a native unbounded connector does
> > not exist yet. You might be able to use the same connector from Java
> > using cross language transforms.
> >
> > /cc +Chamikara Jayalath 
> >
> > On Mon, Sep 16, 2019 at 11:00 AM Lucas Magalhães
> >  > > wrote:
> >
> > Hello dears!
> >
> > I'm starding a new project here and the mainly source is a MQTT.
> >
> > I could´n find any documentantion about to How to develeop a
> > unbounded connector.
> >
> > Could anyone send me some instructions or guide line?
> >
> > Thanks a lot
> >
> > --
> > Lucas Magalhães,
> > CTO
> >
> > Paralelo CS - Consultoria e Serviços
> > Tel: +55 (11) 3090-5557 <+55%2011%203090-5557>
> 
> > Cel: +55 (11) 99420-4667 <+55%2011%2099420-4667>
> 
> > lucas.magalh...@paralelocs.com.br
> > 
> >
> > www.paralelocs.com.br
> > 
> >
> >
> >
> > --
> > Lucas Magalhães,
> > CTO
> >
> > Paralelo CS - Consultoria e Serviços
> > Tel: +55 (11) 3090-5557 <+55%2011%203090-5557>
> > Cel: +55 (11) 99420-4667 <+55%2011%2099420-4667>
> > lucas.magalh...@paralelocs.com.br  lucas.magalh...@paralelocs.com.br>
> >
> > www.paralelocs.com.br
> > 
>
> --
> Jean-Baptiste Onofré
> jbono...@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Lukasz Cwik
I'm also being impacted by this on my PR[1]. I found BEAM-6316[2] that has
a similar error but it was resolved Dec 2018.

1: https://github.com/apache/beam/pull/9583
2: https://issues.apache.org/jira/browse/BEAM-6316

On Mon, Sep 16, 2019 at 12:43 PM Ning Kang  wrote:

> A new check renders clearer message:
>
> Unable to find image '
> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
> docker: Error response from daemon: unknown: Repo 'apache' was not found.
> See 'docker run --help'.
> ERROR:root:Starting job service with ['docker', 'run', '-v',
> u'/usr/bin/docker:/bin/docker', '-v',
> '/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
> jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
> '--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
> '39407', '--expansion-port', '43893']
> ERROR:root:Error bringing up job service
>
>
>
> On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:
>
>> To Ahmet, these are warnings, I'm not able to identify the errors yet.
>>
>> Thanks everyone! I'm watching the Jira now.
>>
>> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova  wrote:
>>
>>> Ning, if you're having trouble making sense of the preCommit errors, you
>>> may be interested in this Jira:
>>> https://issues.apache.org/jira/browse/BEAM-8213#
>>>
>>> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver 
>>> wrote:
>>>
 Python 2 isn't the reason the test is failing, that's just a warning.
 The actual error is at the very end of the log (it looks familiar to me,
 though I don't see a JIRA for it):

 <_Rendezvous of RPC that terminated with:
 status = StatusCode.UNIMPLEMENTED
 details = "Method
 org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
 unimplemented"
 debug_error_string =
 "{"created":"@1568424715.449291418","description":"Error received from peer
 ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
 org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
 unimplemented","grpc_status":12}"
 >

 Kyle Weaver | Software Engineer | github.com/ibzib |
 kcwea...@google.com


 On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:

> Hi! I've been seeing some errors during "Python PreCommit".
> I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
> releases of Apache Beam will soon support Python 3 only. 'You are using
> Apache Beam with Python 2. '"
> Is there any plan to remove py2 tests from the pre-commit check once
> we stop supporting Python2?
> The scan link is:
> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>
> Thanks!
>
> Ning.
>
>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Ning Kang
A new check renders a clearer message:

Unable to find image '
jenkins-docker-apache.bintray.io/beam/flink-job-server:latest' locally
docker: Error response from daemon: unknown: Repo 'apache' was not found.
See 'docker run --help'.
ERROR:root:Starting job service with ['docker', 'run', '-v',
u'/usr/bin/docker:/bin/docker', '-v',
'/var/run/docker.sock:/var/run/docker.sock', '--network=host', '
jenkins-docker-apache.bintray.io/beam/flink-job-server:latest',
'--job-host', 'localhost', '--job-port', '45687', '--artifact-port',
'39407', '--expansion-port', '43893']
ERROR:root:Error bringing up job service



On Mon, Sep 16, 2019 at 12:39 PM Ning Kang  wrote:

> To Ahmet, these are warnings, I'm not able to identify the errors yet.
>
> Thanks everyone! I'm watching the Jira now.
>
> On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova  wrote:
>
>> Ning, if you're having trouble making sense of the preCommit errors, you
>> may be interested in this Jira:
>> https://issues.apache.org/jira/browse/BEAM-8213#
>>
>> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver  wrote:
>>
>>> Python 2 isn't the reason the test is failing, that's just a warning.
>>> The actual error is at the very end of the log (it looks familiar to me,
>>> though I don't see a JIRA for it):
>>>
>>> <_Rendezvous of RPC that terminated with:
>>> status = StatusCode.UNIMPLEMENTED
>>> details = "Method
>>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>>> unimplemented"
>>> debug_error_string =
>>> "{"created":"@1568424715.449291418","description":"Error received from peer
>>> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
>>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>>> unimplemented","grpc_status":12}"
>>> >
>>>
>>> Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com
>>>
>>>
>>> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:
>>>
 Hi! I've been seeing some errors during "Python PreCommit".
 I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
 releases of Apache Beam will soon support Python 3 only. 'You are using
 Apache Beam with Python 2. '"
 Is there any plan to remove py2 tests from the pre-commit check once we
 stop supporting Python2?
 The scan link is:
 https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch

 Thanks!

 Ning.




Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Ning Kang
To Ahmet: these are warnings; I'm not able to identify the errors yet.

Thanks everyone! I'm watching the Jira now.

On Mon, Sep 16, 2019 at 12:07 PM Chad Dombrova  wrote:

> Ning, if you're having trouble making sense of the preCommit errors, you
> may be interested in this Jira:
> https://issues.apache.org/jira/browse/BEAM-8213#
>
> On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver  wrote:
>
>> Python 2 isn't the reason the test is failing, that's just a warning. The
>> actual error is at the very end of the log (it looks familiar to me, though
>> I don't see a JIRA for it):
>>
>> <_Rendezvous of RPC that terminated with:
>> status = StatusCode.UNIMPLEMENTED
>> details = "Method
>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>> unimplemented"
>> debug_error_string =
>> "{"created":"@1568424715.449291418","description":"Error received from peer
>> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
>> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
>> unimplemented","grpc_status":12}"
>> >
>>
>> Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com
>>
>>
>> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:
>>
>>> Hi! I've been seeing some errors during "Python PreCommit".
>>> I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
>>> releases of Apache Beam will soon support Python 3 only. 'You are using
>>> Apache Beam with Python 2. '"
>>> Is there any plan to remove py2 tests from the pre-commit check once we
>>> stop supporting Python2?
>>> The scan link is:
>>> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>>>
>>> Thanks!
>>>
>>> Ning.
>>>
>>>


Re: The state of external transforms in Beam

2019-09-16 Thread Robert Bradshaw
Thanks for bringing this up again. My thoughts on the open questions below.

On Mon, Sep 16, 2019 at 11:51 AM Chad Dombrova  wrote:
> That commit solves 2 problems:
>
> Adds the pubsub Java deps so that they’re available in our portable pipeline
> Makes the coder for the PubsubIO message-holder type, PubsubMessage, 
> available as a standard coder. This is required because both PubsubIO.Read 
> and PubsubIO.Write expand to ParDos which pass along these PubsubMessage 
> objects, but only “standard” (i.e. portable) coders can be used, so we have 
> to hack it to make PubsubMessage appear as a standard coder.
>
> More details:
>
> There’s a similar magic commit required for Kafka external transforms
> The Jira issue for this problem is here: 
> https://jira.apache.org/jira/browse/BEAM-7870
> For problem #2 above there seems to be some consensus forming around using 
> Avro or schema/row coders to send compound types in a portable way. Here’s 
> the PR for making row coders portable
> https://github.com/apache/beam/pull/9188

+1. Note that this doesn't mean that the IO itself must produce rows;
part of the Schema work in Java is to make it easy to automatically
convert from various Java classes to schemas transparently, so this
same logic that would allow one to apply an SQL filter directly to a
Kafka/PubSub read would allow cross-language. Even if that doesn't
work, we need not uglify the Java API; we can have an
option/alternative transform that appends the convert-to-Row DoFn for
easier use by external (though the goal of the former work is to make
this step unnecessary).

> I don’t really have any ideas for problem #1

The crux of the issue here is that the jobs API was not designed with
cross-language in mind, and so the artifact API ties artifacts to jobs
rather than to environments. To solve this we need to augment the
notion of environment to allow the specification of additional
dependencies (e.g. jar files in this specific case, or better as
maven/pypi/... dependencies (with version ranges) such that
environment merging and dependency resolution can be sanely done), and
a way for the expansion service to provide such dependencies.

Max wrote up a summary of the prior discussions at
https://docs.google.com/document/d/1XaiNekAY2sptuQRIXpjGAyaYdSc-wlJ-VKjl04c8N48/edit#heading=h.900gc947qrw8

In the short term, one can build a custom docker image that has all
the requisite dependencies installed.

This touches on a related but separable issue that one may want to run
some of these transforms "natively" in the same process as the runner
(e.g. a Java IO in the Flink Java Runner) rather than via docker.
(Similarly with subprocess.) Exactly how that works with environment
specifications is also a bit TBD, but my proposal has been that these
are best viewed as runner-specific substitutions of standard
environments.

> So the portability expansion system works, and now it’s time to sand off some 
> of the rough corners. I’d love to hear others’ thoughts on how to resolve 
> some of these remaining issues.

+1


On Mon, Sep 16, 2019 at 11:51 AM Chad Dombrova  wrote:
>
> Hi all,
> There was some interest in this topic at the Beam Summit this week (btw, 
> great job to everyone involved!), so I thought I’d try to summarize the 
> current state of things.
> First, let me explain the idea behind an external transforms for the 
> uninitiated.
>
> Problem:
>
> there’s a transform that you want to use, but it’s not available in your 
> desired language. IO connectors are a good example: there are many available 
> in the Java SDK, but not so much in Python or Go.
>
> Solution:
>
> Create a stub transform in your desired language (e.g. Python) whose primary 
> role is to serialize the parameters passed to that transform
> When you run your portable pipeline, just prior to it being sent to the Job 
> Service for execution, your stub transform’s payload is first sent to the 
> “Expansion Service” that’s running in the native language (Java), where the 
> payload is used to construct an instance of the native transform, which is 
> then expanded and converted to a protobuf and sent back to the calling 
> process (Python).
> The protobuf representation of the expanded transform gets integrated back 
> into the pipeline that you’re submitting
> Steps 2-3 are repeated for each external transform in your pipeline
> Then the whole pipeline gets sent to the Job Service to be invoked on 
> Flink/Spark/etc
>
> 
>
> Now on to my journey to get PubsubIO working in python on Flink.
>
> The first issue I encountered was that there was a lot of boilerplate 
> involved in serializing the stub python transform’s parameters so they can be 
> sent to the expansion service.
>
> I created a PR to make this simpler, which has just been merged to master: 
> https://github.com/apache/beam/pull/9098
>
> With this feature in place, if you’re using python 3.7 you can use a 
> dataclass and the typing module to 

Re: MQTT to Python SDK

2019-09-16 Thread Jean-Baptiste Onofré
Regarding Java SDK, you have MqttIO available.

Regards
JB

On 16/09/2019 21:07, Lucas Magalhães wrote:
> Thanks Altay.. Do you know where I could find more about cross language
> transforms? Documentation and examples as well.
> 
> thanks again
> 
> On Mon, Sep 16, 2019 at 4:00 PM Ahmet Altay  > wrote:
> 
> A framework for python sdk to use a native unbounded connector does
> not exist yet. You might be able to use the same connector from Java
> using cross language transforms.
> 
> /cc +Chamikara Jayalath   
> 
> On Mon, Sep 16, 2019 at 11:00 AM Lucas Magalhães
>  > wrote:
> 
> Hello dears!
> 
> I'm starding a new project here and the mainly source is a MQTT.
> 
> I could´n find any documentantion about to How to develeop a
> unbounded connector.
> 
> Could anyone send me some instructions or guide line?
> 
> Thanks a lot
> 
> -- 
> Lucas Magalhães,
> CTO
> 
> Paralelo CS - Consultoria e Serviços
> Tel: +55 (11) 3090-5557 
> Cel: +55 (11) 99420-4667 
> lucas.magalh...@paralelocs.com.br
> 
> 
> www.paralelocs.com.br
> 
> 
> 
> 
> -- 
> Lucas Magalhães,
> CTO
> 
> Paralelo CS - Consultoria e Serviços
> Tel: +55 (11) 3090-5557
> Cel: +55 (11) 99420-4667
> lucas.magalh...@paralelocs.com.br 
> 
> www.paralelocs.com.br
> 

-- 
Jean-Baptiste Onofré
jbono...@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com


Re: MQTT to Python SDK

2019-09-16 Thread Lucas Magalhães
Thanks, Altay. Do you know where I could find more about cross-language
transforms? Documentation and examples as well.

thanks again

On Mon, Sep 16, 2019 at 4:00 PM Ahmet Altay  wrote:

> A framework for python sdk to use a native unbounded connector does not
> exist yet. You might be able to use the same connector from Java using
> cross language transforms.
>
> /cc +Chamikara Jayalath 
>
> On Mon, Sep 16, 2019 at 11:00 AM Lucas Magalhães <
> lucas.magalh...@paralelocs.com.br> wrote:
>
>> Hello dears!
>>
>> I'm starding a new project here and the mainly source is a MQTT.
>>
>> I could´n find any documentantion about to How to develeop a unbounded
>> connector.
>>
>> Could anyone send me some instructions or guide line?
>>
>> Thanks a lot
>>
>> --
>> Lucas Magalhães,
>> CTO
>>
>> Paralelo CS - Consultoria e Serviços
>> Tel: +55 (11) 3090-5557 <+55%2011%203090-5557>
>> Cel: +55 (11) 99420-4667 <+55%2011%2099420-4667>
>> lucas.magalh...@paralelocs.com.br
>>
>> www.paralelocs.com.br
>>
>

-- 
Lucas Magalhães,
CTO

Paralelo CS - Consultoria e Serviços
Tel: +55 (11) 3090-5557
Cel: +55 (11) 99420-4667
lucas.magalh...@paralelocs.com.br

www.paralelocs.com.br


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Chad Dombrova
Ning, if you're having trouble making sense of the preCommit errors, you
may be interested in this Jira:
https://issues.apache.org/jira/browse/BEAM-8213#

On Mon, Sep 16, 2019 at 12:02 PM Kyle Weaver  wrote:

> Python 2 isn't the reason the test is failing, that's just a warning. The
> actual error is at the very end of the log (it looks familiar to me, though
> I don't see a JIRA for it):
>
> <_Rendezvous of RPC that terminated with:
> status = StatusCode.UNIMPLEMENTED
> details = "Method
> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
> unimplemented"
> debug_error_string =
> "{"created":"@1568424715.449291418","description":"Error received from peer
> ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
> org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
> unimplemented","grpc_status":12}"
> >
>
> Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com
>
>
> On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:
>
>> Hi! I've been seeing some errors during "Python PreCommit".
>> I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
>> releases of Apache Beam will soon support Python 3 only. 'You are using
>> Apache Beam with Python 2. '"
>> Is there any plan to remove py2 tests from the pre-commit check once we
>> stop supporting Python2?
>> The scan link is:
>> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>>
>> Thanks!
>>
>> Ning.
>>
>>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Kyle Weaver
Python 2 isn't the reason the test is failing; that's just a warning. The
actual error is at the very end of the log (it looks familiar to me, though
I don't see a JIRA for it):

<_Rendezvous of RPC that terminated with:
status = StatusCode.UNIMPLEMENTED
details = "Method
org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
unimplemented"
debug_error_string =
"{"created":"@1568424715.449291418","description":"Error received from peer
ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":"Method
org.apache.beam.model.job_management.v1.JobService/GetJobMetrics is
unimplemented","grpc_status":12}"
>

Kyle Weaver | Software Engineer | github.com/ibzib | kcwea...@google.com


On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:

> Hi! I've been seeing some errors during "Python PreCommit".
> I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
> releases of Apache Beam will soon support Python 3 only. 'You are using
> Apache Beam with Python 2. '"
> Is there any plan to remove py2 tests from the pre-commit check once we
> stop supporting Python2?
> The scan link is:
> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>
> Thanks!
>
> Ning.
>
>


Re: portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Ahmet Altay
To clarify, are they errors or warnings? There is a plan to stop supporting
Python 2 by around the end of the year. +Valentyn Tymofieiev shared
details about it earlier on the dev@ list.

On Mon, Sep 16, 2019 at 11:34 AM Ning Kang  wrote:

> Hi! I've been seeing some errors during "Python PreCommit".
> I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
> releases of Apache Beam will soon support Python 3 only. 'You are using
> Apache Beam with Python 2. '"
> Is there any plan to remove py2 tests from the pre-commit check once we
> stop supporting Python2?
> The scan link is:
> https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch
>
> Thanks!
>
> Ning.
>
>


Re: MQTT to Python SDK

2019-09-16 Thread Ahmet Altay
A framework for the Python SDK to use a native unbounded connector does not
exist yet. You might be able to use the same connector from Java using
cross-language transforms.

/cc +Chamikara Jayalath 
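
A rough sketch of what that could look like from the Python side, just to show
the moving parts: the URN and the service address below are placeholders, the
exact helper names may vary by Beam version, and this is not a working MQTT
connector.

import apache_beam as beam
from apache_beam.transforms.external import ExternalTransform
from apache_beam.transforms.external import ImplicitSchemaPayloadBuilder

with beam.Pipeline() as p:
    messages = (
        p
        # Python-side stub; the real transform lives in the Java SDK and is
        # expanded by a separately started expansion service.
        | 'ReadViaJava' >> ExternalTransform(
            'beam:external:java:some_io:read:v1',  # hypothetical URN
            ImplicitSchemaPayloadBuilder({'topic': u'my-topic'}),
            'localhost:8097'))  # address of the running expansion service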

On Mon, Sep 16, 2019 at 11:00 AM Lucas Magalhães <
lucas.magalh...@paralelocs.com.br> wrote:

> Hello all!
>
> I'm starting a new project here and the main source is MQTT.
>
> I couldn't find any documentation about how to develop an unbounded
> connector.
>
> Could anyone send me some instructions or a guideline?
>
> Thanks a lot
>
> --
> Lucas Magalhães,
> CTO
>
> Paralelo CS - Consultoria e Serviços
> Tel: +55 (11) 3090-5557
> Cel: +55 (11) 99420-4667
> lucas.magalh...@paralelocs.com.br
>
> www.paralelocs.com.br
>


The state of external transforms in Beam

2019-09-16 Thread Chad Dombrova
Hi all,
There was some interest in this topic at the Beam Summit this week (btw,
great job to everyone involved!), so I thought I’d try to summarize the
current state of things.
First, let me explain the idea behind external transforms for the
uninitiated.

Problem:

   - there’s a transform that you want to use, but it’s not available in
   your desired language. IO connectors are a good example: there are many
   available in the Java SDK, but not so many in Python or Go.

Solution:

   1. Create a stub transform in your desired language (e.g. Python) whose
   primary role is to serialize the parameters passed to that transform
   2. When you run your portable pipeline, just prior to it being sent to
   the Job Service for execution, your stub transform’s payload is first sent
   to the “Expansion Service” that’s running in the native language (Java),
   where the payload is used to construct an instance of the native transform,
   which is then expanded and converted to a protobuf and sent back to the
   calling process (Python).
   3. The protobuf representation of the expanded transform gets integrated
   back into the pipeline that you’re submitting
   4. Steps 2-3 are repeated for each external transform in your pipeline
   5. Then the whole pipeline gets sent to the Job Service to be invoked on
   Flink/Spark/etc
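
To make step 2 a bit more concrete, here is a rough, hand-simplified sketch of
what the call to the Expansion Service amounts to under the hood. The module,
message, and field names below reflect my reading of the portability protos;
treat this as an outline rather than an exact, version-stable API:

import grpc
from apache_beam.portability.api import beam_expansion_api_pb2
from apache_beam.portability.api import beam_expansion_api_pb2_grpc

def expand_via_service(transform_proto, components, address='localhost:8097'):
    # transform_proto: the stub transform, with its serialized payload attached.
    # components: coders/environments/etc. that the stub transform references.
    channel = grpc.insecure_channel(address)
    stub = beam_expansion_api_pb2_grpc.ExpansionServiceStub(channel)
    response = stub.Expand(
        beam_expansion_api_pb2.ExpansionRequest(
            components=components,
            transform=transform_proto,
            namespace='external_1'))  # prefix for ids minted by the service
    # response.transform / response.components describe the expanded Java
    # transform and everything it needs; these get merged back into the
    # submitting pipeline (step 3).
    return response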

--

Now on to my journey to get PubsubIO working in python on Flink.

The first issue I encountered was that there was a lot of boilerplate
involved in serializing the stub python transform’s parameters so they can
be sent to the expansion service.

I created a PR to make this simpler, which has just been merged to master:
https://github.com/apache/beam/pull/9098

With this feature in place, if you’re using python 3.7 you can use a
dataclass and the typing module to create your transform and describe your
schema in one go. For example:

@dataclasses.dataclass
class MyAwesomeTransform(beam.ExternalTransform):
  URN = 'beam:external:fakeurn:v1'

  integer_example: int
  string_example: str
  list_of_strings: List[str]
  optional_kv: Optional[Tuple[str, float]] = None
  optional_integer: Optional[int] = None
  expansion_service: dataclasses.InitVar[Optional[str]] = None

For earlier versions of python, you can use typing.NamedTuple to declare
your schema.

MyAwesomeSchema = typing.NamedTuple(
    'MyAwesomeSchema',
    [
        ('integer_example', int),
        ('string_example', unicode),
        ('list_of_strings', List[unicode]),
        ('optional_kv', Optional[Tuple[unicode, float]]),
        ('optional_integer', Optional[int]),
    ]
)

There’s also an option to generate the schema implicitly based on the
value(s) you wish to serialize.
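
Either way, the resulting stub is applied like any other PTransform. A hedged
usage sketch, using the dataclass version above (it assumes apache_beam is
imported as beam, an expansion service is running at the placeholder address,
and the fake URN is registered there):

with beam.Pipeline() as p:
    result = (
        p
        | 'Make input' >> beam.Create([u'a', u'b'])
        | 'Call into Java' >> MyAwesomeTransform(
            integer_example=1,
            string_example=u'hello',
            list_of_strings=[u'x', u'y'],
            expansion_service='localhost:8097'))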

There was a slight tangent in implementing this feature, in that requesting
a coder for typing.List resulted in a pickle coder instead of IterableCoder.
That's bad because only standard/portable coders can be used for expansion
in Java (for obvious reasons), so as a convenience that was solved here:
https://github.com/apache/beam/pull/9344

The next issue that I encountered was that python did not track the
boundedness of PCollections, which made it impossible to use the expansion
service to create unbounded writes. That’s been solved and merged here:
https://github.com/apache/beam/pull/9426

So that brings us to the actual PR for adding external transform support
for PubsubIO: https://github.com/apache/beam/pull/9268

The PR works, but with one big caveat: in order to use it you must build
your Java containers with this special commit:
https://github.com/chadrik/beam/commit/d12b99084809ec34fcf0be616e94301d3aca4870

That commit solves 2 problems:

   1. Adds the pubsub Java deps so that they’re available in our portable
   pipeline
   2. Makes the coder for the PubsubIO message-holder type, PubsubMessage,
   available as a standard coder. This is required because both PubsubIO.Read
   and PubsubIO.Write expand to ParDos which pass along these PubsubMessage
   objects, but only “standard” (i.e. portable) coders can be used, so we have
   to hack it to make PubsubMessage appear as a standard coder.

More details:

   - There’s a similar magic commit required for Kafka external transforms
   - The Jira issue for this problem is here:
   https://jira.apache.org/jira/browse/BEAM-7870
   - For problem #2 above there seems to be some consensus forming around
   using Avro or schema/row coders to send compound types in a portable way.
   Here’s the PR for making row coders portable
   https://github.com/apache/beam/pull/9188
   - I don’t really have any ideas for problem #1

So the portability expansion system works, and now it’s time to sand off
some of the rough corners. I’d love to hear others’ thoughts on how to
resolve some of these remaining issues.

-chad


portableWordCountBatch and portableWordCountStreaming failing in Python PreCommit

2019-09-16 Thread Ning Kang
Hi! I've been seeing some errors during "Python PreCommit".
I'm seeing "UserWarning: You are using Apache Beam with Python 2. New
releases of Apache Beam will soon support Python 3 only. 'You are using
Apache Beam with Python 2. '"
Is there any plan to remove py2 tests from the pre-commit check once we
stop supporting Python2?
The scan link is:
https://scans.gradle.com/s/vujoeo62uyfpi/console-log?task=:sdks:python:test-suites:portable:py2:portableWordCountBatch

Thanks!

Ning.


MQTT to Python SDK

2019-09-16 Thread Lucas Magalhães
Hello all!

I'm starting a new project here and the main source is MQTT.

I couldn't find any documentation about how to develop an unbounded
connector.

Could anyone send me some instructions or a guideline?

Thanks a lot

-- 
Lucas Magalhães,
CTO

Paralelo CS - Consultoria e Serviços
Tel: +55 (11) 3090-5557
Cel: +55 (11) 99420-4667
lucas.magalh...@paralelocs.com.br

www.paralelocs.com.br


New contributor to BEAM SQL

2019-09-16 Thread Kirill Kozlov
Hello everyone!

My name is Kirill Kozlov, I recently joined a Dataflow team at Google and
will be working on SQL filter pushdown.
Can I get permission to work issues in jira, my username is: kirillkozlov
Looking forward to developing Beam together!

Thank you,
Kirill Kozlov


Re: using avro instead of json for BigQueryIO.Write

2019-09-16 Thread Steve Niemitz
Our experience has actually been that avro is more efficient than even
parquet, but that might also be skewed by our datasets.

I might try to take a crack at this. I found
https://issues.apache.org/jira/browse/BEAM-2879 tracking it (which
coincidentally references my thread from a couple of years ago on the read
side of this :) ).

On Mon, Sep 16, 2019 at 1:38 PM Reuven Lax  wrote:

> It's been talked about, but nobody's done anything. There are some
> difficulties related to type conversion (json and avro don't support the
> same types), but if those are overcome then an avro version would be much
> more efficient. I believe Parquet files would be even more efficient if you
> wanted to go that path, but there might be more code to write (as we
> already have some code in the codebase to convert between TableRows and
> Avro).
>
> Reuven
>
> On Mon, Sep 16, 2019 at 10:33 AM Steve Niemitz 
> wrote:
>
>> Has anyone investigated using avro rather than json to load data into
>> BigQuery using BigQueryIO (+ FILE_LOADS)?
>>
>> I'd be interested in enhancing it to support this, but I'm curious if
>> there's any prior work here.
>>
>


Re: using avro instead of json for BigQueryIO.Write

2019-09-16 Thread Reuven Lax
It's been talked about, but nobody's done anything. There are some
difficulties related to type conversion (json and avro don't support the
same types), but if those are overcome then an avro version would be much
more efficient. I believe Parquet files would be even more efficient if you
wanted to go that path, but there might be more code to write (as we
already have some code in the codebase to convert between TableRows and
Avro).
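
To illustrate the kind of mapping involved, here is a purely hypothetical
sketch (not the conversion code that exists in Beam) of turning BigQuery field
types into Avro ones. The logical-type choices are assumptions about what
BigQuery load jobs accept, and the JSON-unfriendly types are exactly where the
friction shows up:

BQ_TO_AVRO_TYPE = {
    'STRING': 'string',
    'BYTES': 'bytes',
    'INTEGER': 'long',
    'FLOAT': 'double',
    'BOOLEAN': 'boolean',
    # No direct JSON equivalents for these:
    'TIMESTAMP': {'type': 'long', 'logicalType': 'timestamp-micros'},
    'DATE': {'type': 'int', 'logicalType': 'date'},
    'NUMERIC': {'type': 'bytes', 'logicalType': 'decimal',
                'precision': 38, 'scale': 9},
}

def bq_field_to_avro(field):
    """Map one TableFieldSchema-style dict to an Avro record field."""
    avro_type = BQ_TO_AVRO_TYPE[field['type']]
    if field.get('mode', 'NULLABLE') == 'NULLABLE':
        # Nullable columns become Avro unions with null.
        avro_type = ['null', avro_type]
    return {'name': field['name'], 'type': avro_type}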

Reuven

On Mon, Sep 16, 2019 at 10:33 AM Steve Niemitz  wrote:

> Has anyone investigated using avro rather than json to load data into
> BigQuery using BigQueryIO (+ FILE_LOADS)?
>
> I'd be interested in enhancing it to support this, but I'm curious if
> there's any prior work here.
>


using avro instead of json for BigQueryIO.Write

2019-09-16 Thread Steve Niemitz
Has anyone investigated using avro rather than json to load data into
BigQuery using BigQueryIO (+ FILE_LOADS)?

I'd be interested in enhancing it to support this, but I'm curious if
there's any prior work here.


Beam Dependency Check Report (2019-09-16)

2019-09-16 Thread Apache Jenkins Server

High Priority Dependency Updates Of Beam Python SDK:

  Dependency Name | Current Version | Latest Version | Current Release Date | Latest Release Date | JIRA Issue
  mock            | 2.0.0           | 3.0.5          | 2019-05-20           | 2019-05-20          | BEAM-7369
  oauth2client    | 3.0.0           | 4.1.3          | 2018-12-10           | 2018-12-10          | BEAM-6089
  Sphinx          | 1.8.5           | 2.2.0          | 2019-05-20           | 2019-08-19          | BEAM-7370

High Priority Dependency Updates Of Beam Java SDK:

  Dependency Name | Current Version | Latest Version | Current Release Date | Latest Release Date | JIRA Issue
  com.github.ben-manes.versions:com.github.ben-manes.versions.gradle.plugin | 0.20.0 | 0.25.0 | 2019-02-11 | 2019-09-16 | BEAM-6645
  com.github.spotbugs:spotbugs | 3.1.12 | 4.0.0-beta3 | 2019-03-01 | 2019-06-23 | BEAM-7792
  com.github.spotbugs:spotbugs-annotations | 3.1.12 | 4.0.0-beta3 | 2019-03-01 | 2019-06-23 | BEAM-6951
  javax.servlet:javax.servlet-api | 3.1.0 | 4.0.1 | 2013-04-25 | 2018-04-20 | BEAM-5750
  org.conscrypt:conscrypt-openjdk | 1.1.3 | 2.2.1 | 2018-06-04 | 2019-08-08 | BEAM-5748
  org.eclipse.jetty:jetty-server | 9.2.10.v20150310 | 10.0.0-alpha0 | 2015-03-10 | 2019-07-11 | BEAM-5752
  org.eclipse.jetty:jetty-servlet | 9.2.10.v20150310 | 10.0.0-alpha0 | 2015-03-10 | 2019-07-11 | BEAM-5753
  Gradle | 5.2.1 | 5.6.2 | 2019-08-19 | 2019-09-09 | BEAM-8002

A dependency update is high priority if it satisfies one of the following criteria:

 - It has a major version update available, e.g. org.assertj:assertj-core 2.5.0 -> 3.10.0;
 - It is over 3 minor versions behind the latest version, e.g. org.tukaani:xz 1.5 -> 1.8;
 - The current version is more than 180 days behind the latest release, e.g. com.google.auto.service:auto-service 2014-10-24 -> 2017-12-11.

In Beam, we make a best-effort attempt at keeping all dependencies up-to-date.
In the future, issues will be filed and tracked for these automatically,
but in the meantime you can search for existing issues or open a new one.

For more information: Beam Dependency Guide