Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #5951

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Comment Edited] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kishan Kumar (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361880#comment-16361880
 ] 

Kishan Kumar edited comment on BEAM-3647 at 2/13/18 6:37 AM:
-

Thanks, [~kenn] and [~kedin]. I want to add that we currently use two different 
templates for these use cases because the coder has to be defined separately in 
each one. If we could read the coder at run time, both jobs could be handled by 
a single template, since in both cases the SQL selection is done on the roll 
number only.


was (Author: kishank):
Thanks,[~kenn] and [~kedin]

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Major
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder so that, based on the data, all fields are 
> treated as strings when reading, or a dynamic option to read the coder from 
> a JSON file on GCS (whose location can be passed at runtime using a 
> ValueProvider) and parse the data on that basis.
>  
>  
> *Examples*: I have two tables. Table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has the columns (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am only sorting on the roll number, so if the coder could 
> be read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  
>  
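
A minimal Java sketch of the runtime-coder idea requested above, assuming a 
hypothetical pipeline option getCoderConfigPath and a hypothetical JSON schema 
file on GCS; ValueProvider, FileSystems and DoFn are existing Beam APIs, but 
the option name and file layout are made up for illustration, not a feature 
Beam provides today.

{code:java}
import java.io.IOException;
import java.io.InputStream;
import java.nio.channels.Channels;
import java.util.Scanner;

import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.transforms.DoFn;

/** Options carrying the GCS location of the (hypothetical) coder/schema JSON file. */
interface CoderConfigOptions extends PipelineOptions {
  ValueProvider<String> getCoderConfigPath();
  void setCoderConfigPath(ValueProvider<String> value);
}

/** Reads the schema JSON at worker start-up so one template can serve several tables. */
class ParseWithRuntimeSchemaFn extends DoFn<String, String> {
  private final ValueProvider<String> coderConfigPath;
  private transient String schemaJson;

  ParseWithRuntimeSchemaFn(ValueProvider<String> coderConfigPath) {
    this.coderConfigPath = coderConfigPath;
  }

  @Setup
  public void setup() throws IOException {
    // ValueProvider.get() is only called at run time, so the staged template stays generic.
    try (InputStream in = Channels.newInputStream(
        FileSystems.open(FileSystems.matchNewResource(coderConfigPath.get(), false)))) {
      schemaJson = new Scanner(in, "UTF-8").useDelimiter("\\A").next();
    }
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    // Placeholder: interpret every field of the row as a string according to schemaJson.
    c.output(c.element());
  }
}
{code}

With something like this, the template would be launched once and pointed at a 
different schema file per table at run time, instead of baking the coder into 
two separate templates.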



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kishan Kumar (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361880#comment-16361880
 ] 

Kishan Kumar commented on BEAM-3647:


Thanks,[~kenn] and [~kedin]

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Major
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder so that, based on the data, all fields are 
> treated as strings when reading, or a dynamic option to read the coder from 
> a JSON file on GCS (whose location can be passed at runtime using a 
> ValueProvider) and parse the data on that basis.
>  
>  
> *Examples*: I have two tables. Table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has the columns (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am only sorting on the roll number, so if the coder could 
> be read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #908

2018-02-12 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Two fixes to common URN handling

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam7 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c14fab0f66374e572e5b0681fe3b652dff6185de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c14fab0f66374e572e5b0681fe3b652dff6185de
Commit message: "Merge pull request #4671: Two fixes to common URN handling"
 > git rev-list 0d91a7920b6ada1424248d4fb7301db23353891a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1003993540603364672.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1628260077414904254.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1349213299956918081.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7046350535500644856.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2495660312971450225.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7936546527974795060.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more 

Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT #138

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-3695) beam_PostCommit_Python_ValidatesContainer_Dataflow red for a few days

2018-02-12 Thread Kenneth Knowles (JIRA)
Kenneth Knowles created BEAM-3695:
-

 Summary: beam_PostCommit_Python_ValidatesContainer_Dataflow red 
for a few days
 Key: BEAM-3695
 URL: https://issues.apache.org/jira/browse/BEAM-3695
 Project: Beam
  Issue Type: Bug
  Components: build-system, runner-dataflow, testing
Reporter: Kenneth Knowles
Assignee: Alan Myrvold


I haven't looked into the logs, but this test last passed 5 days ago.

Can the job or its notifications be disabled while it is under development, 
perhaps?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #28

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 92.63 KB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_messages.py
 -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying 

Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #5950

2018-02-12 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #27

2018-02-12 Thread Apache Jenkins Server
See 


Changes:

[kirpichov] Two fixes to common URN handling

--
[...truncated 97.35 KB...]
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_messages.py
 -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/evaluation_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/executor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/helper_transforms.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/sdf_direct_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/transform_evaluator.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/util.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/watermark_manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/experimental/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental
copying apache_beam/runners/experimental/python_rpc_direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying 
apache_beam/runners/experimental/python_rpc_direct/python_rpc_direct_runner.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/experimental/python_rpc_direct/server.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/experimental/python_rpc_direct
copying apache_beam/runners/job/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/manager.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/job/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/job
copying apache_beam/runners/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/fn_api_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/maptask_executor_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/portability
copying apache_beam/runners/portability/universal_local_runner_main.py -> 

[jira] [Commented] (BEAM-2591) Python shim for submitting to FlinkRunner

2018-02-12 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361730#comment-16361730
 ] 

Kenneth Knowles commented on BEAM-2591:
---

That's about right, but maybe "implement such a client VM" is almost a noop. 
Quoting [~robertwb]'s emailed reply: "I could imagine an "UniversalJavaClient" 
that is parameterized by the actual runner itself dynamically. But it may also 
make sense for Flink, Spark, etc. managers to serve this API directly when part 
of a service."

I would start with an easy hardcode and generalize when it looks like it makes 
sense. I actually think that the ReferenceRunner work may already have created 
the parameterizable JobService server ([~tgroh]?)

> Python shim for submitting to FlinkRunner
> -
>
> Key: BEAM-2591
> URL: https://issues.apache.org/jira/browse/BEAM-2591
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-flink, sdk-py-core
>Reporter: Kenneth Knowles
>Priority: Major
>  Labels: portability
>
> Whatever the result of https://s.apache.org/beam-job-api, Python users will 
> need to be able to pass --runner=FlinkRunner and have it work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4671: Two fixes to common URN handling

2018-02-12 Thread jkff
This is an automated email from the ASF dual-hosted git repository.

jkff pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit c14fab0f66374e572e5b0681fe3b652dff6185de
Merge: 0d91a79 3e0fc05
Author: Eugene Kirpichov 
AuthorDate: Mon Feb 12 18:10:55 2018 -0800

Merge pull request #4671: Two fixes to common URN handling

Two fixes to common URN handling

 .../main/java/org/apache/beam/runners/core/construction/UrnUtils.java | 4 ++--
 .../beam/runners/fnexecution/graph/LengthPrefixUnknownCoders.java | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
j...@apache.org.


[beam] branch master updated (0d91a79 -> c14fab0)

2018-02-12 Thread jkff
This is an automated email from the ASF dual-hosted git repository.

jkff pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 0d91a79  [BEAM-3176] support drop table (#4184)
 add 3e0fc05  Two fixes to common URN handling
 new c14fab0  Merge pull request #4671: Two fixes to common URN handling

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../main/java/org/apache/beam/runners/core/construction/UrnUtils.java | 4 ++--
 .../beam/runners/fnexecution/graph/LengthPrefixUnknownCoders.java | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
j...@apache.org.


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5949

2018-02-12 Thread Apache Jenkins Server
See 




Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #43

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-2591) Python shim for submitting to FlinkRunner

2018-02-12 Thread Thomas Weise (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361635#comment-16361635
 ] 

Thomas Weise commented on BEAM-2591:


[~kenn] will the "production/portable endpoint" be provided by the "Flink 
Client JVM" in [https://s.apache.org/portable-flink-runner-overview] and is the 
assumption that every runner will implement such a client JVM?

> Python shim for submitting to FlinkRunner
> -
>
> Key: BEAM-2591
> URL: https://issues.apache.org/jira/browse/BEAM-2591
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-flink, sdk-py-core
>Reporter: Kenneth Knowles
>Priority: Major
>  Labels: portability
>
> Whatever the result of https://s.apache.org/beam-job-api, Python users will 
> need to be able to pass --runner=FlinkRunner and have it work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PerformanceTests_TextIOIT #151

2018-02-12 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Python #907

2018-02-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3074] Serialize DoFns by portable id in Dataflow runner.

[jbonofre] [BEAM-3692] Remove maven deploy plugin configuration with skip in the

[XuMingmin] [BEAM-3176] support drop table (#4184)

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam4 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0d91a7920b6ada1424248d4fb7301db23353891a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0d91a7920b6ada1424248d4fb7301db23353891a
Commit message: "[BEAM-3176] support drop table (#4184)"
 > git rev-list a0071ed64569982d19ccd03047600d15fd743fdc # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3503218485134743109.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6374752903158586051.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1233305736604954643.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins600297972479041957.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/43/41/033a273f9a25cb63050a390ee8397acbc7eae2159195d85f06f17e7be45a/setuptools-38.5.1-py2.py3-none-any.whl#md5=908b8b5e50bf429e520b2b5fa1b350e5
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4611644791443606822.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins47567281944981606.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #42

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 509.95 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.3.0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.3.0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> 
apache-beam-2.3.0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.3.0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> 
apache-beam-2.3.0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> 
apache-beam-2.3.0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> 
apache-beam-2.3.0/apache_beam/utils
Writing apache-beam-2.3.0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.3.0' (and everything under it)
Processing ./dist/apache-beam-2.3.0.tar.gz
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.3.0)
/tmp/groovy-generated-7662706818481476792-tmpdir/apache-beam-2.3.0/sdks/python/temp_virtualenv/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
/tmp/groovy-generated-7662706818481476792-tmpdir/apache-beam-2.3.0/sdks/python/temp_virtualenv/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.3.0)
Collecting dill==0.2.6 (from apache-beam==2.3.0)
Collecting grpcio<2,>=1.0 (from apache-beam==2.3.0)
  Using cached grpcio-1.9.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting httplib2<0.10,>=0.8 (from apache-beam==2.3.0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.3.0)
  Using cached mock-2.0.0-py2.py3-none-any.whl
Collecting oauth2client<5,>=2.0.1 (from apache-beam==2.3.0)
  Using cached oauth2client-4.1.2-py2.py3-none-any.whl
Collecting protobuf<4,>=3.5.0.post1 (from apache-beam==2.3.0)
  Using cached protobuf-3.5.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.3.0)
Collecting pyvcf<0.7.0,>=0.6.8 (from apache-beam==2.3.0)
Collecting six<1.12,>=1.9 (from apache-beam==2.3.0)
  Using cached six-1.11.0-py2.py3-none-any.whl
Collecting typing<3.7.0,>=3.6.0 (from apache-beam==2.3.0)
  Using cached typing-3.6.4-py2-none-any.whl
Collecting futures<4.0.0,>=3.1.1 (from apache-beam==2.3.0)
  Using cached futures-3.2.0-py2-none-any.whl
Collecting hdfs3<0.4.0,>=0.3.0 (from apache-beam==2.3.0)
  Using cached hdfs3-0.3.0-py2.py3-none-any.whl
Collecting google-apitools<=0.5.20,>=0.5.18 (from apache-beam==2.3.0)
  Using cached google_apitools-0.5.20-py2-none-any.whl
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from 
apache-beam==2.3.0)
Collecting googledatastore==7.0.1 (from apache-beam==2.3.0)
Collecting google-cloud-pubsub==0.26.0 (from apache-beam==2.3.0)
  Using cached google_cloud_pubsub-0.26.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery==0.25.0 (from apache-beam==2.3.0)
  Using cached google_cloud_bigquery-0.25.0-py2.py3-none-any.whl
Collecting enum34>=1.0.4 (from grpcio<2,>=1.0->apache-beam==2.3.0)
  Using cached enum34-1.1.6-py2-none-any.whl
Collecting funcsigs>=1; python_version < "3.3" (from 
mock<3.0.0,>=1.0.1->apache-beam==2.3.0)
  Using cached funcsigs-1.0.2-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.3.0)
  Using cached pbr-3.1.1-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<5,>=2.0.1->apache-beam==2.3.0)
  Using cached rsa-3.4.2-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from 
oauth2client<5,>=2.0.1->apache-beam==2.3.0)
  Using cached pyasn1_modules-0.2.1-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<5,>=2.0.1->apache-beam==2.3.0)
  Using cached pyasn1-0.4.2-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
./temp_virtualenv/lib/python2.7/site-packages (from 
protobuf<4,>=3.5.0.post1->apache-beam==2.3.0)
Collecting fasteners>=0.14 (from 
google-apitools<=0.5.20,>=0.5.18->apache-beam==2.3.0)
  Using cached fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from 
proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.3.0)
Collecting 

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT #137

2018-02-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3074] Serialize DoFns by portable id in Dataflow runner.

[jbonofre] [BEAM-3692] Remove maven deploy plugin configuration with skip in the

[XuMingmin] [BEAM-3176] support drop table (#4184)

--
[...truncated 717.01 KB...]
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.4.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.4.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev355-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.1 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #41

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 3.15 MB...]
INFO: Stopped Spark@68062acb{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Feb 12, 2018 11:35:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 3.0 in stage 0.0 (TID 3) on localhost, executor driver: 
java.io.IOException (Connection from /127.0.0.1:45763 closed) [duplicate 2]
Feb 12, 2018 11:35:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Feb 12, 2018 11:35:08 PM org.apache.spark.network.client.TransportClientFactory 
createClient
INFO: Found inactive connection to /127.0.0.1:45763, creating a new one.
Feb 12, 2018 11:35:08 PM org.apache.spark.network.client.TransportClientFactory 
createClient
INFO: Successfully created connection to /127.0.0.1:45763 after 7 ms (0 ms 
spent in bootstraps)
Feb 12, 2018 11:35:08 PM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:45763/jars/datastore-v1-protos-1.3.0.jar to 
/tmp/spark-c28bd06c-cf7a-4e01-b9e3-e3f905cabb80/userFiles-3796d348-f7f3-4c97-9bed-490ecee5d6e0/fetchFileTemp7658653485519795242.tmp
Feb 12, 2018 11:35:08 PM 
org.apache.spark.network.server.TransportRequestHandler lambda$respond$0
SEVERE: Error sending result 
StreamResponse{streamId=/jars/datastore-v1-protos-1.3.0.jar, byteCount=447403, 
body=FileSegmentManagedBuffer{file=/tmp/groovy-generated-7642397720739362925-tmpdir/.m2/repository/com/google/cloud/datastore/datastore-v1-protos/1.3.0/datastore-v1-protos-1.3.0.jar,
 offset=0, length=447403}} to /127.0.0.1:38190; closing connection
java.lang.AbstractMethodError
at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
at 
io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107)
at 
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:811)
at 
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:724)
at 
io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:739)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:731)
at 
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:817)
at 
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:724)
at 
io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:305)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:739)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:802)
at 
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:815)
at 
io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:795)
at 
io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:832)
at 
io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:1032)
at 
io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:296)
at 
org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:192)
at 
org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:148)
at 
org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
at 
org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
at 
io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
at 
io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
at 

Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5948

2018-02-12 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #26

2018-02-12 Thread Apache Jenkins Server
-temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/container/local/lib/python2.7/site-packages/setuptools/dist.py>:355:
 UserWarning: Normalizing '2.4.0.dev' to '2.4.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>:166:
 DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok

--
Ran 1 test in 399.161s

OK

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180212-230802
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python@sha256:f7ab700669cf82e0cdb1ecb1290c2bc097109c245853cc186405715053d2f267
Deleted: sha256:68aef607f54433a56cbeb2f7f1c01d669cd498a4c27e9e475f8097cd726e76dd
Deleted: sha256:ec59a29ce7d81a615f716f7cd8454230e50c6f92fdb57fe82c848e292ad43dcd
Deleted: sha256:fc1644140ed2c811e29a9bddcd77b9696e90d1028418f5e6549ad63075157ba5
Deleted: sha256:57aece7195145a55b9864a8b6bff7835126c871afe7cd16832f76226ce678553
Deleted: sha256:3322aa8bc138e968a802e609f4167fde233935f14d5494b8abbb74ab804af154
Deleted: sha256:c87fbef349b1383def946b388ea9cc8045b8a0c76a57d76aba7b890a267854fa
Deleted: sha256:6cb570d34526050c65762263fa2671024e91f1da98760fa51c0d27809a5f02d9
Deleted: sha256:350b0db2cd6be6fcff58adba45fe20a8b465d38844dfb47d7fdc6b330cda3089
gcloud container images delete $CONTAINER:$TAG --quiet
Usage: gcloud container [optional flags] 
  group may be   clusters | node-pools | operations
  command may be get-server-config

Deploy and manage clusters of machines for running containers.

flags:
  Run `gcloud container --help`
  for the full list of available flags for this command.

global flags:
  Run `gcloud -h` for a description of flags available to all commands.

command groups:
  clusters   Deploy and teardown Google Container Engine clusters.
  node-pools Create and delete operations for Google Container
 Engine node pools.
  operations Get and list operations for Google Container Engine
 clusters.

commands:
  get-server-config  Get Container Engine server config.


For more detailed information on this command and its flags, run:
  gcloud container --help

ERROR: (gcloud.container) Invalid choice: 'images'.

Valid choices are [clusters, get-server-config, node-pools, operations].
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user aljoscha.kret...@gmail.com
Not sending mail to unregistered user eh...@google.com
Not sending mail to unregistered user z...@giggles.nyc.corp.google.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user xuming...@users.noreply.github.com
Not sending mail to unregistered user j...@nanthrax.net
Not sending mail to unregistered user pawel.pk.kaczmarc...@gmail.com
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aromanenko@gmail.com
Not sending mail to unregistered user joey.bar...@gmail.com
Not sending mail to unregistered user dariusz.aniszew...@polidea.com
Not sending mail to unregistered user mott...@gmail.com
Not sending mail to unregistered user ccla...@bluewin.ch
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user c...@google.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user daniel.o.program...@gmail.com
Not sending mail to unregistered user mari...@mariagh.svl.corp.google.com
Not sending mail to unregistered user g...@telligent-data.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user jiang...@gmail.com
Not sending mail to unregistered user fjetum...@gmail.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user k...@google.com
Not sending mail to unregistered user mair...@google.com


[jira] [Assigned] (BEAM-3608) Pre-shade Guava for things we want to keep using

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3608:
-

Assignee: (was: Kenneth Knowles)

> Pre-shade Guava for things we want to keep using
> 
>
> Key: BEAM-3608
> URL: https://issues.apache.org/jira/browse/BEAM-3608
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core, sdk-java-core
>Reporter: Kenneth Knowles
>Priority: Major
>
> Instead of shading as part of our build, we can shade before build so that it 
> is apparent when reading code, and in IDEs, that a particular class resides 
> in a hidden namespace.
> {{import com.google.common.reflect.TypeToken}}
> becomes something like
> {{import org.apache.beam.private.guava21.com.google.common.reflect.TypeToken}}
> This lets us trivially ban `org.apache.beam.private` from public APIs 
> unless they are annotated {{@Internal}}, and it ensures that sharing 
> between our own modules is never again broken by shading.
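
As a rough illustration of how such a ban could be enforced (a sketch only; 
the @Internal annotation type and the classes being scanned are assumptions 
for the example), a reflection-based check could walk public signatures and 
flag any type under org.apache.beam.private that is not explicitly opted in:

{code:java}
import java.lang.annotation.Annotation;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.stream.Stream;

/**
 * Sketch of a surface check: a public API class may not expose types from the
 * pre-shaded org.apache.beam.private namespace unless it is annotated @Internal.
 */
public class PrivateNamespaceBanCheck {

  private static final String BANNED_PREFIX = "org.apache.beam.private.";

  /** True if the method's return or parameter types reach into the private namespace. */
  static boolean exposesBannedType(Method method) {
    Stream<Class<?>> exposed = Stream.concat(
        Stream.of(method.getReturnType()), Arrays.stream(method.getParameterTypes()));
    return exposed.anyMatch(t -> t.getName().startsWith(BANNED_PREFIX));
  }

  /** True if the class keeps pre-shaded types off its public surface, or is marked internal. */
  static boolean isCompliant(Class<?> publicApiClass, Class<? extends Annotation> internal) {
    if (publicApiClass.isAnnotationPresent(internal)) {
      return true; // explicitly allowed to expose internals
    }
    return Arrays.stream(publicApiClass.getMethods())
        .noneMatch(PrivateNamespaceBanCheck::exposesBannedType);
  }
}
{code}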



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3608) Pre-shade Guava for things we want to keep using

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3608:
--
Labels:   (was: portability)

> Pre-shade Guava for things we want to keep using
> 
>
> Key: BEAM-3608
> URL: https://issues.apache.org/jira/browse/BEAM-3608
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core, sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Major
>
> Instead of shading as part of our build, we can shade before build so that it 
> is apparent when reading code, and in IDEs, that a particular class resides 
> in a hidden namespace.
> {{import com.google.common.reflect.TypeToken}}
> becomes something like
> {{import org.apache.beam.private.guava21.com.google.common.reflect.TypeToken}}
> This lets us trivially ban `org.apache.beam.private` from public APIs 
> unless they are annotated {{@Internal}}, and it ensures that sharing 
> between our own modules is never again broken by shading.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía resolved BEAM-3693.

   Resolution: Fixed
Fix Version/s: Not applicable

Thanks for reporting. I am closing this since it is already solved and the fix 
should be included as part of the 2.3.0 release. Please reopen if it still does 
not work with Beam 2.3.0.

Also remember that Beam is not yet Java 9 compatible. For progress on this, 
follow BEAM-2530.

> On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws 
> Invalid value for MonthOfYear
> 
>
> Key: BEAM-3693
> URL: https://issues.apache.org/jira/browse/BEAM-3693
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Affects Versions: 2.2.0
> Environment: Apache Maven 3.3.9
> Maven home: /usr/share/maven
> Java version: 9.0.4, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-9-oracle
> Default locale: en_NZ, platform encoding: UTF-8
> OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
> "unix"
> java version "1.8.0_161"
> Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
> Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:Ubuntu 16.04.3 LTS
> Release:16.04
> Codename:   xenial
>Reporter: Thiago Henrique Ramos da Mata
>Priority: Major
> Fix For: Not applicable
>
>
> The problem is caused by the package beam-sdks-java-core in the version 
> 2.2.0. This package was being loaded into the package my.projects.models.
> Here is the isolated program that replicates the same error message:
>  
> h3. /pom.xml
> {code:xml}
> <project xmlns="http://maven.apache.org/POM/4.0.0"
>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>     <groupId>com.mock.bug</groupId>
>     <artifactId>mock-bug</artifactId>
>     <version>0.123</version>
>     <dependencies>
>         <dependency>
>             <groupId>junit</groupId>
>             <artifactId>junit</artifactId>
>             <version>4.12</version>
>             <scope>test</scope>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.beam</groupId>
>             <artifactId>beam-sdks-java-core</artifactId>
>             <version>2.2.0</version>
>         </dependency>
>     </dependencies>
>     <build>
>         <plugins>
>             <plugin>
>                 <artifactId>maven-compiler-plugin</artifactId>
>                 <version>3.5</version>
>                 <configuration>
>                     <source>1.8</source>
>                     <target>1.8</target>
>                 </configuration>
>             </plugin>
>             <plugin>
>                 <groupId>org.apache.maven.plugins</groupId>
>                 <artifactId>maven-source-plugin</artifactId>
>                 <version>3.0.1</version>
>                 <executions>
>                     <execution>
>                         <id>attach-sources</id>
>                         <goals>
>                             <goal>jar</goal>
>                         </goals>
>                     </execution>
>                 </executions>
>             </plugin>
>         </plugins>
>     </build>
> </project>
> {code}
>
> h3. ./src/main/java/com/mock/bug/Main.java
> {code:java}
> package com.mock.bug;
>
> public class Main {
>     public static void main(String[] args) {
>         System.out.println("hello world");
>     }
> }
> {code}
> h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java
> {code:java}
> package com.mock.bug;
>
> import org.junit.Test;
> import static org.junit.Assert.*;
>
> public class DummyHealthCheckTest {
>     @Test
>     public void DummyCheckTest() {
>         boolean t = true;
>         assertTrue(t);
>     }
> }
> {code}
> h3. Command to fire the error message:
> {code:sh}
> mvn -U clean package
> {code}
> h3. Exception Message
> {code:sh}
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by 
> com.google.inject.internal.cglib.core.$ReflectUtils$1 
> (file:/usr/share/maven/lib/guice.jar) to method 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
> WARNING: Please consider reporting this to the maintainers of 
> com.google.inject.internal.cglib.core.$ReflectUtils$1
> WARNING: Use --illegal-access=warn to enable warnings of further illegal 
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> [INFO] Scanning for projects...
> [INFO]
>  
> [INFO] 
> 
> [INFO] Building mock-bug 0.123
> [INFO] 
> 
> [INFO] 
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
> [INFO] 
> [INFO] 

[beam] branch master updated: [BEAM-3176] support drop table (#4184)

2018-02-12 Thread mingmxu
This is an automated email from the ASF dual-hosted git repository.

mingmxu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 0d91a79  [BEAM-3176] support drop table (#4184)
0d91a79 is described below

commit 0d91a7920b6ada1424248d4fb7301db23353891a
Author: James Xu 
AuthorDate: Tue Feb 13 06:16:44 2018 +0800

[BEAM-3176] support drop table (#4184)

* [BEAM-3176] support drop table

* add UnitTest to assert table are removed from BeamQueryPlanner once droped

* fix findbugs found bugs..
---
 .../sql/src/main/codegen/data/Parser.tdd   |  3 +-
 .../sql/src/main/codegen/includes/parserImpls.ftl  | 17 +
 .../apache/beam/sdk/extensions/sql/BeamSqlCli.java |  8 +++
 .../beam/sdk/extensions/sql/impl/BeamSqlEnv.java   | 30 +++-
 .../extensions/sql/impl/parser/SqlDropTable.java   | 79 ++
 .../sql/meta/provider/TableProvider.java   |  7 ++
 .../meta/provider/kafka/KafkaTableProvider.java|  4 ++
 .../sql/meta/provider/text/TextTableProvider.java  |  4 ++
 .../sql/meta/store/InMemoryMetaStore.java  | 10 +++
 .../sdk/extensions/sql/meta/store/MetaStore.java   |  6 +-
 .../beam/sdk/extensions/sql/BeamSqlCliTest.java| 45 
 .../sql/impl/parser/BeamSqlParserTest.java | 12 
 .../sql/meta/store/InMemoryMetaStoreTest.java  |  4 ++
 13 files changed, 226 insertions(+), 3 deletions(-)

diff --git a/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd 
b/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd
index 09a5379..1afa73d 100644
--- a/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd
+++ b/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd
@@ -36,7 +36,8 @@
 
   # List of methods for parsing custom SQL statements.
   statementParserMethods: [
-"SqlCreateTable()"
+"SqlCreateTable()",
+"SqlDropTable()"
   ]
 
   # List of methods for parsing custom literals.
diff --git a/sdks/java/extensions/sql/src/main/codegen/includes/parserImpls.ftl 
b/sdks/java/extensions/sql/src/main/codegen/includes/parserImpls.ftl
index 136c728..ce1d2ae 100644
--- a/sdks/java/extensions/sql/src/main/codegen/includes/parserImpls.ftl
+++ b/sdks/java/extensions/sql/src/main/codegen/includes/parserImpls.ftl
@@ -87,3 +87,20 @@ SqlNode SqlCreateTable() :
 location, tbl_properties, select);
 }
 }
+
+/**
+ * DROP TABLE table_name
+ */
+SqlNode SqlDropTable() :
+{
+SqlParserPos pos;
+SqlIdentifier tblName;
+}
+{
+ { pos = getPos(); }
+
+tblName = SimpleIdentifier() {
+return new SqlDropTable(pos, tblName);
+}
+}
+
diff --git 
a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/BeamSqlCli.java
 
b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/BeamSqlCli.java
index ebac783..eadda35 100644
--- 
a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/BeamSqlCli.java
+++ 
b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/BeamSqlCli.java
@@ -24,6 +24,7 @@ import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
 import org.apache.beam.sdk.extensions.sql.impl.parser.BeamSqlParser;
 import org.apache.beam.sdk.extensions.sql.impl.parser.ParserUtils;
 import org.apache.beam.sdk.extensions.sql.impl.parser.SqlCreateTable;
+import org.apache.beam.sdk.extensions.sql.impl.parser.SqlDropTable;
 import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
 import org.apache.beam.sdk.extensions.sql.meta.Table;
 import org.apache.beam.sdk.extensions.sql.meta.store.MetaStore;
@@ -80,6 +81,8 @@ public class BeamSqlCli {
 
 if (sqlNode instanceof SqlCreateTable) {
   handleCreateTable((SqlCreateTable) sqlNode, metaStore);
+} else if (sqlNode instanceof SqlDropTable) {
+  handleDropTable((SqlDropTable) sqlNode);
 } else {
   PipelineOptions options = PipelineOptionsFactory.fromArgs(new String[] 
{}).withValidation()
   .as(PipelineOptions.class);
@@ -103,6 +106,11 @@ public class BeamSqlCli {
 env.registerTable(table.getName(), 
metaStore.buildBeamSqlTable(table.getName()));
   }
 
+  private void handleDropTable(SqlDropTable stmt) {
+metaStore.dropTable(stmt.tableName());
+env.deregisterTable(stmt.tableName());
+  }
+
   /**
* compile SQL, and return a {@link Pipeline}.
*/
diff --git 
a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
 
b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
index a8bc48e..11f1a46 100644
--- 
a/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
+++ 
b/sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/impl/BeamSqlEnv.java
@@ -17,7 +17,11 @@
  */
 package org.apache.beam.sdk.extensions.sql.impl;
 
+import 

[jira] [Assigned] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía reassigned BEAM-3693:
--

Assignee: (was: Kenneth Knowles)

> On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws 
> Invalid value for MonthOfYear
> 
>
> Key: BEAM-3693
> URL: https://issues.apache.org/jira/browse/BEAM-3693
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Affects Versions: 2.2.0
> Environment: Apache Maven 3.3.9
> Maven home: /usr/share/maven
> Java version: 9.0.4, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-9-oracle
> Default locale: en_NZ, platform encoding: UTF-8
> OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
> "unix"
> java version "1.8.0_161"
> Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
> Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:Ubuntu 16.04.3 LTS
> Release:16.04
> Codename:   xenial
>Reporter: Thiago Henrique Ramos da Mata
>Priority: Major
>
> The problem is caused by the package beam-sdks-java-core in the version 
> 2.2.0. This package was being loaded into the package my.projects.models.
> Here is the isolated program that replicates the same error message:
>  
> h3. /pom.xml
> {code:xml}
>  
> http://maven.apache.org/POM/4.0.0; 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
>  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd;>
> 4.0.0
> com.mock.bug
> mock-bug
> 0.123
> 
> 
> junit
> junit
> 4.12
> test
> 
> 
> org.apache.beam
> beam-sdks-java-core
> 2.2.0
> 
> 
> 
> 
> 
> maven-compiler-plugin
> 3.5
> 
> 1.8
> 1.8
> 
> 
> 
> org.apache.maven.plugins
> maven-source-plugin
> 3.0.1
> 
> 
> attach-sources
> 
> jar
> 
> 
> 
> 
> 
> 
> 
> {code}
>
> h3. ./src/main/java/com/mock/bug/Main.java
> {code:java}
>  package com.mock.bug;
> public class Main {
> public void main(String[] input){
> System.out.println("hello workd");
> }
> }
> {code}
> h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java
> {code:java}
> package com.mock.bug;
> import org.junit.Test;
> import static org.junit.Assert.*;
> public class DummyHealthCheckTest {
> @Test
> public void DummyCheckTest() {
> boolean t = true;
> assertTrue(t);
> }
> }
> {code}
> h3. Command to fire the error message:
> {code:sh}
> mvn -U clean package
> {code}
> h3. Exception Message
> {code:sh}
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by 
> com.google.inject.internal.cglib.core.$ReflectUtils$1 
> (file:/usr/share/maven/lib/guice.jar) to method 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
> WARNING: Please consider reporting this to the maintainers of 
> com.google.inject.internal.cglib.core.$ReflectUtils$1
> WARNING: Use --illegal-access=warn to enable warnings of further illegal 
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> [INFO] Scanning for projects...
> [INFO]
>  
> [INFO] 
> 
> [INFO] Building mock-bug 0.123
> [INFO] 
> 
> [INFO] 
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
> [INFO] 
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
> mock-bug ---
> [WARNING] Using platform encoding (UTF-8 actually) to copy filtered 
> resources, i.e. build is platform dependent!
> [INFO] skip non existing resourceDirectory 
> /home/me/projects/mock-bug/src/main/resources
> [INFO] 
> [INFO] --- 

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-3693:
---
Fix Version/s: (was: 2.1.0)

> On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws 
> Invalid value for MonthOfYear
> 
>
> Key: BEAM-3693
> URL: https://issues.apache.org/jira/browse/BEAM-3693
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Affects Versions: 2.2.0
> Environment: Apache Maven 3.3.9
> Maven home: /usr/share/maven
> Java version: 9.0.4, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-9-oracle
> Default locale: en_NZ, platform encoding: UTF-8
> OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
> "unix"
> java version "1.8.0_161"
> Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
> Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:Ubuntu 16.04.3 LTS
> Release:16.04
> Codename:   xenial
>Reporter: Thiago Henrique Ramos da Mata
>Assignee: Kenneth Knowles
>Priority: Major
>
> The problem is caused by the package beam-sdks-java-core in the version 
> 2.2.0. This package was being loaded into the package my.projects.models.
> Here is the isolated program that replicates the same error message:
>  
> h3. /pom.xml
> {code:xml}
>  
> http://maven.apache.org/POM/4.0.0; 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
>  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd;>
> 4.0.0
> com.mock.bug
> mock-bug
> 0.123
> 
> 
> junit
> junit
> 4.12
> test
> 
> 
> org.apache.beam
> beam-sdks-java-core
> 2.2.0
> 
> 
> 
> 
> 
> maven-compiler-plugin
> 3.5
> 
> 1.8
> 1.8
> 
> 
> 
> org.apache.maven.plugins
> maven-source-plugin
> 3.0.1
> 
> 
> attach-sources
> 
> jar
> 
> 
> 
> 
> 
> 
> 
> {code}
>
> h3. ./src/main/java/com/mock/bug/Main.java
> {code:java}
>  package com.mock.bug;
> public class Main {
> public void main(String[] input){
> System.out.println("hello workd");
> }
> }
> {code}
> h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java
> {code:java}
> package com.mock.bug;
> import org.junit.Test;
> import static org.junit.Assert.*;
> public class DummyHealthCheckTest {
> @Test
> public void DummyCheckTest() {
> boolean t = true;
> assertTrue(t);
> }
> }
> {code}
> h3. Command to fire the error message:
> {code:sh}
> mvn -U clean package
> {code}
> h3. Exception Message
> {code:sh}
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by 
> com.google.inject.internal.cglib.core.$ReflectUtils$1 
> (file:/usr/share/maven/lib/guice.jar) to method 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
> WARNING: Please consider reporting this to the maintainers of 
> com.google.inject.internal.cglib.core.$ReflectUtils$1
> WARNING: Use --illegal-access=warn to enable warnings of further illegal 
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> [INFO] Scanning for projects...
> [INFO]
>  
> [INFO] 
> 
> [INFO] Building mock-bug 0.123
> [INFO] 
> 
> [INFO] 
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
> [INFO] 
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
> mock-bug ---
> [WARNING] Using platform encoding (UTF-8 actually) to copy filtered 
> resources, i.e. build is platform dependent!
> [INFO] skip non existing resourceDirectory 
> /home/me/projects/mock-bug/src/main/resources
> [INFO] 

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-3693:
---
Component/s: (was: beam-model)
 build-system

> On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws 
> Invalid value for MonthOfYear
> 
>
> Key: BEAM-3693
> URL: https://issues.apache.org/jira/browse/BEAM-3693
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Affects Versions: 2.2.0
> Environment: Apache Maven 3.3.9
> Maven home: /usr/share/maven
> Java version: 9.0.4, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-9-oracle
> Default locale: en_NZ, platform encoding: UTF-8
> OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
> "unix"
> java version "1.8.0_161"
> Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
> Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:Ubuntu 16.04.3 LTS
> Release:16.04
> Codename:   xenial
>Reporter: Thiago Henrique Ramos da Mata
>Assignee: Kenneth Knowles
>Priority: Major
>
> The problem is caused by the package beam-sdks-java-core in the version 
> 2.2.0. This package was being loaded into the package my.projects.models.
> Here is the isolated program that replicates the same error message:
>  
> h3. /pom.xml
> {code:xml}
>  
> http://maven.apache.org/POM/4.0.0; 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
>  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd;>
> 4.0.0
> com.mock.bug
> mock-bug
> 0.123
> 
> 
> junit
> junit
> 4.12
> test
> 
> 
> org.apache.beam
> beam-sdks-java-core
> 2.2.0
> 
> 
> 
> 
> 
> maven-compiler-plugin
> 3.5
> 
> 1.8
> 1.8
> 
> 
> 
> org.apache.maven.plugins
> maven-source-plugin
> 3.0.1
> 
> 
> attach-sources
> 
> jar
> 
> 
> 
> 
> 
> 
> 
> {code}
>
> h3. ./src/main/java/com/mock/bug/Main.java
> {code:java}
>  package com.mock.bug;
> public class Main {
> public void main(String[] input){
> System.out.println("hello workd");
> }
> }
> {code}
> h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java
> {code:java}
> package com.mock.bug;
> import org.junit.Test;
> import static org.junit.Assert.*;
> public class DummyHealthCheckTest {
> @Test
> public void DummyCheckTest() {
> boolean t = true;
> assertTrue(t);
> }
> }
> {code}
> h3. Command to fire the error message:
> {code:sh}
> mvn -U clean package
> {code}
> h3. Exception Message
> {code:sh}
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by 
> com.google.inject.internal.cglib.core.$ReflectUtils$1 
> (file:/usr/share/maven/lib/guice.jar) to method 
> java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
> WARNING: Please consider reporting this to the maintainers of 
> com.google.inject.internal.cglib.core.$ReflectUtils$1
> WARNING: Use --illegal-access=warn to enable warnings of further illegal 
> reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> [INFO] Scanning for projects...
> [INFO]
>  
> [INFO] 
> 
> [INFO] Building mock-bug 0.123
> [INFO] 
> 
> [INFO] 
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
> [INFO] 
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
> mock-bug ---
> [WARNING] Using platform encoding (UTF-8 actually) to copy filtered 
> resources, i.e. build is platform dependent!
> [INFO] skip non existing resourceDirectory 
> 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #40

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 2.32 MB...]
hard linking apache_beam/io/gcp/internal/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal
hard linking apache_beam/io/gcp/internal/clients/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients
hard linking apache_beam/io/gcp/internal/clients/bigquery/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking 
apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_messages.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/storage/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_messages.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/tests/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/metrics/__init__.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.pxd -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metricbase.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/options/__init__.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/portability/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability
hard linking apache_beam/portability/api/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2_grpc.py 

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread Thiago Henrique Ramos da Mata (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thiago Henrique Ramos da Mata updated BEAM-3693:

Environment: 
{code:sh}
$ mvn version

Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 9.0.4, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-9-oracle
Default locale: en_NZ, platform encoding: UTF-8
OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
"unix"

$ java -version

java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)

$ lsb_release -a

No LSB modules are available.
Distributor ID: Ubuntu
Description:Ubuntu 16.04.3 LTS
Release:16.04
Codename:   xenial
{code}

  was:

$ mvn version

Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 9.0.4, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-9-oracle
Default locale: en_NZ, platform encoding: UTF-8
OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
"unix"

$ java -version

java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)

$ lsb_release -a

No LSB modules are available.
Distributor ID: Ubuntu
Description:Ubuntu 16.04.3 LTS
Release:16.04
Codename:   xenial


> On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws 
> Invalid value for MonthOfYear
> 
>
> Key: BEAM-3693
> URL: https://issues.apache.org/jira/browse/BEAM-3693
> Project: Beam
>  Issue Type: Bug
>  Components: beam-model
>Affects Versions: 2.2.0
> Environment: {code:sh}
> $ mvn version
> Apache Maven 3.3.9
> Maven home: /usr/share/maven
> Java version: 9.0.4, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-9-oracle
> Default locale: en_NZ, platform encoding: UTF-8
> OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
> "unix"
> $ java -version
> java version "1.8.0_161"
> Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
> Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
> $ lsb_release -a
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:Ubuntu 16.04.3 LTS
> Release:16.04
> Codename:   xenial
> {code}
>Reporter: Thiago Henrique Ramos da Mata
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.1.0
>
>
> The problem is caused by the package beam-sdks-java-core in the version 
> 2.2.0. This package was being loaded into the package my.projects.models.
> Here is the isolated program that replicates the same error message:
>  
> h3. /pom.xml
> {code:xml}
>  
> http://maven.apache.org/POM/4.0.0; 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
>  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd;>
> 4.0.0
> com.mock.bug
> mock-bug
> 0.123
> 
> 
> junit
> junit
> 4.12
> test
> 
> 
> org.apache.beam
> beam-sdks-java-core
> 2.2.0
> 
> 
> 
> 
> 
> maven-compiler-plugin
> 3.5
> 
> 1.8
> 1.8
> 
> 
> 
> org.apache.maven.plugins
> maven-source-plugin
> 3.0.1
> 
> 
> attach-sources
> 
> jar
> 
> 
> 
> 
> 
> 
> 
> {code}
>
> h3. ./src/main/java/com/mock/bug/Main.java
> {code:java}
>  package com.mock.bug;
> public class Main {
> public void main(String[] input){
> System.out.println("hello workd");
> }
> }
> {code}
> h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java
> {code:java}
> package com.mock.bug;
> import org.junit.Test;
> import static org.junit.Assert.*;
> public class DummyHealthCheckTest {
> @Test
> public void DummyCheckTest() {
> boolean t = true;
> assertTrue(t);
> }
> }
> {code}
> h3. 

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread Thiago Henrique Ramos da Mata (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thiago Henrique Ramos da Mata updated BEAM-3693:

Environment: 
Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 9.0.4, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-9-oracle
Default locale: en_NZ, platform encoding: UTF-8
OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
"unix"
java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
No LSB modules are available.
Distributor ID: Ubuntu
Description:Ubuntu 16.04.3 LTS
Release:16.04
Codename:   xenial

  was:
{code:sh}
$ mvn version

Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 9.0.4, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-9-oracle
Default locale: en_NZ, platform encoding: UTF-8
OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
"unix"

$ java -version

java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)

$ lsb_release -a

No LSB modules are available.
Distributor ID: Ubuntu
Description:Ubuntu 16.04.3 LTS
Release:16.04
Codename:   xenial
{code}


> On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws 
> Invalid value for MonthOfYear
> 
>
> Key: BEAM-3693
> URL: https://issues.apache.org/jira/browse/BEAM-3693
> Project: Beam
>  Issue Type: Bug
>  Components: beam-model
>Affects Versions: 2.2.0
> Environment: Apache Maven 3.3.9
> Maven home: /usr/share/maven
> Java version: 9.0.4, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-9-oracle
> Default locale: en_NZ, platform encoding: UTF-8
> OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
> "unix"
> java version "1.8.0_161"
> Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
> Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:Ubuntu 16.04.3 LTS
> Release:16.04
> Codename:   xenial
>Reporter: Thiago Henrique Ramos da Mata
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.1.0
>
>
> The problem is caused by the package beam-sdks-java-core in the version 
> 2.2.0. This package was being loaded into the package my.projects.models.
> Here is the isolated program that replicates the same error message:
>  
> h3. /pom.xml
> {code:xml}
>  
> http://maven.apache.org/POM/4.0.0; 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
>  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd;>
> 4.0.0
> com.mock.bug
> mock-bug
> 0.123
> 
> 
> junit
> junit
> 4.12
> test
> 
> 
> org.apache.beam
> beam-sdks-java-core
> 2.2.0
> 
> 
> 
> 
> 
> maven-compiler-plugin
> 3.5
> 
> 1.8
> 1.8
> 
> 
> 
> org.apache.maven.plugins
> maven-source-plugin
> 3.0.1
> 
> 
> attach-sources
> 
> jar
> 
> 
> 
> 
> 
> 
> 
> {code}
>
> h3. ./src/main/java/com/mock/bug/Main.java
> {code:java}
>  package com.mock.bug;
> public class Main {
> public void main(String[] input){
> System.out.println("hello workd");
> }
> }
> {code}
> h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java
> {code:java}
> package com.mock.bug;
> import org.junit.Test;
> import static org.junit.Assert.*;
> public class DummyHealthCheckTest {
> @Test
> public void DummyCheckTest() {
> boolean t = true;
> assertTrue(t);
> }
> }
> {code}
> h3. Command to fire the error message:
> {code:sh}
> mvn -U clean package
> {code}
> h3. Exception Message
> {code:sh}
> WARNING: An illegal reflective 

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread Thiago Henrique Ramos da Mata (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thiago Henrique Ramos da Mata updated BEAM-3693:

Description: 
The problem is caused by the package beam-sdks-java-core in the version 2.2.0. 
This package was being loaded into the package my.projects.models.

Here is the isolated program that replicates the same error message:

 
h3. /pom.xml

{code:xml}
 
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mock.bug</groupId>
    <artifactId>mock-bug</artifactId>
    <version>0.123</version>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.beam</groupId>
            <artifactId>beam-sdks-java-core</artifactId>
            <version>2.2.0</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-source-plugin</artifactId>
                <version>3.0.1</version>
                <executions>
                    <execution>
                        <id>attach-sources</id>
                        <goals>
                            <goal>jar</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
{code}
   
h3. ./src/main/java/com/mock/bug/Main.java

{code:java}
package com.mock.bug;

public class Main {
    public static void main(String[] args) {
        System.out.println("hello world");
    }
}
{code}

h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java

{code:java}
package com.mock.bug;

import org.junit.Test;

import static org.junit.Assert.*;

public class DummyHealthCheckTest {
    @Test
    public void DummyCheckTest() {
        boolean t = true;
        assertTrue(t);
    }
}
{code}

h3. Command to fire the error message:

{code:sh}
mvn -U clean package
{code}

h3. Exception Message


{code:sh}
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by 
com.google.inject.internal.cglib.core.$ReflectUtils$1 
(file:/usr/share/maven/lib/guice.jar) to method 
java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of 
com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal 
reflective access operations
WARNING: All illegal access operations will be denied in a future release
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building mock-bug 0.123
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ mock-bug 
---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, 
i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory 
/home/me/projects/mock-bug/src/main/resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.5:compile (default-compile) @ mock-bug ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. 
build is platform dependent!
[INFO] Compiling 1 source file to /home/me/projects/mock-bug/target/classes
An exception has occurred in the compiler (9.0.4). Please file a bug against 
the Java compiler via the Java bug reporting page (http://bugreport.java.com) 
after checking the Bug Database (http://bugs.java.com) for duplicates. Include 
your program and the following diagnostic in your report. Thank you.
java.time.DateTimeException: Invalid value for MonthOfYear (valid values 1 - 
12): 0
at 
java.base/java.time.temporal.ValueRange.checkValidValue(ValueRange.java:311)
at 
java.base/java.time.temporal.ChronoField.checkValidValue(ChronoField.java:714)
at java.base/java.time.LocalDate.of(LocalDate.java:269)
at java.base/java.time.LocalDateTime.of(LocalDateTime.java:336)
at jdk.zipfs/jdk.nio.zipfs.ZipUtils.dosToJavaTime(ZipUtils.java:109)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem$Entry.cen(ZipFileSystem.java:1950)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem$Entry.readCEN(ZipFileSystem.java:1937)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem.getEntry(ZipFileSystem.java:1324)
at 
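
The root of the stack trace above is the DOS-to-java.time conversion that jdk.nio.zipfs performs while the JDK 9 compiler scans jar entries: an all-zero DOS timestamp decodes to month 0, which java.time rejects. A self-contained illustration of that failure mode (hypothetical demo code, not taken from Beam or the JDK sources):

{code:java}
import java.time.LocalDateTime;

public class DosTimeDemo {
  // Simplified re-implementation of what jdk.nio.zipfs.ZipUtils.dosToJavaTime does:
  // unpack a packed MS-DOS date/time field into a LocalDateTime.
  static LocalDateTime dosToJavaTime(long dtime) {
    return LocalDateTime.of(
        (int) (((dtime >> 25) & 0x7f) + 1980), // year
        (int) ((dtime >> 21) & 0x0f),          // month -- 0 when the field is all zeros
        (int) ((dtime >> 16) & 0x1f),          // day
        (int) ((dtime >> 11) & 0x1f),          // hour
        (int) ((dtime >> 5) & 0x3f),           // minute
        (int) ((dtime << 1) & 0x3e));          // second
  }

  public static void main(String[] args) {
    // Throws java.time.DateTimeException: Invalid value for MonthOfYear
    // (valid values 1 - 12): 0 -- the same message reported in this issue.
    dosToJavaTime(0L);
  }
}
{code}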

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread Thiago Henrique Ramos da Mata (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thiago Henrique Ramos da Mata updated BEAM-3693:

Description: 
The problem is caused by the package beam-sdks-java-core in the version 2.2.0. 
This package was being loaded into the package my.projects.models.

Here is the isolated program that replicates the same error message:

 
h3. /pom.xml

{code:xml}
 
http://maven.apache.org/POM/4.0.0; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd;>
4.0.0
com.mock.bug
mock-bug
0.123


junit
junit
4.12
test


org.apache.beam
beam-sdks-java-core
2.2.0





maven-compiler-plugin
3.5

1.8
1.8



org.apache.maven.plugins
maven-source-plugin
3.0.1


attach-sources

jar







{code}
   
h3. ./src/main/java/com/mock/bug/Main.java

{code:java}
 package com.mock.bug;

public class Main {
public void main(String[] input){
System.out.println("hello workd");
}
}
{code}

h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java

{code:java}
package com.mock.bug;

import org.junit.Test;

import static org.junit.Assert.*;

public class DummyHealthCheckTest {
@Test
public void DummyCheckTest() {
boolean t = true;
assertTrue(t);
}
}
{code}

h3. Command to fire the error message:

mvn -U clean package

h3. Exception Message


{code:sh}
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by 
com.google.inject.internal.cglib.core.$ReflectUtils$1 
(file:/usr/share/maven/lib/guice.jar) to method 
java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of 
com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal 
reflective access operations
WARNING: All illegal access operations will be denied in a future release
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building mock-bug 0.123
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ mock-bug 
---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, 
i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory 
/home/me/projects/mock-bug/src/main/resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.5:compile (default-compile) @ mock-bug ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. 
build is platform dependent!
[INFO] Compiling 1 source file to /home/me/projects/mock-bug/target/classes
An exception has occurred in the compiler (9.0.4). Please file a bug against 
the Java compiler via the Java bug reporting page (http://bugreport.java.com) 
after checking the Bug Database (http://bugs.java.com) for duplicates. Include 
your program and the following diagnostic in your report. Thank you.
java.time.DateTimeException: Invalid value for MonthOfYear (valid values 1 - 
12): 0
at 
java.base/java.time.temporal.ValueRange.checkValidValue(ValueRange.java:311)
at 
java.base/java.time.temporal.ChronoField.checkValidValue(ChronoField.java:714)
at java.base/java.time.LocalDate.of(LocalDate.java:269)
at java.base/java.time.LocalDateTime.of(LocalDateTime.java:336)
at jdk.zipfs/jdk.nio.zipfs.ZipUtils.dosToJavaTime(ZipUtils.java:109)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem$Entry.cen(ZipFileSystem.java:1950)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem$Entry.readCEN(ZipFileSystem.java:1937)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem.getEntry(ZipFileSystem.java:1324)
at 

[jira] [Updated] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread Thiago Henrique Ramos da Mata (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thiago Henrique Ramos da Mata updated BEAM-3693:

Description: 
The problem is caused by the package beam-sdks-java-core in the version 2.2.0. 
This package was being loaded into the package my.projects.models.

Here is the isolated program that replicates the same error message:

 
h3. /pom.xml

{code:xml}
 
http://maven.apache.org/POM/4.0.0; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd;>
4.0.0
com.mock.bug
mock-bug
0.123


junit
junit
4.12
test


org.apache.beam
beam-sdks-java-core
2.2.0





maven-compiler-plugin
3.5

1.8
1.8



org.apache.maven.plugins
maven-source-plugin
3.0.1


attach-sources

jar







{code}
   
h3. ./src/main/java/com/mock/bug/Main.java

{code:java}
 package com.mock.bug;

public class Main {
public void main(String[] input){
System.out.println("hello workd");
}
}
{code}

h3. ./src/test/java/com/mock/bug/DummyHealthCheckTest.java

{code:java}
package com.mock.bug;

import org.junit.Test;

import static org.junit.Assert.*;

public class DummyHealthCheckTest {
@Test
public void DummyCheckTest() {
boolean t = true;
assertTrue(t);
}
}
{code}

h3. Command to fire the error message:

mvn -U clean package

h3. Exception Message


{code:shell}
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by 
com.google.inject.internal.cglib.core.$ReflectUtils$1 
(file:/usr/share/maven/lib/guice.jar) to method 
java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of 
com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal 
reflective access operations
WARNING: All illegal access operations will be denied in a future release
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building mock-bug 0.123
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ mock-bug 
---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, 
i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory 
/home/me/projects/mock-bug/src/main/resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.5:compile (default-compile) @ mock-bug ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. 
build is platform dependent!
[INFO] Compiling 1 source file to /home/me/projects/mock-bug/target/classes
An exception has occurred in the compiler (9.0.4). Please file a bug against 
the Java compiler via the Java bug reporting page (http://bugreport.java.com) 
after checking the Bug Database (http://bugs.java.com) for duplicates. Include 
your program and the following diagnostic in your report. Thank you.
java.time.DateTimeException: Invalid value for MonthOfYear (valid values 1 - 
12): 0
at 
java.base/java.time.temporal.ValueRange.checkValidValue(ValueRange.java:311)
at 
java.base/java.time.temporal.ChronoField.checkValidValue(ChronoField.java:714)
at java.base/java.time.LocalDate.of(LocalDate.java:269)
at java.base/java.time.LocalDateTime.of(LocalDateTime.java:336)
at jdk.zipfs/jdk.nio.zipfs.ZipUtils.dosToJavaTime(ZipUtils.java:109)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem$Entry.cen(ZipFileSystem.java:1950)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem$Entry.readCEN(ZipFileSystem.java:1937)
at 
jdk.zipfs/jdk.nio.zipfs.ZipFileSystem.getEntry(ZipFileSystem.java:1324)
at 

[jira] [Created] (BEAM-3694) [SQL] Update SQL documentation

2018-02-12 Thread Anton Kedin (JIRA)
Anton Kedin created BEAM-3694:
-

 Summary: [SQL] Update SQL documentation
 Key: BEAM-3694
 URL: https://issues.apache.org/jira/browse/BEAM-3694
 Project: Beam
  Issue Type: Improvement
  Components: dsl-sql
Reporter: Anton Kedin


Update / add:
 * windowing inheritance;
 * HOP parameters swap;
 * datetime functions;
 * joins behavior;



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3693) On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 2.2.0 throws Invalid value for MonthOfYear

2018-02-12 Thread Thiago Henrique Ramos da Mata (JIRA)
Thiago Henrique Ramos da Mata created BEAM-3693:
---

 Summary: On ubuntu 16 and Java 1.8 the package beam-sdks-java-core 
2.2.0 throws Invalid value for MonthOfYear
 Key: BEAM-3693
 URL: https://issues.apache.org/jira/browse/BEAM-3693
 Project: Beam
  Issue Type: Bug
  Components: beam-model
Affects Versions: 2.2.0
 Environment: 
$ mvn version

Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 9.0.4, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-9-oracle
Default locale: en_NZ, platform encoding: UTF-8
OS name: "linux", version: "4.13.0-32-generic", arch: "amd64", family: 
"unix"

$ java -version

java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)

$ lsb_release -a

No LSB modules are available.
Distributor ID: Ubuntu
Description:Ubuntu 16.04.3 LTS
Release:16.04
Codename:   xenial
Reporter: Thiago Henrique Ramos da Mata
Assignee: Kenneth Knowles
 Fix For: 2.1.0


The problem is caused by the package beam-sdks-java-core in the version 2.2.0. 
This package was being loaded into the package my.projects.models.

Here is the isolated program that replicates the same error message:

 
## ./pom.xml

 
http://maven.apache.org/POM/4.0.0; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance;
 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd;>
4.0.0
com.mock.bug
mock-bug
0.123


junit
junit
4.12
test


org.apache.beam
beam-sdks-java-core
2.2.0





maven-compiler-plugin
3.5

1.8
1.8



org.apache.maven.plugins
maven-source-plugin
3.0.1


attach-sources

jar







   
## ./src/main/java/com/mock/bug/Main.java

 package com.mock.bug;

public class Main {
public void main(String[] input){
System.out.println("hello workd");
}
}

## ./src/test/java/com/mock/bug/DummyHealthCheckTest.java

package com.mock.bug;

import org.junit.Test;

import static org.junit.Assert.*;

public class DummyHealthCheckTest {
@Test
public void DummyCheckTest() {
boolean t = true;
assertTrue(t);
}
}

## Command to fire the error message:

mvn -U clean package

## Exception Message
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by 
com.google.inject.internal.cglib.core.$ReflectUtils$1 
(file:/usr/share/maven/lib/guice.jar) to method 
java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of 
com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal 
reflective access operations
WARNING: All illegal access operations will be denied in a future release
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building mock-bug 0.123
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ mock-bug ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ mock-bug 
---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, 
i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory 
/home/me/projects/mock-bug/src/main/resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.5:compile (default-compile) @ mock-bug ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. 
build is platform dependent!
[INFO] Compiling 1 source file to /home/me/projects/mock-bug/target/classes
An exception has occurred in the compiler (9.0.4). Please file a bug against 
the Java compiler via the Java bug reporting page (http://bugreport.java.com) 
after checking the 

[jira] [Resolved] (BEAM-3630) Have Dataflow use the windowing strategy within the PCollectionToView/CollectionToSingleton expansion

2018-02-12 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3630?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik resolved BEAM-3630.
-
   Resolution: Fixed
Fix Version/s: Not applicable

> Have Dataflow use the windowing strategy within the 
> PCollectionToView/CollectionToSingleton expansion
> -
>
> Key: BEAM-3630
> URL: https://issues.apache.org/jira/browse/BEAM-3630
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-dataflow
>Reporter: Luke Cwik
>Assignee: Luke Cwik
>Priority: Major
>  Labels: portability
> Fix For: Not applicable
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #39

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 2.32 MB...]
hard linking apache_beam/io/gcp/internal/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal
hard linking apache_beam/io/gcp/internal/clients/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients
hard linking apache_beam/io/gcp/internal/clients/bigquery/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking 
apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_messages.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/storage/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_messages.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/tests/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/metrics/__init__.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.pxd -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metricbase.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/options/__init__.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/portability/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability
hard linking apache_beam/portability/api/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2_grpc.py 

[jira] [Closed] (BEAM-3205) Publicly document known coder wire formats and their URNs

2018-02-12 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3205?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw closed BEAM-3205.
-
   Resolution: Fixed
Fix Version/s: 2.4.0

> Publicly document known coder wire formats and their URNs
> -
>
> Key: BEAM-3205
> URL: https://issues.apache.org/jira/browse/BEAM-3205
> Project: Beam
>  Issue Type: Improvement
>  Components: beam-model
>Reporter: Kenneth Knowles
>Assignee: Robert Bradshaw
>Priority: Major
>  Labels: portability
> Fix For: 2.4.0
>
>
> Overarching issue: We need to get our Google Docs, markdown, and email 
> threads that sketch the Beam model as it is developed into a centralized 
> place with clear information architecture / navigation, and draw the line 
> that "if it isn't reachable from here in an obvious way it isn't the spec". 
> [1]
> Specific issue: Which coders are required for a runner and SDK to understand? 
> Which coders are otherwise considered standardized? What is the abstract 
> specification for their wire format?
> Today we have 
> https://github.com/apache/beam/blob/master/model/fn-execution/src/test/resources/org/apache/beam/model/fnexecution/v1/standard_coders.yaml
>  which is the beginning of a compliance test suite for standardized coders.
> This would really benefit from:
>  - narrative descriptions of the formats, including _abstract_ specification 
> (not examples) and perhaps motivation
>  - specification of which are required and which are merely "well known"
>  - ties into BEAM-3203 in terms of which coders are required to decode to 
> compatible value in every SDK
>  - once we have an abstract spec and some examples, and one language has 
> robust coders that pass the examples, we could turn it around and treat that 
> implementation as a reference impl for fuzz testing
> Any sort of fancy hacking that blends the tests with the narrative is fine, 
> though mostly I think they'll end up covering disjoint topics.
> [1] I filed BEAM-2567 and BEAM-2568 and ported 
> https://beam.apache.org/contribute/runner-guide/, and [~herohde] put together 
> https://beam.apache.org/contribute/portability/ and 
> https://github.com/apache/beam/blob/master/sdks/CONTAINERS.md



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
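
As a toy example of the "abstract spec plus concrete bytes" pairing this issue asks for (a sketch, not the authoritative spec): the well-known variable-length integer coder writes each value as a base-128 varint, so 1000 encodes to the two bytes 0xE8 0x07.

{code:java}
import java.io.ByteArrayOutputStream;
import org.apache.beam.sdk.coders.VarIntCoder;

public class VarIntWireFormatDemo {
  public static void main(String[] args) throws Exception {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    // VarIntCoder is one of the coders exercised by standard_coders.yaml.
    VarIntCoder.of().encode(1000, out);
    for (byte b : out.toByteArray()) {
      System.out.printf("%02x ", b); // prints: e8 07
    }
  }
}
{code}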


[jira] [Resolved] (BEAM-3074) Propagate pipeline protos through Dataflow API from Python

2018-02-12 Thread Robert Bradshaw (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3074?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Bradshaw resolved BEAM-3074.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Propagate pipeline protos through Dataflow API from Python
> --
>
> Key: BEAM-3074
> URL: https://issues.apache.org/jira/browse/BEAM-3074
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Kenneth Knowles
>Assignee: Robert Bradshaw
>Priority: Major
>  Labels: portability
> Fix For: 2.4.0
>
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #38

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 2.32 MB...]
hard linking apache_beam/io/gcp/internal/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal
hard linking apache_beam/io/gcp/internal/clients/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients
hard linking apache_beam/io/gcp/internal/clients/bigquery/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking 
apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_messages.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/storage/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_messages.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/tests/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/metrics/__init__.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.pxd -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metricbase.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/options/__init__.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/portability/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability
hard linking apache_beam/portability/api/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2_grpc.py 

[jira] [Commented] (BEAM-3280) There are no examples or tests about how to use typehints with TaggedOutput

2018-02-12 Thread Ahmet Altay (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361424#comment-16361424
 ] 

Ahmet Altay commented on BEAM-3280:
---

Related discussion from the dev@ list: 
[https://lists.apache.org/thread.html/c0a46816c2d0861e682b12e7626f38106d598d77c9d4a6c30e4c8a37@%3Cdev.beam.apache.org%3E]

Support for typehints for tagged outputs needs to be implemented before adding examples.

> There are no examples or tests about how to use typehints with TaggedOutput
> ---
>
> Key: BEAM-3280
> URL: https://issues.apache.org/jira/browse/BEAM-3280
> Project: Beam
>  Issue Type: Bug
>  Components: examples-python
>Reporter: Daniel Ho
>Assignee: Norio Akagi
>Priority: Minor
>
> There are no examples or tests about how to use typehints with TaggedOutput.
> (what do typehints look like when there are multiple outputs?)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #37

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 2.32 MB...]
hard linking apache_beam/io/gcp/internal/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal
hard linking apache_beam/io/gcp/internal/clients/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients
hard linking apache_beam/io/gcp/internal/clients/bigquery/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking 
apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_messages.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/bigquery
hard linking apache_beam/io/gcp/internal/clients/storage/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/internal/clients/storage/storage_v1_messages.py 
-> apache-beam-2.3.0/apache_beam/io/gcp/internal/clients/storage
hard linking apache_beam/io/gcp/tests/__init__.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/bigquery_matcher_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/io/gcp/tests/utils_test.py -> 
apache-beam-2.3.0/apache_beam/io/gcp/tests
hard linking apache_beam/metrics/__init__.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/cells_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.pxd -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/execution_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metric_test.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/metrics/metricbase.py -> 
apache-beam-2.3.0/apache_beam/metrics
hard linking apache_beam/options/__init__.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/pipeline_options_validator_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/options/value_provider_test.py -> 
apache-beam-2.3.0/apache_beam/options
hard linking apache_beam/portability/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability
hard linking apache_beam/portability/api/__init__.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_artifact_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_fn_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_job_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_provision_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/beam_runner_api_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/endpoints_pb2_grpc.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2.py -> 
apache-beam-2.3.0/apache_beam/portability/api
hard linking apache_beam/portability/api/standard_window_fns_pb2_grpc.py 

[jira] [Commented] (BEAM-3684) Update well-known coder URNs in Go SDK

2018-02-12 Thread Henning Rohde (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3684?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361340#comment-16361340
 ] 

Henning Rohde commented on BEAM-3684:
-

FYI [~wcn3] [~lostluck]

> Update well-known coder URNs in Go SDK
> --
>
> Key: BEAM-3684
> URL: https://issues.apache.org/jira/browse/BEAM-3684
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Eugene Kirpichov
>Assignee: Henning Rohde
>Priority: Major
>  Labels: portability
>
> [https://github.com/apache/beam/blob/master/runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/ModelCoderRegistrar.java]
>  has recently changed the URNs of well-known coders. The Go SDK needs to 
> update accordingly 
> https://github.com/apache/beam/blob/go-sdk/sdks/go/pkg/beam/core/runtime/graphx/coder.go#L34



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3126) Portable flattens in Python SDK Harness

2018-02-12 Thread Daniel Oliveira (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Oliveira resolved BEAM-3126.
---
   Resolution: Implemented
Fix Version/s: 2.4.0

Should be working now.

> Portable flattens in Python SDK Harness
> ---
>
> Key: BEAM-3126
> URL: https://issues.apache.org/jira/browse/BEAM-3126
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-harness
>Reporter: Daniel Oliveira
>Assignee: Daniel Oliveira
>Priority: Major
>  Labels: portability
> Fix For: 2.4.0
>
>
> Get flattens working for portability in Python SDK Harness.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3423) Distinct.withRepresentativeValueFn throws CoderException "cannot encode null KV"

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3423?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3423:
--
Affects Version/s: 2.3.0

> Distinct.withRepresentativeValueFn throws CoderException "cannot encode null 
> KV" 
> -
>
> Key: BEAM-3423
> URL: https://issues.apache.org/jira/browse/BEAM-3423
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0, 2.3.0
> Environment: ubuntu16.04, idea, java8
>Reporter: huangjianhuang
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.4.0
>
>
> My code is as follows:
> {code:java}
> pipeline
>     // Read data
>     .apply("Read from kafka",
>         KafkaIO.<String, String>read()
>             .withBootstrapServers("localhost:9092")
>             .withTopic(topic)
>             .withKeyDeserializer(StringDeserializer.class)
>             .withValueDeserializer(StringDeserializer.class)
>             .withoutMetadata()
>     )
>     .apply(Window.<KV<String, String>>into(FixedWindows.of(Duration.standardSeconds(10)))
>         .triggering(AfterWatermark.pastEndOfWindow()
>             .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane()
>                 .plusDelayOf(Duration.standardSeconds(5))))
>         .discardingFiredPanes().withAllowedLateness(Duration.ZERO))
>     // works fine
>     //.apply(Distinct.create())
>     // oops! -> CoderException: cannot encode a null KV
>     .apply(Distinct.withRepresentativeValueFn(new Val())
>         .withRepresentativeType(TypeDescriptors.strings()))
>     .apply(MapElements.into(TypeDescriptors.nulls())
>         .via(input -> {
>             System.out.println(Instant.now());
>             System.out.println(input);
>             return null;
>         }));
> 
> private static class Val implements SerializableFunction<KV<String, String>, String> {
>     @Override
>     public String apply(KV<String, String> input) {
>         return input.getValue();
>     }
> }
> {code}
> Input words to Kafka:
> word1
> //after 10s
> word2
> Then got exceptions as follows:
> {code:java}
> begin
> 2018-01-06T11:18:52.971Z
> KV{null, a}
> Exception in thread "main" 
> org.apache.beam.sdk.Pipeline$PipelineExecutionException: 
> java.lang.RuntimeException: org.apache.beam.sdk.coders.CoderException: cannot 
> encode a null KV
>   at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:344)
>   at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:314)
>   at 
> org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:208)
>   at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:62)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:303)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:289)
>   at com.xiaomi.huyu.processor.dev.EntryPoint.main(EntryPoint.java:37)
> Caused by: java.lang.RuntimeException: 
> org.apache.beam.sdk.coders.CoderException: cannot encode a null KV
>   at 
> org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.add(ImmutabilityCheckingBundleFactory.java:113)
>   at 
> org.apache.beam.runners.direct.ParDoEvaluator$BundleOutputManager.output(ParDoEvaluator.java:235)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
>   at 
> org.apache.beam.sdk.transforms.Combine$GroupedValues$1.processElement(Combine.java:2149)
> Caused by: org.apache.beam.sdk.coders.CoderException: cannot encode a null KV
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:70)
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:36)
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:73)
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:36)
>   at 
> org.apache.beam.sdk.util.CoderUtils.encodeToSafeStream(CoderUtils.java:93)

Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5947

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361329#comment-16361329
 ] 

Anton Kedin commented on BEAM-3647:
---

It does look like a good use case for schema aware PCollections. Some SQL work 
is tracked in BEAM-3157

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Major
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3157) BeamSql transform should support other PCollection types

2018-02-12 Thread Anton Kedin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361319#comment-16361319
 ] 

Anton Kedin commented on BEAM-3157:
---

Yes, this should be open until we have the code generation piece wired up with 
user-friendly APIs. Specifics of the API would be defined by the schema-aware 
PCollections design that [~reuvenlax] is working on.

I have put together a functional prototype of how RowType generation can be 
wired up to SQL: 
[https://github.com/apache/beam/pull/4649/commits/e12d54725ab092c260a7084f50012e4fe3d7e81b#diff-30ffc29d9ac0817e4e88622a81da8ec3R58]

I will be gathering feedback and waiting for further work on the schema-aware 
PCollections before submitting this.

> BeamSql transform should support other PCollection types
> 
>
> Key: BEAM-3157
> URL: https://issues.apache.org/jira/browse/BEAM-3157
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Ismaël Mejía
>Assignee: Anton Kedin
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> Currently the Beam SQL transform only supports input and output data 
> represented as a BeamRecord. This seems to me like a usability limitation 
> (even if we can do a ParDo to prepare objects before and after the transform).
> I suppose this constraint comes from the fact that we need to map 
> name/type/value from an object field into Calcite, so it is convenient to have 
> a specific data type (BeamRecord) for this. However, we can accomplish the 
> same by using a PCollection of JavaBeans (where we know the same information 
> via the field names/types/values) or by using Avro records, where we also have 
> the Schema information. For the output PCollection we can map the object via 
> a reference (e.g. a JavaBean to be filled with the names of an Avro object).
> Note: for the moment I am assuming simple mappings, since the SQL does not 
> support composite types yet.
> A simple API idea would be something like this:
> A simple filter:
> PCollection<MyPojo> col = BeamSql.query("SELECT * FROM  WHERE 
> ...").from(MyPojo.class);
> A projection:
> PCollection<MyNewPojo> newCol = BeamSql.query("SELECT id, 
> name").from(MyPojo.class).as(MyNewPojo.class);
> A first approach could be to just add the extra ParDos + transform DoFns; 
> however, I suppose that for memory use reasons mapping directly into 
> Calcite would make sense.
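
To make the shape of the proposal concrete, here is a rough, purely hypothetical sketch of what a caller might write. The .from()/.as() methods do not exist in the SDK today; MyPojo, MyNewPojo, the score field, the PCOLLECTION table name in the queries, and the upstream 'input' collection are all made-up placeholders.

{code:java}
// Hypothetical usage of the API proposed in this issue -- .from()/.as() are not
// part of the current BeamSql transform, and MyPojo/MyNewPojo are placeholder
// JavaBeans whose fields match the example queries. 'input' is assumed to be a
// PCollection<MyPojo> built earlier in the pipeline.
class MyPojo implements java.io.Serializable {
  public String id;
  public String name;
  public int score;
}

class MyNewPojo implements java.io.Serializable {
  public String id;
  public String name;
}

// A simple filter that keeps the element type:
PCollection<MyPojo> filtered =
    input.apply(BeamSql.query("SELECT * FROM PCOLLECTION WHERE score > 10")
        .from(MyPojo.class));

// A projection into a different JavaBean:
PCollection<MyNewPojo> projected =
    input.apply(BeamSql.query("SELECT id, name FROM PCOLLECTION")
        .from(MyPojo.class)
        .as(MyNewPojo.class));
{code}
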



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3272) ParDoTranslatorTest: Error creating local cluster while creating checkpoint file

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3272:
--
Labels: flake sickbay  (was: flake)

> ParDoTranslatorTest: Error creating local cluster while creating checkpoint 
> file
> 
>
> Key: BEAM-3272
> URL: https://issues.apache.org/jira/browse/BEAM-3272
> Project: Beam
>  Issue Type: Bug
>  Components: runner-apex
>Reporter: Eugene Kirpichov
>Priority: Critical
>  Labels: flake, sickbay
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Failed build: 
> https://builds.apache.org/job/beam_PostCommit_Java_MavenInstall/org.apache.beam$beam-runners-apex/5330/console
> Key output:
> {code}
> 2017-11-29T01:21:26.956 [ERROR] 
> testAssertionFailure(org.apache.beam.runners.apex.translation.ParDoTranslatorTest)
>   Time elapsed: 2.007 s  <<< ERROR!
> java.lang.RuntimeException: Error creating local cluster
>   at 
> org.apache.apex.engine.EmbeddedAppLauncherImpl.getController(EmbeddedAppLauncherImpl.java:122)
>   at 
> org.apache.apex.engine.EmbeddedAppLauncherImpl.launchApp(EmbeddedAppLauncherImpl.java:71)
>   at 
> org.apache.apex.engine.EmbeddedAppLauncherImpl.launchApp(EmbeddedAppLauncherImpl.java:46)
>   at org.apache.beam.runners.apex.ApexRunner.run(ApexRunner.java:197)
>   at 
> org.apache.beam.runners.apex.TestApexRunner.run(TestApexRunner.java:57)
>   at 
> org.apache.beam.runners.apex.TestApexRunner.run(TestApexRunner.java:31)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:304)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:290)
>   at 
> org.apache.beam.runners.apex.translation.ParDoTranslatorTest.runExpectingAssertionFailure(ParDoTranslatorTest.java:156)
> {code}
> ...
> {code}
> Caused by: ExitCodeException exitCode=1: chmod: cannot access 
> ‘/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_MavenInstall/src/runners/apex/target/com.datatorrent.stram.StramLocalCluster/checkpoints/2/_tmp’:
>  No such file or directory
>   at org.apache.hadoop.util.Shell.runCommand(Shell.java:582)
>   at org.apache.hadoop.util.Shell.run(Shell.java:479)
>   at 
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:773)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
>   at org.apache.hadoop.fs.FileSystem.primitiveCreate(FileSystem.java:1017)
>   at 
> org.apache.hadoop.fs.DelegateToFileSystem.createInternal(DelegateToFileSystem.java:99)
>   at 
> org.apache.hadoop.fs.ChecksumFs$ChecksumFSOutputSummer.<init>(ChecksumFs.java:352)
>   at org.apache.hadoop.fs.ChecksumFs.createInternal(ChecksumFs.java:399)
>   at 
> org.apache.hadoop.fs.AbstractFileSystem.create(AbstractFileSystem.java:584)
>   at org.apache.hadoop.fs.FileContext$3.next(FileContext.java:686)
>   at org.apache.hadoop.fs.FileContext$3.next(FileContext.java:682)
>   at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
>   at org.apache.hadoop.fs.FileContext.create(FileContext.java:688)
>   at 
> com.datatorrent.common.util.AsyncFSStorageAgent.copyToHDFS(AsyncFSStorageAgent.java:119)
>   ... 50 more
> {code}
> By inspecting code at the stack frames, it seems it's trying to copy an 
> operator's checkpoint "to HDFS" (which in this case is the local disk), but 
> fails while creating the target file of the copy - creation creates the file 
> (successfully) and chmods it writable (unsuccessfully). Barring something 
> subtle (e.g. chmod not being allowed to be called immediately after creating a 
> FileOutputStream), this looks like the whole directory was possibly deleted 
> from under the process. I don't know why this would be the case though, or 
> how to debug it.
> Either way, the path being accessed is funky: 
> /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_MavenInstall/src/runners/apex/target/...
>  - I think it'd be better if this test used a "@Rule TemporaryFolder" to 
> store Apex checkpoints. I don't know whether the Apex runner allows that, but 
> I can see how it could help reduce interference between 

Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #5946

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Resolved] (BEAM-2950) Provide implicit access to State

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2950?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-2950.
---
   Resolution: Won't Fix
Fix Version/s: Not applicable

> Provide implicit access to State
> 
>
> Key: BEAM-2950
> URL: https://issues.apache.org/jira/browse/BEAM-2950
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Eugene Kirpichov
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: Not applicable
>
>
> https://github.com/apache/beam/pull/3814 provides implicit access to side 
> inputs (without a ProcessContext). Luke suggests having the same for State 
> and, I suppose, timers. We could also have it for PipelineOptions: in any 
> given user code invocation, these are all unambiguous.
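
For context, this is how state is accessed explicitly today in the Java SDK, declared with @StateId and injected as a @ProcessElement parameter; the implicit access proposed here would remove the need for the extra parameter. The per-key counting logic is only an illustration.

{code:java}
// Current, explicit style of state access that the issue proposes to make
// implicit: the state cell is declared with @StateId and injected into
// @ProcessElement. The per-key counter is just an illustrative example.
import org.apache.beam.sdk.state.StateSpec;
import org.apache.beam.sdk.state.StateSpecs;
import org.apache.beam.sdk.state.ValueState;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

class CountPerKeyFn extends DoFn<KV<String, String>, String> {

  @StateId("count")
  private final StateSpec<ValueState<Integer>> countSpec = StateSpecs.value();

  @ProcessElement
  public void process(ProcessContext c,
      @StateId("count") ValueState<Integer> count) {
    int seenSoFar = count.read() == null ? 0 : count.read();
    count.write(seenSoFar + 1);
    c.output(c.element().getKey() + ":" + (seenSoFar + 1));
  }
}
{code}
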



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3647:
--
Component/s: (was: beam-model)

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Assignee: Kenneth Knowles
>Priority: Critical
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3647:
--
Priority: Major  (was: Critical)

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Major
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3647:
--
Component/s: beam-model

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Assignee: Kenneth Knowles
>Priority: Critical
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #36

2018-02-12 Thread Apache Jenkins Server
See 


--
[...truncated 450.81 KB...]
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/metrics/metric.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/metrics/metric_test.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/metrics/metricbase.py  
   creating: apache-beam-2.3.0/sdks/python/apache_beam/options/
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/options/__init__.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/options/pipeline_options.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/options/pipeline_options_test.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/options/pipeline_options_validator.py 
 
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/options/pipeline_options_validator_test.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/options/value_provider.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/options/value_provider_test.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/pipeline.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/pipeline_test.py  
   creating: apache-beam-2.3.0/sdks/python/apache_beam/portability/
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/portability/__init__.py  
   creating: apache-beam-2.3.0/sdks/python/apache_beam/portability/api/
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/portability/api/__init__.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/pvalue.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/pvalue_test.py  
   creating: apache-beam-2.3.0/sdks/python/apache_beam/runners/
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/runners/__init__.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/runners/common.pxd  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/runners/common.py  
  inflating: apache-beam-2.3.0/sdks/python/apache_beam/runners/common_test.py  
   creating: apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/__init__.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/dataflow_metrics.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/dataflow_metrics_test.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/dataflow_runner_test.py
  
   creating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/__init__.py 
 
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/apiclient_test.py
  
   creating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/__init__.py
  
   creating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/__init__.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_messages.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/dependency.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/dependency_test.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/internal/names.py  
   creating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/native_io/
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/native_io/__init__.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/native_io/iobase.py  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/native_io/iobase_test.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/native_io/streaming_create.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/template_runner_test.py
  
  inflating: 
apache-beam-2.3.0/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py
  
   creating: 

[jira] [Updated] (BEAM-3685) It should be an error to run a Pipeline without ever specifying options

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3685:
--
Labels: beginner newbie starter  (was: )

> It should be an error to run a Pipeline without ever specifying options
> ---
>
> Key: BEAM-3685
> URL: https://issues.apache.org/jira/browse/BEAM-3685
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Priority: Major
>  Labels: beginner, newbie, starter
>
> Doing so lets users run some pipelines without specifying any configuration, 
> which is dangerous.
>  
> At minimum, it should log a very obvious warning.
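
For reference, a minimal sketch of the two cases involved follows: a pipeline created without any options versus one created from parsed, validated options. The issue asks for the first form to at least warn loudly; the class and variable names below are only illustrative.

{code:java}
// The two forms discussed in this issue. Pipeline.create() with no arguments
// silently falls back to default options, which is the case the issue wants to
// flag; the second form parses and validates options from the command line.
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsExample {
  public static void main(String[] args) {
    // No options ever specified -- runs with implicit defaults.
    Pipeline implicitDefaults = Pipeline.create();

    // Options explicitly parsed and validated from args.
    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline explicit = Pipeline.create(options);
  }
}
{code}
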



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3601) Switch to Java 8 futures

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-3601.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Switch to Java 8 futures
> 
>
> Key: BEAM-3601
> URL: https://issues.apache.org/jira/browse/BEAM-3601
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-core
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Major
>  Labels: portability
> Fix For: 2.4.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3685) It should be an error to run a Pipeline without ever specifying options

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3685:
-

Assignee: (was: Kenneth Knowles)

> It should be an error to run a Pipeline without ever specifying options
> ---
>
> Key: BEAM-3685
> URL: https://issues.apache.org/jira/browse/BEAM-3685
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Priority: Major
>  Labels: beginner, newbie, starter
>
> Doing so lets users run some pipelines without specifying any configuration, 
> which is dangerous.
>  
> At minimum, it should log a very obvious warning.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3650) Deprecate and remove DoFnTester

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3650:
-

Assignee: (was: Kenneth Knowles)

> Deprecate and remove DoFnTester
> ---
>
> Key: BEAM-3650
> URL: https://issues.apache.org/jira/browse/BEAM-3650
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361303#comment-16361303
 ] 

Kenneth Knowles commented on BEAM-3647:
---

I may not understand the request here. It sounds like you want PCollection 
schemas associated with non-Row objects. That is currently under discussion.

But on the other hand, templates with ValueProvider will not work with SQL 
because a SQL statement has a customized expansion, and a template has a fixed 
expansion currently.
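
For readers unfamiliar with the pattern being discussed, a minimal sketch of a ValueProvider-based template option follows; the option name coderSchemaFile is invented for illustration, and, as noted above, this deferral does not extend to the SQL statement itself.

{code:java}
// Minimal sketch of deferring a value (here, a schema file location) to
// template run time via ValueProvider. "coderSchemaFile" is an invented
// option name used only for illustration.
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.ValueProvider;

public interface TemplateOptions extends PipelineOptions {
  @Description("GCS path of a JSON file describing the table's fields")
  ValueProvider<String> getCoderSchemaFile();

  void setCoderSchemaFile(ValueProvider<String> value);
}

// Inside a DoFn the value is only resolved at run time, e.g.
//   String path = options.getCoderSchemaFile().get();
// which is what lets one template serve different tables -- but a SQL
// statement cannot be deferred the same way, because its expansion is fixed
// when the template is built.
{code}
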

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Assignee: Anton Kedin
>Priority: Critical
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3423) Distinct.withRepresentativeValueFn throws CoderException "cannot encode null KV"

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3423?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-3423.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Distinct.withRepresentativeValueFn throws CoderException "cannot encode null 
> KV" 
> -
>
> Key: BEAM-3423
> URL: https://issues.apache.org/jira/browse/BEAM-3423
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0, 2.3.0
> Environment: ubuntu16.04, idea, java8
>Reporter: huangjianhuang
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.4.0
>
>
> My code is as follows:
> {code:java}
> pipeline
>     // Read data
>     .apply("Read from kafka",
>         KafkaIO.<String, String>read()
>             .withBootstrapServers("localhost:9092")
>             .withTopic(topic)
>             .withKeyDeserializer(StringDeserializer.class)
>             .withValueDeserializer(StringDeserializer.class)
>             .withoutMetadata()
>     )
>     .apply(Window.<KV<String, String>>into(FixedWindows.of(Duration.standardSeconds(10)))
>         .triggering(AfterWatermark.pastEndOfWindow()
>             .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane()
>                 .plusDelayOf(Duration.standardSeconds(5))))
>         .discardingFiredPanes().withAllowedLateness(Duration.ZERO))
>     // works fine
>     //.apply(Distinct.create())
>     // oops! -> CoderException: cannot encode a null KV
>     .apply(Distinct.withRepresentativeValueFn(new Val())
>         .withRepresentativeType(TypeDescriptors.strings()))
>     .apply(MapElements.into(TypeDescriptors.nulls())
>         .via(input -> {
>             System.out.println(Instant.now());
>             System.out.println(input);
>             return null;
>         }));
> 
> private static class Val implements SerializableFunction<KV<String, String>, String> {
>     @Override
>     public String apply(KV<String, String> input) {
>         return input.getValue();
>     }
> }
> {code}
> Input words to Kafka:
> word1
> //after 10s
> word2
> Then got exceptions as follows:
> {code:java}
> begin
> 2018-01-06T11:18:52.971Z
> KV{null, a}
> Exception in thread "main" 
> org.apache.beam.sdk.Pipeline$PipelineExecutionException: 
> java.lang.RuntimeException: org.apache.beam.sdk.coders.CoderException: cannot 
> encode a null KV
>   at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:344)
>   at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:314)
>   at 
> org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:208)
>   at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:62)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:303)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:289)
>   at com.xiaomi.huyu.processor.dev.EntryPoint.main(EntryPoint.java:37)
> Caused by: java.lang.RuntimeException: 
> org.apache.beam.sdk.coders.CoderException: cannot encode a null KV
>   at 
> org.apache.beam.runners.direct.ImmutabilityCheckingBundleFactory$ImmutabilityEnforcingBundle.add(ImmutabilityCheckingBundleFactory.java:113)
>   at 
> org.apache.beam.runners.direct.ParDoEvaluator$BundleOutputManager.output(ParDoEvaluator.java:235)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
>   at 
> org.apache.beam.runners.direct.repackaged.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
>   at 
> org.apache.beam.sdk.transforms.Combine$GroupedValues$1.processElement(Combine.java:2149)
> Caused by: org.apache.beam.sdk.coders.CoderException: cannot encode a null KV
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:70)
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:36)
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:73)
>   at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:36)
>   at 
> 

[jira] [Resolved] (BEAM-230) Remove WindowedValue#valueInEmptyWindows

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-230?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-230.
--
   Resolution: Fixed
Fix Version/s: (was: Not applicable)
   2.4.0

> Remove WindowedValue#valueInEmptyWindows
> 
>
> Key: BEAM-230
> URL: https://issues.apache.org/jira/browse/BEAM-230
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.4.0
>
>
> A WindowedValue in no windows does not exist, and can be dropped by a runner 
> at any time.
> We should also assert that any collection of windows is nonempty when 
> creating a new WindowedValue. If a user wants to drop an element, they should 
> explicitly filter it out rather than expecting it to be dropped by the runner.
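
A small sketch of the "explicitly filter it out" alternative mentioned above, using the standard Filter transform; the predicate and the assumed upstream PCollection<String> 'input' are only examples.

{code:java}
// Dropping unwanted elements explicitly with Filter rather than relying on a
// value in zero windows being discarded. 'input' is an assumed upstream
// PCollection<String>; the emptiness predicate is just an example.
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.values.PCollection;

PCollection<String> kept =
    input.apply("DropEmptyLines", Filter.by((String line) -> !line.isEmpty()));
{code}
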



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361304#comment-16361304
 ] 

Kenneth Knowles commented on BEAM-3647:
---

Pinging [~kedin] and [~reuvenlax] as this is a good example use case, if I 
understand correctly.

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Major
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3647:
-

Assignee: (was: Anton Kedin)

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Critical
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3647:
-

Assignee: Anton Kedin  (was: Kenneth Knowles)

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Assignee: Anton Kedin
>Priority: Critical
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #5944

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Updated] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-3647:
--
Labels:   (was: newbie)

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Assignee: Kenneth Knowles
>Priority: Critical
>
> *Requirement*: Run a template with the same logic on data from different 
> tables (an example is given below).
>  
> *Need*: Either a default coder that treats every field as a string when 
> reading the data, or a dynamic option to read the coder from a JSON file on 
> GCS (whose location we can pass with a ValueProvider) or from somewhere else, 
> and parse the data on that basis at runtime.
>  
>  
> *Examples*: I have two tables; table 1 has the columns (NAME, CLASS, ROLL, 
> SUB_PRICE) and table 2 has (NAME, ROLL, SUB, TEST_MARKS).
>  
> On both tables I am just sorting on the roll number, so if the coder could be 
> read at run time, the same template could be used for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-2950) Provide implicit access to State

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2950?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-2950:
--
Issue Type: New Feature  (was: Bug)

> Provide implicit access to State
> 
>
> Key: BEAM-2950
> URL: https://issues.apache.org/jira/browse/BEAM-2950
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-core
>Reporter: Eugene Kirpichov
>Assignee: Kenneth Knowles
>Priority: Major
>
> https://github.com/apache/beam/pull/3814 provides implicit access to side 
> inputs (without a ProcessContext). Luke suggests having the same for State 
> and, I suppose, timers. We could also have it for PipelineOptions: in any 
> given user code invocation, these are all unambiguous.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3627) Dataflow ValidatesRunner failing ViewTest

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-3627.
---
Resolution: Fixed

> Dataflow ValidatesRunner failing ViewTest
> -
>
> Key: BEAM-3627
> URL: https://issues.apache.org/jira/browse/BEAM-3627
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Blocker
> Fix For: 2.4.0
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3272) ParDoTranslatorTest: Error creating local cluster while creating checkpoint file

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3272:
-

Assignee: (was: Kenneth Knowles)

> ParDoTranslatorTest: Error creating local cluster while creating checkpoint 
> file
> 
>
> Key: BEAM-3272
> URL: https://issues.apache.org/jira/browse/BEAM-3272
> Project: Beam
>  Issue Type: Bug
>  Components: runner-apex
>Reporter: Eugene Kirpichov
>Priority: Critical
>  Labels: flake, sickbay
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> Failed build: 
> https://builds.apache.org/job/beam_PostCommit_Java_MavenInstall/org.apache.beam$beam-runners-apex/5330/console
> Key output:
> {code}
> 2017-11-29T01:21:26.956 [ERROR] 
> testAssertionFailure(org.apache.beam.runners.apex.translation.ParDoTranslatorTest)
>   Time elapsed: 2.007 s  <<< ERROR!
> java.lang.RuntimeException: Error creating local cluster
>   at 
> org.apache.apex.engine.EmbeddedAppLauncherImpl.getController(EmbeddedAppLauncherImpl.java:122)
>   at 
> org.apache.apex.engine.EmbeddedAppLauncherImpl.launchApp(EmbeddedAppLauncherImpl.java:71)
>   at 
> org.apache.apex.engine.EmbeddedAppLauncherImpl.launchApp(EmbeddedAppLauncherImpl.java:46)
>   at org.apache.beam.runners.apex.ApexRunner.run(ApexRunner.java:197)
>   at 
> org.apache.beam.runners.apex.TestApexRunner.run(TestApexRunner.java:57)
>   at 
> org.apache.beam.runners.apex.TestApexRunner.run(TestApexRunner.java:31)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:304)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:290)
>   at 
> org.apache.beam.runners.apex.translation.ParDoTranslatorTest.runExpectingAssertionFailure(ParDoTranslatorTest.java:156)
> {code}
> ...
> {code}
> Caused by: ExitCodeException exitCode=1: chmod: cannot access 
> ‘/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_MavenInstall/src/runners/apex/target/com.datatorrent.stram.StramLocalCluster/checkpoints/2/_tmp’:
>  No such file or directory
>   at org.apache.hadoop.util.Shell.runCommand(Shell.java:582)
>   at org.apache.hadoop.util.Shell.run(Shell.java:479)
>   at 
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:773)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
>   at 
> org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
>   at org.apache.hadoop.fs.FileSystem.primitiveCreate(FileSystem.java:1017)
>   at 
> org.apache.hadoop.fs.DelegateToFileSystem.createInternal(DelegateToFileSystem.java:99)
>   at 
> org.apache.hadoop.fs.ChecksumFs$ChecksumFSOutputSummer.<init>(ChecksumFs.java:352)
>   at org.apache.hadoop.fs.ChecksumFs.createInternal(ChecksumFs.java:399)
>   at 
> org.apache.hadoop.fs.AbstractFileSystem.create(AbstractFileSystem.java:584)
>   at org.apache.hadoop.fs.FileContext$3.next(FileContext.java:686)
>   at org.apache.hadoop.fs.FileContext$3.next(FileContext.java:682)
>   at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
>   at org.apache.hadoop.fs.FileContext.create(FileContext.java:688)
>   at 
> com.datatorrent.common.util.AsyncFSStorageAgent.copyToHDFS(AsyncFSStorageAgent.java:119)
>   ... 50 more
> {code}
> By inspecting code at the stack frames, it seems it's trying to copy an 
> operator's checkpoint "to HDFS" (which in this case is the local disk), but 
> fails while creating the target file of the copy - creation creates the file 
> (successfully) and chmods it writable (unsuccessfully). Barring something 
> subtle (e.g. chmod not being allowed to be called immediately after creating a 
> FileOutputStream), this looks like the whole directory was possibly deleted 
> from under the process. I don't know why this would be the case though, or 
> how to debug it.
> Either way, the path being accessed is funky: 
> /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_MavenInstall/src/runners/apex/target/...
>  - I think it'd be better if this test used a "@Rule TemporaryFolder" to 
> store Apex checkpoints. I don't know whether the Apex runner allows that, but 
> I can see how it could help reduce interference 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #25

2018-02-12 Thread Apache Jenkins Server
d dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nocapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
--runner=TestDataflowRunner \
--project=$PROJECT \
--worker_harness_container_image=$CONTAINER:$TAG \
--staging_location=$GCS_LOCATION/staging-validatesrunner-test \
--temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/container/local/lib/python2.7/site-packages/setuptools/dist.py>:355:
 UserWarning: Normalizing '2.4.0.dev' to '2.4.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>:166:
 DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok

--
Ran 1 test in 458.634s

OK

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180212-184356
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python@sha256:1dd593e836d21d7fcf0a2924e44ff0b727096aeb088564f4af1279a99f237163
Deleted: sha256:c5938ac3efdffa8ebf184d638c404cc4877012d69328ceec845b482dbb0e8a1c
Deleted: sha256:a4f6eec4da67af4d6eaf8296ca8e93f57688d1a2f6200d962236f2eaf51fbf59
Deleted: sha256:c63897b9540b8faa29de3bd3354acacb12716d18040a3a5689a7ba42508c78c1
Deleted: sha256:4485dc397f60dd47529503815e9c90c01a8e3501e91d3e0aa534fbc019695af8
Deleted: sha256:31e229c1d8a35fa969ecaf39a4de105b15929076e207837f6cf7f2fee51ed217
Deleted: sha256:6ec7453e8980ef3c9bce697049ea615b3ddc9dd6854dca89394106989865c2ac
Deleted: sha256:da68ea36fb5f96b5f38e0ac7b3c113641fa5b3143227bfca1527564b1a7665d1
Deleted: sha256:b59c4a03b5eef748f74a5f5aee9a98dd9115e8bdea8785ba025d2a28b81dd0e4
gcloud container images delete $CONTAINER:$TAG --quiet
Usage: gcloud container [optional flags] 
  group may be   clusters | node-pools | operations
  command may be get-server-config

Deploy and manage clusters of machines for running containers.

flags:
  Run `gcloud container --help`
  for the full list of available flags for this command.

global flags:
  Run `gcloud -h` for a description of flags available to all commands.

command groups:
  clusters   Deploy and teardown Google Container Engine clusters.
  node-pools Create and delete operations for Google Container
 Engine node pools.
  operations Get and list operations for Google Container Engine
 clusters.

commands:
  get-server-config  Get Container Engine server config.


For more detailed information on this command and its flags, run:
  gcloud container --help

ERROR: (gcloud.container) Invalid choice: 'images'.

Valid choices are [clusters, get-server-config, node-pools, operations].
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user aljoscha.kret...@gmail.com
Not sending mail to unregistered user eh...@google.com
Not sending mail to unregistered user z...@giggles.nyc.corp.google.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user xuming...@users.noreply.github.com
Not sending mail to unregistered user j...@nanthrax.net
Not sending mail to unregistered user pawel.pk.kaczmarc...@gmail.com
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aromanenko@gmail.com
Not sending mail to unregistered user joey.bar...@gmail.com
Not sending mail to unregistered user dariusz.aniszew...@polidea.com
Not sending mail to unregistered user mott...@gmail.com
Not sending mail to unregistered user ccla...@bluewin.ch
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregister

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #35

2018-02-12 Thread Apache Jenkins Server
See 


--
GitHub pull request #4665 of commit 5a5b86740ce88d8a0a25081a74b58f54051a, 
no merge conflicts.
Setting status of 5a5b86740ce88d8a0a25081a74b58f54051a to PENDING with url 
https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/35/ and message: 
'Build started sha1 is merged.'
Using context: Jenkins: ./gradlew :release:runQuickstartsPython
[EnvInject] - Loading node environment variables.
Building remotely on beam4 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/4665/*:refs/remotes/origin/pr/4665/*
 > git rev-parse refs/remotes/origin/pr/4665/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/4665/merge^{commit} # timeout=10
Checking out Revision ed019fd281929d86c8e4a7744e04772bb874a00d 
(refs/remotes/origin/pr/4665/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ed019fd281929d86c8e4a7744e04772bb874a00d
Commit message: "Merge 5a5b86740ce88d8a0a25081a74b58f54051a into 
fd5c891f1c7310b481153ab7f9f4316e8e63c1c8"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
 
-DghprbTriggerAuthorLogin=yifanzou -DghprbGhRepository=apache/beam 
-DghprbAuthorRepoGitUrl=https://github.com/yifanzou/beam.git 
-DghprbTriggerAuthor= -Dsnapshot_url= 
-DghprbPullLink=https://github.com/apache/beam/pull/4665 
-DghprbActualCommit=5a5b86740ce88d8a0a25081a74b58f54051a 
-DghprbPullAuthorLogin=yifanzou -DghprbSourceBranch=python-sdk-postrelease 
-DghprbTriggerAuthorLoginMention=@yifanzou 
"-DghprbPullLongDescription=DESCRIPTION 
HERE\r\n\r\n\r\n\r\nFollow this checklist to help us 
incorporate your contribution quickly and easily:\r\n\r\n - [ ] Make sure there 
is a [JIRA issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed 
for the change (usually before you start working on it).  Trivial changes like 
typos do not require a JIRA issue.  Your pull request should address just this 
issue, without pulling in other changes.\r\n - [ ] Format the pull request 
title like `[BEAM-XXX] Fixes bug in ApproximateQuantiles`, where you replace 
`BEAM-XXX` with the appropriate JIRA issue.\r\n - [ ] Write a pull request 
description that is detailed enough to understand:\r\n   - [ ] What the pull 
request does\r\n   - [ ] Why it does it\r\n   - [ ] How it does it\r\n   - [ ] 
Why this approach\r\n - [ ] Each commit in the pull request should have a 
meaningful subject line and body.\r\n - [ ] Run `mvn clean verify` to make sure 
basic checks pass. A more thorough check will be performed on your pull request 
automatically.\r\n - [ ] If this contribution is large, please file an Apache 
[Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.pdf).\r\n\r\n" 
-DghprbTriggerAuthorEmail= "-DghprbPullTitle=[BEAM-3339] Adding Quickstarts 
validation for python sdk" -DghprbActualCommitAuthorEmail= -Dsnapshot_version= 
-DghprbTargetBranch=master -DghprbPullId=4665 
-DghprbCredentialsId=c09f9fd9-1eef-4129-b5ac-047de7d90d04 
-DghprbPullAuthorEmail= -Dsha1=origin/pr/4665/merge 
-DghprbPullAuthorLoginMention=@yifanzou "-DghprbPullDescription=GitHub pull 
request #4665 of commit 5a5b86740ce88d8a0a25081a74b58f54051a, no merge 
conflicts." -DghprbActualCommitAuthor= -DGIT_BRANCH=python-sdk-postrelease 
"-DghprbCommentBody=Run Dataflow PostRelease" -Pver= -Prepourl= 
release:runQuickstartsPython
Parallel execution with configuration on demand is an incubating feature.
Applying build_rules.gradle to src

FAILURE: Build failed with an exception.

* Where:
Build file 
'
 line: 38

* What went wrong:
A problem occurred evaluating project ':release'.
> Could not create task 'runQuickstartsPython': Unknown argument(s) in task 
> definition: [dependsoOn]

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full 
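The failure points at a typo in the release build script: the task is declared with `dependsoOn` instead of `dependsOn`, which Gradle rejects as an unknown argument. A minimal Groovy sketch of a corrected declaration; the dependency name and task body are illustrative assumptions, not the actual file contents:
{code}
// Sketch: unknown keys in the task() argument map cause exactly the
// "Unknown argument(s) in task definition" error shown above.
task runQuickstartsPython(dependsOn: 'runQuickstartsJava') {
  doLast {
    println 'Running Python quickstart validation...'
  }
}
{code}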

[jira] [Resolved] (BEAM-3692) Hadoop Input Format module is skipped from deployment after mix with Java 1.8

2018-02-12 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3692?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía resolved BEAM-3692.

   Resolution: Fixed
Fix Version/s: 2.4.0

> Hadoop Input Format module is skipped from deployment after mix with Java 1.8
> -
>
> Key: BEAM-3692
> URL: https://issues.apache.org/jira/browse/BEAM-3692
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Ismaël Mejía
>Assignee: Jean-Baptiste Onofré
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> An error in the merge of the build scripts for Java 8 added a skip for this
> module.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4663: [BEAM-3692] Remove maven deploy plugin configuration with skip in the hadoop-input-format IO module

2018-02-12 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit fd5c891f1c7310b481153ab7f9f4316e8e63c1c8
Merge: aec829a 2c9ee0b
Author: Ismaël Mejía 
AuthorDate: Mon Feb 12 19:29:03 2018 +0100

Merge pull request #4663: [BEAM-3692] Remove maven deploy plugin 
configuration with skip in the hadoop-input-format IO module

[BEAM-3692] Remove maven deploy plugin configuration with skip in the 
hadoop-input-format IO module

 sdks/java/io/hadoop-input-format/pom.xml | 7 ---
 1 file changed, 7 deletions(-)
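For context, a sketch of the kind of maven-deploy-plugin override that such a removal typically targets; this is a generic example of the pattern, not the literal seven lines removed from the hadoop-input-format pom:
{code}
<!-- Removing a block like this lets the module be deployed again
     together with the rest of the build. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <configuration>
    <skip>true</skip>
  </configuration>
</plugin>
{code}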

-- 
To stop receiving notification emails like this one, please contact
ieme...@apache.org.


[beam] branch master updated (aec829a -> fd5c891)

2018-02-12 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from aec829a  Merge pull request #4572 [BEAM-3074] Serialize DoFns by 
portable id in Dataflow runner.
 add 2c9ee0b  [BEAM-3692] Remove maven deploy plugin configuration with 
skip in the hadoop-input-format IO module
 new fd5c891  Merge pull request #4663: [BEAM-3692] Remove maven deploy 
plugin configuration with skip in the hadoop-input-format IO module

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/java/io/hadoop-input-format/pom.xml | 7 ---
 1 file changed, 7 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
ieme...@apache.org.


[jira] [Assigned] (BEAM-3634) [SQL] Refactor BeamRelNodes into PTransforms

2018-02-12 Thread Anton Kedin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3634?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anton Kedin reassigned BEAM-3634:
-

Assignee: Kenneth Knowles

> [SQL] Refactor BeamRelNodes into PTransforms
> 
>
> Key: BEAM-3634
> URL: https://issues.apache.org/jira/browse/BEAM-3634
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Anton Kedin
>Assignee: Kenneth Knowles
>Priority: Major
>
> BeamRelNode exposes PCollection buildBeamPipeline() which builds 
> a pipeline when parsing.
> It feels like it should instead implement a 
> PTransform which would 
> receive a prepared PCollection, and apply sub-expressions instead of manually 
> invoking expression evaluation to get the input.
> And maybe consider building it lazily.
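A rough Java sketch of the direction suggested above, purely illustrative; the interface name, type parameter, and method name are assumptions, not the actual dsl-sql API:
{code}
// Sketch: each relational node contributes a composite transform instead of
// building the pipeline itself while parsing.
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.values.PCollection;

interface BeamRelNodeTransform<T> {
  /** Expose this node's logic as a transform over its prepared input. */
  PTransform<PCollection<T>, PCollection<T>> toPTransform();
}
{code}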



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5943

2018-02-12 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_TextIOIT #150

2018-02-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3629] Send the windowing strategy and whether its a merging window

[iemejia] Fix warning on jenkins on non-existent profile 
'validates-runner-tests'

[iemejia] Remove unneeded overwrites of maven-compiler-plugin

[iemejia] Change tests execution order from filesystem (default) to random

[iemejia] Remove repeated dependencies on runners/java-fn-execution module

[iemejia] Add missing modules to javadoc generation: TikaIO, RedisIO, Jackson, 
Xml

[iemejia] [BEAM-2530] Make final fixes to ensure code and tests compile with 
Java

--
[...truncated 25.01 KB...]
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.4.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.4.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev355-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.auto.value:auto-value:jar:1.5.1 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_Python #906

2018-02-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3629] Send the windowing strategy and whether its a merging window

[iemejia] Fix warning on jenkins on non-existent profile 
'validates-runner-tests'

[iemejia] Remove unneeded overwrites of maven-compiler-plugin

[iemejia] Change tests execution order from filesystem (default) to random

[iemejia] Remove repeated dependencies on runners/java-fn-execution module

[iemejia] Add missing modules to javadoc generation: TikaIO, RedisIO, Jackson, 
Xml

[iemejia] [BEAM-2530] Make final fixes to ensure code and tests compile with 
Java

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam7 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a0071ed64569982d19ccd03047600d15fd743fdc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a0071ed64569982d19ccd03047600d15fd743fdc
Commit message: "[BEAM-3629] Send the windowing strategy and whether its a 
merging window fn to Dataflow."
 > git rev-list 67234a15b56ddeed1d4dd90d276bad7e5003822c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7356237673886653156.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2747508338875177610.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6423647119034939963.sh
+ virtualenv .env --system-site-packages
New python executable in 

Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins96009241630856505.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7378703334742625102.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2989690027785256286.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve 
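A short shell sketch of the usual ways this urllib3 warning is addressed on Python 2.7: install the SNI backport packages into the job's virtualenv, or move the job to Python 2.7.9+ where SNI support is built in. The package names follow standard urllib3 guidance and are not taken from this job's configuration:
{code}
# Sketch: add SNI support to the existing virtualenv used by the job.
.env/bin/pip install pyopenssl ndg-httpsclient pyasn1
{code}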

[beam] 01/01: Merge pull request #4572 [BEAM-3074] Serialize DoFns by portable id in Dataflow runner.

2018-02-12 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit aec829a87541e3e6deac9363b7d7583fd317b1ca
Merge: a0071ed 52cabfd
Author: Robert Bradshaw 
AuthorDate: Mon Feb 12 10:05:46 2018 -0800

Merge pull request #4572 [BEAM-3074] Serialize DoFns by portable id in 
Dataflow runner.

 .../apache_beam/runners/dataflow/dataflow_runner.py | 17 ++---
 sdks/python/apache_beam/runners/pipeline_context.py |  3 +++
 2 files changed, 17 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[beam] branch master updated (a0071ed -> aec829a)

2018-02-12 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from a0071ed  [BEAM-3629] Send the windowing strategy and whether its a 
merging window fn to Dataflow.
 add 52cabfd  [BEAM-3074] Serialize DoFns by portable id in Dataflow runner.
 new aec829a  Merge pull request #4572 [BEAM-3074] Serialize DoFns by 
portable id in Dataflow runner.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache_beam/runners/dataflow/dataflow_runner.py | 17 ++---
 sdks/python/apache_beam/runners/pipeline_context.py |  3 +++
 2 files changed, 17 insertions(+), 3 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[beam] branch master updated (9666228 -> a0071ed)

2018-02-12 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 9666228  Merge pull request #4657: Multiple maven fixes
 add 5d6b25e  [BEAM-3629] Send the windowing strategy and whether its a 
merging window fn to Dataflow.
 new a0071ed  [BEAM-3629] Send the windowing strategy and whether its a 
merging window fn to Dataflow.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache/beam/runners/dataflow/DataflowPipelineTranslator.java   | 7 +++
 .../java/org/apache/beam/runners/dataflow/util/PropertyNames.java  | 1 +
 2 files changed, 8 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] 01/01: [BEAM-3629] Send the windowing strategy and whether its a merging window fn to Dataflow.

2018-02-12 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit a0071ed64569982d19ccd03047600d15fd743fdc
Merge: 9666228 5d6b25e
Author: Lukasz Cwik 
AuthorDate: Mon Feb 12 09:52:27 2018 -0800

[BEAM-3629] Send the windowing strategy and whether its a merging window fn 
to Dataflow.

 .../apache/beam/runners/dataflow/DataflowPipelineTranslator.java   | 7 +++
 .../java/org/apache/beam/runners/dataflow/util/PropertyNames.java  | 1 +
 2 files changed, 8 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #5942

2018-02-12 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-3690) Dependency Conflict problems: several conflicting classes exist in different JARs

2018-02-12 Thread Luke Cwik (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361083#comment-16361083
 ] 

Luke Cwik commented on BEAM-3690:
-

Note: the Gradle build bans hamcrest-all and mockito-all and forces people to
depend on hamcrest-core, mockito-core, ...

So only fixes to Maven are required.
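A Maven sketch of the kind of fix implied here: depend on the focused -core artifacts rather than the -all bundles. The versions shown are placeholders, not the ones pinned in the Beam parent pom:
{code}
<!-- Prefer the focused artifacts so the same classes are not pulled in
     twice from different JARs. -->
<dependency>
  <groupId>org.hamcrest</groupId>
  <artifactId>hamcrest-core</artifactId>
  <version>1.3</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.mockito</groupId>
  <artifactId>mockito-core</artifactId>
  <version>2.13.0</version>
  <scope>test</scope>
</dependency>
{code}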

> Dependency Conflict problems: several conflicting classes exist in different 
> JARs
> -
>
> Key: BEAM-3690
> URL: https://issues.apache.org/jira/browse/BEAM-3690
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0
>Reporter: PandaMonkey
>Assignee: PandaMonkey
>Priority: Major
> Fix For: 2.3.0
>
> Attachments: beam_conflicts.txt
>
>
> Hi, we found that duplicate classes exist in different JARs, and
> these classes have different features.
> The conflicting JAR pairs are:
> 1. 
> jar-pair:
> 2. 
> jar-pair:
> Some methods only exist in one version of the duplicate classes.
> Since the JVM only loads the classes that appear first on the classpath and
> shadows the other duplicates with the same names, the dependency conflict problem
> brings a high risk of "*NoSuchMethodException*" or "*NoSuchMethodError*"
> issues at runtime. The conflicting details are listed in the attachment.
> Please take note. Thanks.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #24

2018-02-12 Thread Apache Jenkins Server
alidatesrunner-test \
--temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/container/local/lib/python2.7/site-packages/setuptools/dist.py>:355:
 UserWarning: Normalizing '2.4.0.dev' to '2.4.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>:166:
 DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok

--
Ran 1 test in 373.693s

OK

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180212-164823
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python@sha256:02c0f5ab59a626ca943cdccecadcba3e7a5a5c598ab69ead462ab766583555e5
Deleted: sha256:3e146ec4c227d05a51bae9bd8ec812e0bb54721416a0232929f94cf1a782bffa
Deleted: sha256:024e0c23d256f612f5a174a3430f55b16c0549b248d94ab1ec7978820e3a5e1e
Deleted: sha256:db7d2d98c0ef025b4891ec930399948da22e4067d84239e1f83e60904f857c3b
Deleted: sha256:5537f7b414981e2f13c607a57594156214f868fe07bf50fdd51634ed11d7d430
Deleted: sha256:55bb6122b4882e3e637a2819aa8b72fd264c9b48b7714092289018c83f18375b
Deleted: sha256:f4ce464f9bfaa55469d77f4298cc22c544d0bb20948ebb4df86a2233b1682c12
Deleted: sha256:67d748d852b79c935efc8d3e684492e135c3151ad69ec330dc337bcdb103fd45
Deleted: sha256:5a37a1ce1d5dcd3e2c68340b886c450f0a9ef764e8ca6dbc431a04127cc5c1d5
gcloud container images delete $CONTAINER:$TAG --quiet
Usage: gcloud container [optional flags] 
  group may be   clusters | node-pools | operations
  command may be get-server-config

Deploy and manage clusters of machines for running containers.

flags:
  Run `gcloud container --help`
  for the full list of available flags for this command.

global flags:
  Run `gcloud -h` for a description of flags available to all commands.

command groups:
  clusters   Deploy and teardown Google Container Engine clusters.
  node-pools Create and delete operations for Google Container
 Engine node pools.
  operations Get and list operations for Google Container Engine
 clusters.

commands:
  get-server-config  Get Container Engine server config.


For more detailed information on this command and its flags, run:
  gcloud container --help

ERROR: (gcloud.container) Invalid choice: 'images'.

Valid choices are [clusters, get-server-config, node-pools, operations].
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user aljoscha.kret...@gmail.com
Not sending mail to unregistered user eh...@google.com
Not sending mail to unregistered user z...@giggles.nyc.corp.google.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user xuming...@users.noreply.github.com
Not sending mail to unregistered user j...@nanthrax.net
Not sending mail to unregistered user pawel.pk.kaczmarc...@gmail.com
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aromanenko@gmail.com
Not sending mail to unregistered user joey.bar...@gmail.com
Not sending mail to unregistered user dariusz.aniszew...@polidea.com
Not sending mail to unregistered user mott...@gmail.com
Not sending mail to unregistered user ccla...@bluewin.ch
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user c...@google.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user daniel.o.program...@gmail.com
Not sending mail to unregistered user mari...@mariagh.svl.corp.google.com
Not sending mail to unregistered user g...@telligent-data.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user jiang...@gmail.com
Not sending mail to unregistered user fjetum...@gmail.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user k...@google.com
Not sending mail to unregistered user mair...@google.com


[beam] branch master updated (2b4df80 -> 9666228)

2018-02-12 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 2b4df80  Merge pull request #4662: [BEAM-2530] Make final fixes to 
ensure code and tests compile with Java 9
 add 855930b  Fix warning on jenkins on non-existent profile 
'validates-runner-tests'
 add 2cd499a  Remove unneeded overwrites of maven-compiler-plugin
 add 109b45a  Change tests execution order from filesystem (default) to 
random
 add 8b0aff3  Remove repeated dependencies on runners/java-fn-execution 
module
 add f965e80  Add missing modules to javadoc generation: TikaIO, RedisIO, 
Jackson, Xml
 new 9666228  Merge pull request #4657: Multiple maven fixes

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 ...am_PostCommit_Java_ValidatesRunner_Flink.groovy |  2 +-
 ...am_PostCommit_Java_ValidatesRunner_Spark.groovy |  2 +-
 pom.xml| 45 +-
 runners/gearpump/pom.xml   | 11 --
 runners/java-fn-execution/pom.xml  | 11 --
 sdks/java/extensions/sql/pom.xml   |  2 -
 sdks/java/io/tika/pom.xml  | 12 --
 sdks/java/javadoc/pom.xml  | 43 -
 8 files changed, 79 insertions(+), 49 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[beam] 01/01: Merge pull request #4657: Multiple maven fixes

2018-02-12 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 9666228c453754c3b11845de9ac52ea5e9ce54c2
Merge: 2b4df80 f965e80
Author: Kenn Knowles 
AuthorDate: Mon Feb 12 08:57:22 2018 -0800

Merge pull request #4657: Multiple maven fixes

 ...am_PostCommit_Java_ValidatesRunner_Flink.groovy |  2 +-
 ...am_PostCommit_Java_ValidatesRunner_Spark.groovy |  2 +-
 pom.xml| 45 +-
 runners/gearpump/pom.xml   | 11 --
 runners/java-fn-execution/pom.xml  | 11 --
 sdks/java/extensions/sql/pom.xml   |  2 -
 sdks/java/io/tika/pom.xml  | 12 --
 sdks/java/javadoc/pom.xml  | 43 -
 8 files changed, 79 insertions(+), 49 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[jira] [Assigned] (BEAM-3690) Dependency Conflict problems: several conflicting classes exist in different JARs

2018-02-12 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3690:
-

Assignee: PandaMonkey  (was: Kenneth Knowles)

> Dependency Conflict problems: several conflicting classes exist in different 
> JARs
> -
>
> Key: BEAM-3690
> URL: https://issues.apache.org/jira/browse/BEAM-3690
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0
>Reporter: PandaMonkey
>Assignee: PandaMonkey
>Priority: Major
> Fix For: 2.3.0
>
> Attachments: beam_conflicts.txt
>
>
> Hi, we found that duplicate classes exist in different JARs, and
> these classes have different features.
> The conflicting JAR pairs are:
> 1. 
> jar-pair:
> 2. 
> jar-pair:
> Some methods only exist in one version of the duplicate classes.
> Since the JVM only loads the classes that appear first on the classpath and
> shadows the other duplicates with the same names, the dependency conflict problem
> brings a high risk of "*NoSuchMethodException*" or "*NoSuchMethodError*"
> issues at runtime. The conflicting details are listed in the attachment.
> Please take note. Thanks.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3690) Dependency Conflict problems: several conflicting classes exist in different JARs

2018-02-12 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361059#comment-16361059
 ] 

Kenneth Knowles commented on BEAM-3690:
---

Thanks for filing this. The SDK core does depend on Hamcrest core. It should 
not be including mockito or hamcrest-all. I will not be able to look at this 
right away - would you be willing to take a look?

> Dependency Conflict problems: several conflicting classes exist in different 
> JARs
> -
>
> Key: BEAM-3690
> URL: https://issues.apache.org/jira/browse/BEAM-3690
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0
>Reporter: PandaMonkey
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.3.0
>
> Attachments: beam_conflicts.txt
>
>
> Hi, we found that duplicate classes exist in different JARs, and
> these classes have different features.
> The conflicting JAR pairs are:
> 1. 
> jar-pair:
> 2. 
> jar-pair:
> Some methods only exist in one version of the duplicate classes.
> Since the JVM only loads the classes that appear first on the classpath and
> shadows the other duplicates with the same names, the dependency conflict problem
> brings a high risk of "*NoSuchMethodException*" or "*NoSuchMethodError*"
> issues at runtime. The conflicting details are listed in the attachment.
> Please take note. Thanks.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3637) HBaseIOTest methods do not clean up tables

2018-02-12 Thread Alexey Romanenko (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361056#comment-16361056
 ] 

Alexey Romanenko commented on BEAM-3637:


Disabling and truncating/deleting tables is quite an expensive operation on the mini-cluster,
so I agree that using a unique table name per test would be better in this case.
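A small Java sketch of the unique-table-name idea; the helper class is hypothetical, not the actual HBaseIOTest code:
{code}
// Sketch: derive a fresh table name per test so no disable/truncate/delete
// is needed on the mini-cluster between tests.
import java.util.UUID;
import org.apache.hadoop.hbase.TableName;

final class TestTableNames {
  private TestTableNames() {}

  static TableName unique(String prefix) {
    // Strip dashes from the UUID to keep the name short and simple.
    return TableName.valueOf(prefix + "_" + UUID.randomUUID().toString().replace("-", ""));
  }
}
{code}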

> HBaseIOTest methods do not clean up tables
> --
>
> Key: BEAM-3637
> URL: https://issues.apache.org/jira/browse/BEAM-3637
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Alexey Romanenko
>Priority: Minor
>  Labels: beginner, newbie, starter
>  Time Spent: 20m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3692) Hadoop Input Format module is skipped from deployment after mix with Java 1.8

2018-02-12 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3692?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía reassigned BEAM-3692:
--

Assignee: Jean-Baptiste Onofré  (was: Ismaël Mejía)

> Hadoop Input Format module is skipped from deployment after mix with Java 1.8
> -
>
> Key: BEAM-3692
> URL: https://issues.apache.org/jira/browse/BEAM-3692
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Ismaël Mejía
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>
> An error in the merge of the build scripts for Java 8 added a skip for this
> module.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3692) Hadoop Input Format module is skipped from deployment after mix with Java 1.8

2018-02-12 Thread JIRA
Ismaël Mejía created BEAM-3692:
--

 Summary: Hadoop Input Format module is skipped from deployment 
after mix with Java 1.8
 Key: BEAM-3692
 URL: https://issues.apache.org/jira/browse/BEAM-3692
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Ismaël Mejía
Assignee: Ismaël Mejía


An error in the merge of the build scripts for Java 8 added a skip for this
module.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch master updated (67234a1 -> 2b4df80)

2018-02-12 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 67234a1  Merge pull request #4603: [BEAM-3620] Remove older Kafka 
versions from build time support
 add 9a9c67b  [BEAM-2530] Make final fixes to ensure code and tests compile 
with Java 9
 new 2b4df80  Merge pull request #4662: [BEAM-2530] Make final fixes to 
ensure code and tests compile with Java 9

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/java/io/solr/pom.xml | 7 +++
 1 file changed, 7 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[beam] 01/01: Merge pull request #4662: [BEAM-2530] Make final fixes to ensure code and tests compile with Java 9

2018-02-12 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 2b4df8046e0930d0662c5e635648425463436766
Merge: 67234a1 9a9c67b
Author: Kenn Knowles 
AuthorDate: Mon Feb 12 08:20:28 2018 -0800

Merge pull request #4662: [BEAM-2530] Make final fixes to ensure code and 
tests compile with Java 9

 sdks/java/io/solr/pom.xml | 7 +++
 1 file changed, 7 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[jira] [Commented] (BEAM-3605) Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure

2018-02-12 Thread Alexey Romanenko (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3605?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360783#comment-16360783
 ] 

Alexey Romanenko commented on BEAM-3605:


Just submitted a PR (4661) which fixes BEAM-3605 (this one) and BEAM-3598. 
Also, I fixed some other tests of ShardReadersPoolTest that used 
_Thread.sleep()_ to use _CountDownLatch_ approach.
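A minimal Java sketch of the Thread.sleep to CountDownLatch pattern referred to here; the class, thread body, and timeout are illustrative, not the actual ShardReadersPoolTest code:
{code}
// Sketch: instead of sleeping for a fixed time and hoping the background
// read loop has run, the test blocks on a latch that the loop counts down.
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

class LatchBasedWait {
  void awaitReadLoop() throws InterruptedException {
    final CountDownLatch readLoopRan = new CountDownLatch(1);

    // In the real test this runnable would be the mocked shard read loop.
    Thread readLoop = new Thread(readLoopRan::countDown);
    readLoop.start();

    // Fail fast with a clear message rather than asserting after a sleep.
    if (!readLoopRan.await(10, TimeUnit.SECONDS)) {
      throw new AssertionError("read loop did not run within 10 seconds");
    }
  }
}
{code}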

> Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure
> 
>
> Key: BEAM-3605
> URL: https://issues.apache.org/jira/browse/BEAM-3605
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Alexey Romanenko
>Priority: Critical
>  Labels: flake, sickbay
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Here's one:
> https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.kinesis/ShardReadersPoolTest/shouldForgetClosedShardIterator/
> Filing all test failures as "Critical" so we can sickbay or fix.
> The Jenkins build will get GC'd so here is the error:
> {code}
> java.lang.AssertionError: 
> Expecting:
>   <["shard1", "shard2"]>
> to contain only:
>   <["shard2"]>
> but the following elements were unexpected:
>   <["shard1"]>
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldForgetClosedShardIterator(ShardReadersPoolTest.java:270)
> {code}
> and stderr
> {code}
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
> reached end of the shard: streamName=null, shardId=shard1
>   at 
> org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard2 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
> reached end of the shard: streamName=null, shardId=shard2
>   at 
> org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool stop
> INFO: Closing shard iterators pool
> Feb 01, 2018 11:24:24 

[jira] [Comment Edited] (BEAM-3605) Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure

2018-02-12 Thread Alexey Romanenko (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3605?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360783#comment-16360783
 ] 

Alexey Romanenko edited comment on BEAM-3605 at 2/12/18 2:10 PM:
-

Just submitted a PR (4661) which fixes BEAM-3605 (this one) and BEAM-3598. 
Also, I fixed some other tests of ShardReadersPoolTest that used 
_Thread.sleep()_ to use _CountDownLatch_ approach instead.


was (Author: aromanenko):
Just submitted a PR (4661) which fixes BEAM-3605 (this one) and BEAM-3598. 
Also, I fixed some other tests of ShardReadersPoolTest that used 
_Thread.sleep()_ to use _CountDownLatch_ approach.

> Kinesis ShardReadersPoolTest shouldForgetClosedShardIterator failure
> 
>
> Key: BEAM-3605
> URL: https://issues.apache.org/jira/browse/BEAM-3605
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Alexey Romanenko
>Priority: Critical
>  Labels: flake, sickbay
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Here's one:
> https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/1758/testReport/junit/org.apache.beam.sdk.io.kinesis/ShardReadersPoolTest/shouldForgetClosedShardIterator/
> Filing all test failures as "Critical" so we can sickbay or fix.
> The Jenkins build will get GC'd so here is the error:
> {code}
> java.lang.AssertionError: 
> Expecting:
>   <["shard1", "shard2"]>
> to contain only:
>   <["shard2"]>
> but the following elements were unexpected:
>   <["shard1"]>
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPoolTest.shouldForgetClosedShardIterator(ShardReadersPoolTest.java:270)
> {code}
> and stderr
> {code}
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:16 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException
> Feb 01, 2018 11:24:19 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard1 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
> reached end of the shard: streamName=null, shardId=shard1
>   at 
> org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Kinesis Shard read loop has finished
> Feb 01, 2018 11:24:23 PM org.apache.beam.sdk.io.kinesis.ShardReadersPool 
> readLoop
> INFO: Shard iterator for shard2 shard is closed, finishing the read loop
> org.apache.beam.sdk.io.kinesis.KinesisShardClosedException: Shard iterator 
> reached end of the shard: streamName=null, shardId=shard2
>   at 
> org.apache.beam.sdk.io.kinesis.ShardRecordsIterator.readNextBatch(ShardRecordsIterator.java:70)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.readLoop(ShardReadersPool.java:121)
>   at 
> org.apache.beam.sdk.io.kinesis.ShardReadersPool.lambda$startReadingShards$0(ShardReadersPool.java:112)
>   at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 

[jira] [Commented] (BEAM-3317) KinesisReaderTest is Flaky due to overadvanced watermarks

2018-02-12 Thread JIRA

[ 
https://issues.apache.org/jira/browse/BEAM-3317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360751#comment-16360751
 ] 

Ismaël Mejía commented on BEAM-3317:


Just something to remember: as part of this fix, we should remove the configuration in the
Maven pom file that forces these tests to run sequentially as a workaround for this issue.

> KinesisReaderTest is Flaky due to overadvanced watermarks
> -
>
> Key: BEAM-3317
> URL: https://issues.apache.org/jira/browse/BEAM-3317
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Thomas Groh
>Assignee: Alexey Romanenko
>Priority: Critical
>  Labels: flake, sickbay
>
> https://builds.apache.org/job/beam_PreCommit_Java_GradleBuild/392/testReport/junit/org.apache.beam.sdk.io.kinesis/KinesisReaderTest/watermarkAdvancesWhenEnoughRecordsReadRecently/
> org.junit.ComparisonFailure: expected:<[-290308-12-21T19:59:05.225]Z> but 
> was:<[1970-01-01T00:00:01.000]Z>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

