[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91623&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91623
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 17/Apr/18 05:57
Start Date: 17/Apr/18 05:57
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on issue #5152: [BEAM-3327] Harness 
Manager Interfaces
URL: https://github.com/apache/beam/pull/5152#issuecomment-381854681
 
 
   R: @tgroh 
   CC: @bsidhom 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91623)
Time Spent: 6.5h  (was: 6h 20m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 6.5h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91621&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91621
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 17/Apr/18 05:57
Start Date: 17/Apr/18 05:57
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on issue #5152: [BEAM-3327] Harness 
Manager Interfaces
URL: https://github.com/apache/beam/pull/5152#issuecomment-381854681
 
 
   R: tgroh@
   cc: bsidhom@




Issue Time Tracking
---

Worklog Id: (was: 91621)
Time Spent: 6h 20m  (was: 6h 10m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 6h 20m
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments





[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91620&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91620
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 17/Apr/18 05:56
Start Date: 17/Apr/18 05:56
Worklog Time Spent: 10m 
  Work Description: axelmagn opened a new pull request #5152: [BEAM-3327] 
Harness Manager Interfaces
URL: https://github.com/apache/beam/pull/5152
 
 
   These are interfaces that will be used on the worker to manage the 
lifetimes of remote environments and the related RPC services. The key 
addition is `SdkHarnessManager`, which is responsible for managing these 
resources and can provide a `RemoteEnvironment` to runner operators.
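The PR text names `SdkHarnessManager` and `RemoteEnvironment` but not their shapes. Purely as an illustration of the lifecycle contract it describes, here is a minimal sketch; every method signature below is an assumption, not the PR's actual API:

```java
// Hypothetical sketch of the harness-manager lifecycle: the manager hands
// out environment handles and owns their teardown. Only the two type names
// come from the PR description; everything else is invented for illustration.
import java.util.ArrayList;
import java.util.List;

public class HarnessSketch {
  /** A running SDK environment; closing it releases the remote resources. */
  interface RemoteEnvironment extends AutoCloseable {
    String getEnvironmentId();
  }

  /** Owns environment lifecycles and the RPC services backing them. */
  interface SdkHarnessManager extends AutoCloseable {
    RemoteEnvironment getEnvironment(String environmentId) throws Exception;
  }

  /** Toy in-process manager: tracks handles so close() tears everything down. */
  static class InProcessManager implements SdkHarnessManager {
    private final List<RemoteEnvironment> open = new ArrayList<>();

    @Override
    public RemoteEnvironment getEnvironment(String id) {
      RemoteEnvironment env = new RemoteEnvironment() {
        @Override public String getEnvironmentId() { return id; }
        @Override public void close() { open.remove(this); }
      };
      open.add(env);
      return env;
    }

    @Override
    public void close() throws Exception {
      // Tear down any environments the caller did not close itself.
      for (RemoteEnvironment env : new ArrayList<>(open)) {
        env.close();
      }
    }

    int openCount() { return open.size(); }
  }

  public static void main(String[] args) throws Exception {
    try (InProcessManager manager = new InProcessManager()) {
      RemoteEnvironment env = manager.getEnvironment("docker:java-sdk");
      System.out.println("acquired " + env.getEnvironmentId()
          + ", open=" + manager.openCount());
    } // try-with-resources closes the manager, which closes the environment
  }
}
```

The point of the abstraction is that runner code can acquire an environment without knowing how it is provisioned, and teardown is centralized in the manager.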
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [x] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [x] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [x] Write a pull request description that is detailed enough to 
understand:
  - [x] What the pull request does
  - [x] Why it does it
  - [x] How it does it
  - [x] Why this approach
- [x] Each commit in the pull request should have a meaningful subject line 
and body.
- [x] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [x] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   




Issue Time Tracking
---

Worklog Id: (was: 91620)
Time Spent: 6h 10m  (was: 6h)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 6h 10m
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments





[jira] [Comment Edited] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Jan Peuker (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440364#comment-16440364
 ] 

Jan Peuker edited comment on BEAM-4096 at 4/17/18 5:40 AM:
---

Hi this is Jan, all set up with Jira now.

Small addition here: We also need to change withNumFileShards to a 
ValueProvider; it is a required option right now. The default of 1000 mentioned 
in the JavaDoc is incorrect and tends to cause OutOfMemoryError in 
DataflowRunner. From my current, naive benchmarks, a more sensible 
suggestion for most cases seems to be 100 shards (it is easy to calculate shard 
counts on powers of 10, and it reaches common chunk sizes earlier).


was (Author: janpeuker):
Hi this is Jan, all set up with Jira now.

Small addition here: We also need to change withNumFileShards to a 
ValueProvider which is a required option right now. The default 1000 mentioned 
in the JavaDoc is incorrect and tends to cause OutOfMemoryError in 
DataflowRunner. From my current, native, benchmarks it seems a more sensible 
suggestion for most cases seems to have 100 shards (easy to calculate shard on 
powers of 2 and reaches common chunk sizes earlier).

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
>  * withNumFileShards(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.
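The deferred-evaluation pattern this issue relies on can be sketched independently of Beam. The `ValueProvider` interface below mimics the shape of `org.apache.beam.sdk.options.ValueProvider`; the `Sink` class is a hypothetical stand-in for `BigQueryIO.Write`, not the real API:

```java
// Mock of the ValueProvider pattern: the pipeline graph is built with a
// deferred handle, and the value is only read at execution time. That is
// what lets a Dataflow template accept the parameter at launch instead of
// hardcoding it. All names here are illustrative, not Beam's actual classes.
import java.util.function.Supplier;

public class ValueProviderSketch {
  /** Deferred value, resolved at run time rather than template creation. */
  interface ValueProvider<T> {
    T get();
    boolean isAccessible();
  }

  /** A value known when the graph is built. */
  static <T> ValueProvider<T> staticOf(T value) {
    return new ValueProvider<T>() {
      @Override public T get() { return value; }
      @Override public boolean isAccessible() { return true; }
    };
  }

  /** A value resolved from runtime options; inaccessible until the job starts. */
  static <T> ValueProvider<T> runtimeOf(Supplier<T> options) {
    return new ValueProvider<T>() {
      @Override public T get() {
        T v = options.get();
        if (v == null) throw new IllegalStateException("not available yet");
        return v;
      }
      @Override public boolean isAccessible() { return options.get() != null; }
    };
  }

  /** Hypothetical sink configured with a deferred parameter. */
  static class Sink {
    ValueProvider<Integer> numFileShards = staticOf(0);
    Sink withNumFileShards(ValueProvider<Integer> shards) {
      this.numFileShards = shards;  // stored unresolved at graph-build time
      return this;
    }
  }

  public static void main(String[] args) {
    // Template creation: the shard count is not known yet.
    Integer[] option = {null};
    Sink sink = new Sink().withNumFileShards(runtimeOf(() -> option[0]));
    System.out.println("at build time accessible=" + sink.numFileShards.isAccessible());
    // Job launch: the template user supplies the value.
    option[0] = 100;
    System.out.println("at run time shards=" + sink.numFileShards.get());
  }
}
```

Because the sink stores the handle rather than the value, the same template artifact can be launched once with streaming inserts and again with batch loads, which is exactly the flexibility the issue asks for.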





Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #94

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Remove PRIMARY KEY it does nothing

[apilloud] [SQL] Plumb through column nullable field

[apilloud] [SQL] Copy in DDL code from Calcite 1.16

[apilloud] [SQL] Patch ddl code for beam

--
[...truncated 18.74 MB...]
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Sample 
keys/Combine.GroupedValues as step s17
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow) as step s18
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s19
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParMultiDo(ToIsmRecordForMapLike) as step s20
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForSize as step s21
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForSize) as step s22
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/GBKaSVForKeys as step s23
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/ParDo(ToIsmMetadataRecordForKey) as step s24
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/Flatten.PCollections as step s25
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample 
as view/CreateDataflowView as step s26
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition 
input as step s27
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by 
partition as step s28
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch 
mutations together as step s29
Apr 17, 2018 4:58:22 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 17, 2018 4:58:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0417045813-1677eb97/output/results/staging/
Apr 17, 2018 4:58:22 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80355 bytes, hash 4SEs4ZP5zo4cY2AOJ41Dhg> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0417045813-1677eb97/output/results/staging/pipeline-4SEs4ZP5zo4cY2AOJ41Dhg.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 17, 2018 4:58:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-16_21_58_23-10481857675704817823?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-16_21_58_23-10481857675704817823

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 17, 2018 4:58:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #117

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Remove PRIMARY KEY it does nothing

[apilloud] [SQL] Plumb through column nullable field

[apilloud] [SQL] Copy in DDL code from Calcite 1.16

[apilloud] [SQL] Patch ddl code for beam

--
[...truncated 27.76 MB...]
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 1 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 2 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 3 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 4 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 5 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 6 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 7 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 8 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 9 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 10 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 13 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 14 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 15 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 16 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 17 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 18 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 19 sending EndOfStream
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [14]
Apr 17, 2018 5:01:23 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 5:01:23 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@ef045e0identifier=tcp://localhost:59609/14.output.14, 
upstream=14.output.14, group=stream6/15.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@48e3f220{da=com.datatorrent.bufferserver.internal.DataList$Block@68adcd3c{identifier=14.output.14,
 data=1048576, readingOffset=0, writingOffset=245, 
starting_window=5ad57fa1, ending_window=5ad57fa7, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@45499a9a {14.output.14}
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [13]
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [12]
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 5:01:24 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #115

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Remove PRIMARY KEY it does nothing

[apilloud] [SQL] Plumb through column nullable field

[apilloud] [SQL] Copy in DDL code from Calcite 1.16

[apilloud] [SQL] Patch ddl code for beam

--
[...truncated 1.24 MB...]
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testFirstElementLate(CreateStreamTest.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 
org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at 
org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at 
org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 

[jira] [Issue Comment Deleted] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Ryan McDowell (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan McDowell updated BEAM-4096:

Comment: was deleted

(was: Added withNumFileShards(..) to the description.)

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
>  * withNumFileShards(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.





[jira] [Commented] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Ryan McDowell (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440367#comment-16440367
 ] 

Ryan McDowell commented on BEAM-4096:
-

Added withNumFileShards(..) to the description.

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
>  * withNumFileShards(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.





[jira] [Updated] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Ryan McDowell (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan McDowell updated BEAM-4096:

Description: 
Enhance BigQueryIO to accept ValueProviders for:
 * withMethod(..)
 * withTriggeringFrequency(..)
 * withNumFileShards(..)

It would allow Dataflow templates to accept these parameters at runtime instead 
of being hardcoded. This opens up the ability to create Dataflow templates 
which allow users to flip back-and-forth between batch and streaming inserts.

  was:
Enhance BigQueryIO to accept ValueProviders for:
 * withMethod(..)
 * withTriggeringFrequency(..)

It would allow Dataflow templates to accept these parameters at runtime instead 
of being hardcoded. This opens up the ability to create Dataflow templates 
which allow users to flip back-and-forth between batch and streaming inserts.


> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
>  * withNumFileShards(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.





[jira] [Comment Edited] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Jan Peuker (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440364#comment-16440364
 ] 

Jan Peuker edited comment on BEAM-4096 at 4/17/18 4:19 AM:
---

Hi this is Jan, all set up with Jira now.

Small addition here: We also need to change withNumFileShards to a 
ValueProvider; it is a required option right now. The default of 1000 mentioned 
in the JavaDoc is incorrect and tends to cause OutOfMemoryError in 
DataflowRunner. From my current, naive benchmarks, a more sensible 
suggestion for most cases seems to be 100 shards (it is easy to calculate shard 
counts on powers of 2, and it reaches common chunk sizes earlier).


was (Author: janpeuker):
Hi this is Jan, all set up with Jira now.

Small addition here: We also need to be change withNumFileShards to a 
ValueProviders which is a required option right now. The default 1000 mentioned 
in the JavaDoc is incorrect and tends to cause OutOfMemoryError in 
DataflowRunner. From my current, native, benchmarks it seems a more sensible 
suggestion for most cases seems to have 100 shards (easy to calculate shard on 
powers of 2 and reaches common chunk sizes earlier).

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.





[jira] [Commented] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Jan Peuker (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440364#comment-16440364
 ] 

Jan Peuker commented on BEAM-4096:
--

Hi this is Jan, all set up with Jira now.

Small addition here: We also need to change withNumFileShards to a 
ValueProvider; it is a required option right now. The default of 1000 mentioned 
in the JavaDoc is incorrect and tends to cause OutOfMemoryError in 
DataflowRunner. From my current, naive benchmarks, a more sensible 
suggestion for most cases seems to be 100 shards (it is easy to calculate shard 
counts on powers of 2, and it reaches common chunk sizes earlier).

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.





[jira] [Commented] (BEAM-4083) TypeName should be a proper algebraic type, and probably just called BeamType

2018-04-16 Thread Reuven Lax (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4083?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440365#comment-16440365
 ] 

Reuven Lax commented on BEAM-4083:
--

TypeName is not itself a type. FieldType is the actual "type" - TypeName simply 
segregates classes of types.

> TypeName should be a proper algebraic type, and probably just called BeamType
> -
>
> Key: BEAM-4083
> URL: https://issues.apache.org/jira/browse/BEAM-4083
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Priority: Major
>
> TypeName mixes atomic types and type constructors. Or, equivalently, it does 
> not distinguish type constructors by arity. It would be best to make this 
> an ADT.
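As an illustration of the proposed change, here is a minimal sketch of an algebraic type that separates atomic types from type constructors by arity. The class names are invented for this example and are not Beam's actual schema API:

```java
// Hedged sketch of the ADT the issue proposes: rather than one flat enum
// mixing INT32-style atomic types with ARRAY/MAP constructors, each
// constructor's arity is enforced by its structure. Names are illustrative.
public class BeamTypeSketch {
  /** Algebraic type: either atomic, or a constructor applied to arguments. */
  abstract static class BeamType {
    @Override public abstract String toString();
  }

  /** Zero-arity types: INT32, STRING, ... */
  static final class Atomic extends BeamType {
    final String name;
    Atomic(String name) { this.name = name; }
    @Override public String toString() { return name; }
  }

  /** Arity-1 constructor, e.g. ARRAY<T>. */
  static final class ArrayType extends BeamType {
    final BeamType element;
    ArrayType(BeamType element) { this.element = element; }
    @Override public String toString() { return "ARRAY<" + element + ">"; }
  }

  /** Arity-2 constructor, e.g. MAP<K, V>. */
  static final class MapType extends BeamType {
    final BeamType key, value;
    MapType(BeamType key, BeamType value) { this.key = key; this.value = value; }
    @Override public String toString() { return "MAP<" + key + ", " + value + ">"; }
  }

  public static void main(String[] args) {
    // A MAP cannot be built without exactly a key type and a value type;
    // with a flat TypeName enum, nothing enforces that.
    BeamType t = new MapType(new Atomic("STRING"),
        new ArrayType(new Atomic("INT32")));
    System.out.println(t);
  }
}
```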





Build failed in Jenkins: beam_PostCommit_Python_Verify #4715

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[apilloud] [SQL] Remove PRIMARY KEY it does nothing

[apilloud] [SQL] Plumb through column nullable field

[apilloud] [SQL] Copy in DDL code from Calcite 1.16

[apilloud] [SQL] Patch ddl code for beam

--
Started by GitHub push by XuMingmin
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b9fb8f6765e923f24e5d41bf8e19240b8ba4a08c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b9fb8f6765e923f24e5d41bf8e19240b8ba4a08c
Commit message: "Merge pull request #5040 from apilloud/ddl"
 > git rev-list --no-walk 6782e87c1f7d1d13f224df8369a7a14315556e4b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins4461976126055808368.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 
1.4.0 which is incompatible.

# Tox runs unit tests in a 

[jira] [Commented] (BEAM-4081) Review of schema metadata vs schema types

2018-04-16 Thread Reuven Lax (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16440363#comment-16440363
 ] 

Reuven Lax commented on BEAM-4081:
--

Agreed; however, I would recommend being conservative about what we promote to
a basic type, and err on the side of keeping the Beam-level type set small.
E.g., right now I see no good reason for Beam to distinguish between CHAR and
VARCHAR, even though SQL does need to make this distinction (though maybe we
will find a good reason for Beam to care).

> Review of schema metadata vs schema types
> -
>
> Key: BEAM-4081
> URL: https://issues.apache.org/jira/browse/BEAM-4081
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Priority: Major
>
> The Schema basic types have a place for metadata that can say "this int is 
> really millis since epoch". This deserves some careful design review and 
> perhaps some of these need to be promoted to basic types.
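The design question can be sketched as two representations of the same field (hypothetical names, not Beam's Schema API): a base type tagged with free-form metadata, versus the concept promoted to a first-class type that consumers can dispatch on directly.

```python
# Hypothetical sketch of the BEAM-4081 trade-off: metadata-tagged base type
# vs. a promoted first-class type for "millis since epoch".
from dataclasses import dataclass

@dataclass(frozen=True)
class FieldSpec:
    base_type: str        # e.g. "INT64"
    metadata: tuple = ()  # e.g. (("logicalType", "millis-epoch"),)

# Option A: base type plus metadata; every consumer must inspect metadata.
ts_meta = FieldSpec("INT64", (("logicalType", "millis-epoch"),))

# Option B: promoted type; consumers dispatch on the type itself.
ts_promoted = FieldSpec("TIMESTAMP_MILLIS")

def is_timestamp(spec: FieldSpec) -> bool:
    return spec.base_type == "TIMESTAMP_MILLIS" or (
        ("logicalType", "millis-epoch") in spec.metadata)
```

Option A keeps the basic type set small but pushes interpretation onto every consumer; option B makes the semantics visible in the type, which is the "promotion" the issue raises.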





[jira] [Work logged] (BEAM-4044) Take advantage of Calcite DDL

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4044?focusedWorklogId=91592=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91592
 ]

ASF GitHub Bot logged work on BEAM-4044:


Author: ASF GitHub Bot
Created on: 17/Apr/18 04:13
Start Date: 17/Apr/18 04:13
Worklog Time Spent: 10m 
  Work Description: XuMingmin closed pull request #5040: [BEAM-4044] [SQL] 
Refresh DDL from 1.16
URL: https://github.com/apache/beam/pull/5040
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/sdks/java/extensions/sql/src/main/codegen/config.fmpp 
b/sdks/java/extensions/sql/src/main/codegen/config.fmpp
index 61645e29f55..5ecb3d53fe4 100644
--- a/sdks/java/extensions/sql/src/main/codegen/config.fmpp
+++ b/sdks/java/extensions/sql/src/main/codegen/config.fmpp
@@ -1,10 +1,9 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to you under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
 #
 # http://www.apache.org/licenses/LICENSE-2.0
 #
@@ -15,9 +14,79 @@
 # limitations under the License.
 
 data: {
-  parser:   tdd(../data/Parser.tdd)
-}
+parser: {
+  # Generated parser implementation class package and name
+  package: "org.apache.beam.sdk.extensions.sql.impl.parser.impl",
+  class: "BeamSqlParserImpl",
+
+  # List of import statements.
+  imports: [
+"org.apache.calcite.schema.ColumnStrategy"
+"org.apache.calcite.sql.SqlCreate"
+"org.apache.calcite.sql.SqlDrop"
+"org.apache.beam.sdk.extensions.sql.impl.parser.SqlDdlNodes"
+  ]
+
+  # List of keywords.
+  keywords: [
+"COMMENT"
+"IF"
+   "LOCATION"
+   "TBLPROPERTIES"
+  ]
+
+  # List of keywords from "keywords" section that are not reserved.
+  nonReservedKeywords: [
+"COMMENT"
+"IF"
+   "LOCATION"
+   "TBLPROPERTIES"
+  ]
+
+  # List of methods for parsing custom SQL statements.
+  statementParserMethods: [
+  ]
+
+  # List of methods for parsing custom literals.
+  # Example: ParseJsonLiteral().
+  literalParserMethods: [
+  ]
+
+  # List of methods for parsing custom data types.
+  dataTypeParserMethods: [
+  ]
 
+  # List of methods for parsing extensions to "ALTER " calls.
+  # Each must accept arguments "(SqlParserPos pos, String scope)".
+  alterStatementParserMethods: [
+  ]
+
+  # List of methods for parsing extensions to "CREATE [OR REPLACE]" calls.
+  # Each must accept arguments "(SqlParserPos pos, boolean replace)".
+  createStatementParserMethods: [
+"SqlCreateTable"
+  ]
+
+  # List of methods for parsing extensions to "DROP" calls.
+  # Each must accept arguments "(SqlParserPos pos)".
+  dropStatementParserMethods: [
+"SqlDropTable"
+  ]
+
+  # List of files in @includes directory that have parser method
+  # implementations for parsing custom SQL statements, literals or types
+  # given as part of "statementParserMethods", "literalParserMethods" or
+  # "dataTypeParserMethods".
+  implementationFiles: [
+"parserImpls.ftl"
+  ]
+
+  includeCompoundIdentifier: true
+  includeBraces: true
+  includeAdditionalDeclarations: false
+
+}
+}
 freemarkerLinks: {
-  includes: includes/
+includes: includes/
 }
diff --git a/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd 
b/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd
deleted file mode 100644
index 1afa73d255b..000
--- a/sdks/java/extensions/sql/src/main/codegen/data/Parser.tdd
+++ /dev/null
@@ -1,76 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use 

[jira] [Commented] (BEAM-4080) Consider Schema.join to automatically produce a correct joined schema

2018-04-16 Thread Reuven Lax (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4080?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16440362#comment-16440362
 ] 

Reuven Lax commented on BEAM-4080:
--

Some subtleties here: what if both schemas contain a field with the same name? 
In SQL this is legal, and the field can be accessed with extra scoping.
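One way to handle the collision, mirroring SQL table aliases, is to qualify only the colliding names with a scope when producing the joined schema. A minimal sketch (hypothetical helper, not Beam's Schema API), treating a schema as an ordered list of field names:

```python
# Illustrative join of two schemas: only names present on both sides get a
# scope prefix, as SQL does with table aliases.
def join_schemas(left, right, left_scope="lhs", right_scope="rhs"):
    """left/right are ordered field-name lists; returns the joined names."""
    collisions = set(left) & set(right)
    joined = []
    for scope, fields in ((left_scope, left), (right_scope, right)):
        for name in fields:
            joined.append(f"{scope}.{name}" if name in collisions else name)
    return joined
```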

> Consider Schema.join to automatically produce a correct joined schema
> -
>
> Key: BEAM-4080
> URL: https://issues.apache.org/jira/browse/BEAM-4080
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Priority: Major
>






[jira] [Commented] (BEAM-4077) Refactor builder field nullability

2018-04-16 Thread Reuven Lax (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16440359#comment-16440359
 ] 

Reuven Lax commented on BEAM-4077:
--

Every type can be nullable, so this would mean doubling the number of builder 
methods. There already is a builder method that takes in a Field object 
(addField), which technically provides full generality. The others were added 
simply because I found that they increased the ease of use of the builder.
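The three styles under discussion can be sketched side by side (hypothetical builder, not Beam's actual SchemaBuilder): a boolean nullability flag, a separate method per nullability (which doubles the surface), and one fully general method taking the field spec.

```python
# Hypothetical builder showing the three styles from BEAM-4077.
class SchemaBuilder:
    def __init__(self):
        self.fields = []  # (name, type, nullable) tuples

    # Style 1: boolean parameter (the current approach).
    def add_int_field(self, name, nullable=False):
        self.fields.append((name, "INT64", nullable))
        return self

    # Style 2: one method per nullability -- doubles the method count.
    def add_nullable_int_field(self, name):
        return self.add_int_field(name, nullable=True)

    # Style 3: one fully general method taking the field spec (cf. addField).
    def add_field(self, name, type_name, nullable=False):
        self.fields.append((name, type_name, nullable))
        return self
```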

> Refactor builder field nullability
> --
>
> Key: BEAM-4077
> URL: https://issues.apache.org/jira/browse/BEAM-4077
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Priority: Major
>
> Currently the Schema builder methods take a boolean for nullability. It would 
> be more standard to have separate builder methods. At this point the builder 
> might as well just take the Field spec since it does not add concision.





Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #106

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Update Dataflow Development Container Version

[swegner] Fix a typo in gradle task group

[ehudm] Normalize Filesystems.match() glob behavior.

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Add region to dataflowOptions as well.

[tgroh] Use Explicit PipelineOptions in Native Evaluators

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

[amyrvold] Fix failing nightly release build

--
[...truncated 63.95 KB...]
fi
basename "$VIRTUAL_ENV"

# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from 

Jenkins build is back to normal : beam_PostRelease_NightlySnapshot #200

2018-04-16 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1376

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 4.43 KB...]

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting httplib2<0.10,>=0.8 (from apache-beam==2.5.0.dev0)

Build failed in Jenkins: beam_PostCommit_Python_Verify #4714

2018-04-16 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6782e87c1f7d1d13f224df8369a7a14315556e4b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6782e87c1f7d1d13f224df8369a7a14315556e4b
Commit message: "Merge pull request #5024: [BEAM-4011] Unify Python IO glob 
implementation"
 > git rev-list --no-walk 6782e87c1f7d1d13f224df8369a7a14315556e4b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins6072468480999693103.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 
1.4.0 which is incompatible.

# Tox runs unit tests in a virtual environment
${LOCAL_PATH}/tox -e ALL -c sdks/python/tox.ini
GLOB sdist-make: 

ERROR: invocation failed (exit code 

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #93

2018-04-16 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #116

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Normalize Filesystems.match() glob behavior.

--
[...truncated 27.88 MB...]
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 1 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 2 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 3 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 4 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 5 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 6 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 7 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 8 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 9 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 12 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 13 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 14 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 15 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 16 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 17 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 18 sending EndOfStream
Apr 17, 2018 3:08:33 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 19 sending EndOfStream
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [16, 17, 18, 19, 15]
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [10]
Apr 17, 2018 3:08:34 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 3:08:34 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@6a7be092identifier=tcp://localhost:50061/10.output.9, 
upstream=10.output.9, group=stream1/11.inputPort, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@27bf2c98{da=com.datatorrent.bufferserver.internal.DataList$Block@40dfa2a{identifier=10.output.9,
 data=1048576, readingOffset=0, writingOffset=77, 
starting_window=5ad5652e0001, ending_window=5ad5652e0002, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@31c7f549 {10.output.9}
Apr 17, 2018 3:08:34 AM 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #114

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Normalize Filesystems.match() glob behavior.

--
[...truncated 1.24 MB...]
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testFirstElementLate(CreateStreamTest.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 

[jira] [Work logged] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?focusedWorklogId=91577&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91577
 ]

ASF GitHub Bot logged work on BEAM-4038:


Author: ASF GitHub Bot
Created on: 17/Apr/18 02:42
Start Date: 17/Apr/18 02:42
Worklog Time Spent: 10m 
  Work Description: rangadi commented on a change in pull request #5111: 
[BEAM-4038] Support Kafka Headers in KafkaIO
URL: https://github.com/apache/beam/pull/5111#discussion_r181938653
 
 

 ##
 File path: 
sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaHeader.java
 ##
 @@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.kafka;
+
+/**
+ * This is a copy of Kafka's {@link org.apache.kafka.common.header.Header}. 
Included here in order
+ * to support older Kafka versions (0.9.x).
+ */
+public interface KafkaHeader {
 
 Review comment:
   I see. That is correct, compilation error is fine. All the kafka-client 
versions before 0.10.1 are already deprecated [1]. If a test wants to test 
kafka headers, it probably makes sense to depend on a kafka-client version that 
has headers. 
   
   
[1]https://github.com/apache/beam/blob/master/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java#L624


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91577)
Time Spent: 2h 10m  (was: 2h)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Raghu Angadi
>Priority: Minor
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?focusedWorklogId=91576&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91576
 ]

ASF GitHub Bot logged work on BEAM-4038:


Author: ASF GitHub Bot
Created on: 17/Apr/18 02:30
Start Date: 17/Apr/18 02:30
Worklog Time Spent: 10m 
  Work Description: gkumar7 commented on a change in pull request #5111: 
[BEAM-4038] Support Kafka Headers in KafkaIO
URL: https://github.com/apache/beam/pull/5111#discussion_r181937184
 
 

 ##
 File path: 
sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaHeader.java
 ##
 @@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.kafka;
+
+/**
+ * This is a copy of Kafka's {@link org.apache.kafka.common.header.Header}. 
Included here in order
+ * to support older Kafka versions (0.9.x).
+ */
+public interface KafkaHeader {
 
 Review comment:
   I have removed the additional classes. Yes, I understand that master won't 
compile, but I am referring to compiling my test program. What I mean is that 
if the test program (which depends on beam-kafka) calls 
KafkaRecord.getHeaders(), it raises a compile error. Is that what we want? 
   
   It seems a bit counter-intuitive to expose a method such as getHeaders() and 
then not be able to compile the *test* program that uses it.




Issue Time Tracking
---

Worklog Id: (was: 91576)
Time Spent: 2h  (was: 1h 50m)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Raghu Angadi
>Priority: Minor
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #92

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[amyrvold] Fix failing nightly release build

--
[...truncated 19.51 MB...]
INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write 
mutations to Spanner as step s30
Apr 17, 2018 2:25:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0417022504-8608dc60/output/results/staging/
Apr 17, 2018 2:25:14 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <80355 bytes, hash S-lwq3pgWW0EDtjL9NNwCQ> to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0417022504-8608dc60/output/results/staging/pipeline-S-lwq3pgWW0EDtjL9NNwCQ.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 17, 2018 2:25:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-16_19_25_14-17247909770597229751?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
Submitted job: 2018-04-16_19_25_14-17247909770597229751

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 17, 2018 2:25:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-16_19_25_14-17247909770597229751
Apr 17, 2018 2:25:16 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-16_19_25_14-17247909770597229751 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 17, 2018 2:25:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:16.762Z: Autoscaling: Resized worker pool from 1 to 
0.
Apr 17, 2018 2:25:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:16.802Z: Autoscaling: Would further reduce the 
number of workers but reached the minimum number allowed for the job.
Apr 17, 2018 2:25:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:16.848Z: Worker pool stopped.
Apr 17, 2018 2:25:24 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-16_19_22_28-16444588825548488005 finished with status 
DONE.
Apr 17, 2018 2:25:24 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
checkForPAssertSuccess
INFO: Success result for Dataflow job 
2018-04-16_19_22_28-16444588825548488005. Found 0 success, 0 failures out of 0 
expected assertions.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:14.982Z: Autoscaling is enabled for job 
2018-04-16_19_25_14-17247909770597229751. The number of workers will be between 
1 and 1000.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:15.007Z: Autoscaling was automatically enabled for 
job 2018-04-16_19_25_14-17247909770597229751.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:17.816Z: Checking required Cloud APIs are enabled.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:18.006Z: Checking permissions granted to controller 
Service Account.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:22.211Z: Worker configuration: n1-standard-1 in 
us-central1-f.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:22.674Z: Expanding CoGroupByKey operations into 
optimizable parts.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:22.847Z: Expanding GroupByKey operations into 
optimizable parts.
Apr 17, 2018 2:25:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-17T02:25:22.882Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Apr 17, 2018 2:25:26 AM 

[jira] [Work logged] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?focusedWorklogId=91575&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91575
 ]

ASF GitHub Bot logged work on BEAM-4038:


Author: ASF GitHub Bot
Created on: 17/Apr/18 02:28
Start Date: 17/Apr/18 02:28
Worklog Time Spent: 10m 
  Work Description: gkumar7 commented on a change in pull request #5111: 
[BEAM-4038] Support Kafka Headers in KafkaIO
URL: https://github.com/apache/beam/pull/5111#discussion_r181937184
 
 

 ##
 File path: 
sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaHeader.java
 ##
 @@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.kafka;
+
+/**
+ * This is a copy of Kafka's {@link org.apache.kafka.common.header.Header}. 
Included here in order
+ * to support older Kafka versions (0.9.x).
+ */
+public interface KafkaHeader {
 
 Review comment:
   Yes, I understand. I have removed the additional classes. What I mean to say 
is if the test program (which depends on beam-kafka) calls 
KafkaRecord.getHeaders(), this raises a compile error. Is this what we want? 
   
   It seems a bit counter-intuitive to expose a method such as getHeaders() and 
not be able to compile the program when using it.




Issue Time Tracking
---

Worklog Id: (was: 91575)
Time Spent: 1h 50m  (was: 1h 40m)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Raghu Angadi
>Priority: Minor
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1375

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Normalize Filesystems.match() glob behavior.

--
[...truncated 4.46 KB...]

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  

Build failed in Jenkins: beam_PostCommit_Python_Verify #4713

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[ehudm] Normalize Filesystems.match() glob behavior.

--
Started by GitHub push by chamikaramj
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6782e87c1f7d1d13f224df8369a7a14315556e4b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6782e87c1f7d1d13f224df8369a7a14315556e4b
Commit message: "Merge pull request #5024: [BEAM-4011] Unify Python IO glob 
implementation"
 > git rev-list --no-walk 4014b48f5aed660aee57d013e7b8b47208b28a6f # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins7556557078007985196.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 
1.4.0 which is incompatible.

# Tox runs unit tests in a virtual environment
${LOCAL_PATH}/tox -e ALL -c sdks/python/tox.ini
GLOB sdist-make: 

[jira] [Work logged] (BEAM-4011) Python SDK: add glob support for HDFS

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4011?focusedWorklogId=91574&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91574
 ]

ASF GitHub Bot logged work on BEAM-4011:


Author: ASF GitHub Bot
Created on: 17/Apr/18 02:20
Start Date: 17/Apr/18 02:20
Worklog Time Spent: 10m 
  Work Description: chamikaramj closed pull request #5024: [BEAM-4011] 
Unify Python IO glob implementation.
URL: https://github.com/apache/beam/pull/5024
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/sdks/python/apache_beam/io/filesystem.py b/sdks/python/apache_beam/io/filesystem.py
index 3f7e9aba847..929b7ef5607 100644
--- a/sdks/python/apache_beam/io/filesystem.py
+++ b/sdks/python/apache_beam/io/filesystem.py
@@ -21,8 +21,11 @@
 import abc
 import bz2
 import cStringIO
+import fnmatch
 import logging
 import os
+import posixpath
+import re
 import time
 import zlib
 
@@ -498,9 +501,49 @@ def mkdirs(self, path):
     raise NotImplementedError
 
   @abc.abstractmethod
+  def has_dirs(self):
+    """Whether this FileSystem supports directories."""
+    raise NotImplementedError
+
+  @abc.abstractmethod
+  def _list(self, dir_or_prefix):
+    """List files in a location.
+
+    Listing is non-recursive (for filesystems that support directories).
+
+    Args:
+      dir_or_prefix: (string) A directory or location prefix (for filesystems
+        that don't have directories).
+
+    Returns:
+      Generator of ``FileMetadata`` objects.
+
+    Raises:
+      ``BeamIOError`` if listing fails, but not if no files were found.
+    """
+    raise NotImplementedError
+
+  @staticmethod
+  def _url_dirname(url_or_path):
+    """Like posixpath.dirname, but preserves scheme:// prefix.
+
+    Args:
+      url_or_path: A string in the form of scheme://some/path OR /some/path.
+    """
+    match = re.match(r'([a-z]+://)(.*)', url_or_path)
+    if match is None:
+      return posixpath.dirname(url_or_path)
+    url_prefix, path = match.groups()
+    return url_prefix + posixpath.dirname(path)
+
   def match(self, patterns, limits=None):
     """Find all matching paths to the patterns provided.
 
+    Pattern matching is done using fnmatch.fnmatch.
+    For filesystems that have directories, matching is not recursive. Patterns
+    like scheme://path/*/foo will not match anything.
+    Patterns ending with '/' will be appended with '*'.
+
     Args:
       patterns: list of string for the file path pattern to match against
       limits: list of maximum number of responses that need to be fetched
@@ -510,7 +553,52 @@ def match(self, patterns, limits=None):
     Raises:
       ``BeamIOError`` if any of the pattern match operations fail
     """
-    raise NotImplementedError
+    if limits is None:
+      limits = [None] * len(patterns)
+    else:
+      err_msg = "Patterns and limits should be equal in length"
+      assert len(patterns) == len(limits), err_msg
+
+    def _match(pattern, limit):
+      """Find all matching paths to the pattern provided."""
+      if pattern.endswith('/'):
+        pattern += '*'
+      # Get the part of the pattern before the first globbing character.
+      # For example scheme://path/foo* will become scheme://path/foo for
+      # filesystems like GCS, or converted to scheme://path for filesystems with
+      # directories.
+      prefix_or_dir = re.match('^[^[*?]*', pattern).group(0)
+
+      file_metadatas = []
+      if prefix_or_dir == pattern:
+        # Short-circuit calling self.list() if there's no glob pattern to match.
+        if self.exists(pattern):
+          file_metadatas = [FileMetadata(pattern, self.size(pattern))]
+      else:
+        if self.has_dirs():
+          prefix_or_dir = self._url_dirname(prefix_or_dir)
+        file_metadatas = self._list(prefix_or_dir)
+
+      metadata_list = []
+      for file_metadata in file_metadatas:
+        if limit is not None and len(metadata_list) >= limit:
+          break
+        if fnmatch.fnmatch(file_metadata.path, pattern):
+          metadata_list.append(file_metadata)
+
+      return MatchResult(pattern, metadata_list)
+
+    exceptions = {}
+    result = []
+    for pattern, limit in zip(patterns, limits):
+      try:
+        result.append(_match(pattern, limit))
+      except Exception as e:  # pylint: disable=broad-except
+        exceptions[pattern] = e
+
+    if exceptions:
+      raise BeamIOError("Match operation failed", exceptions)
+    return result
 
   @abc.abstractmethod
   def create(self, path, mime_type='application/octet-stream',
@@ -579,6 +667,19 @@ def exists(self, path):
     raise NotImplementedError
 
   @abc.abstractmethod
+  def size(self, path):
+    """Get 
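
The glob strategy in the diff above — take the non-glob prefix of the pattern, list that prefix (or its parent directory on filesystems with directories), then filter candidates with fnmatch — can be sketched standalone. This is an illustrative sketch, not Beam's API: `FILES`, `url_dirname`, and `match` here stand in for a filesystem listing and for Beam's `_url_dirname`/`match` methods.

```python
import fnmatch
import posixpath
import re

# Illustrative stand-in for a filesystem listing; not Beam's API.
FILES = [
    "gs://bucket/path/foo1", "gs://bucket/path/foo2",
    "gs://bucket/path/bar", "gs://bucket/other/foo3",
]

def url_dirname(url_or_path):
    # Like posixpath.dirname, but keeps a scheme:// prefix intact.
    m = re.match(r'([a-z]+://)(.*)', url_or_path)
    if m is None:
        return posixpath.dirname(url_or_path)
    prefix, path = m.groups()
    return prefix + posixpath.dirname(path)

def match(pattern, paths=FILES):
    if pattern.endswith('/'):
        pattern += '*'
    # Non-glob prefix: everything before the first *, ?, or [.
    prefix = re.match(r'^[^[*?]*', pattern).group(0)
    if prefix == pattern:
        # No glob characters: plain exact-path lookup.
        return [p for p in paths if p == pattern]
    # List everything under the prefix's directory, then filter by pattern.
    candidates = [p for p in paths if p.startswith(url_dirname(prefix))]
    return [p for p in candidates if fnmatch.fnmatch(p, pattern)]

print(match("gs://bucket/path/foo*"))
# -> ['gs://bucket/path/foo1', 'gs://bucket/path/foo2']
```

Note that fnmatch's `*` also matches `/`, which is why the docstring in the diff warns that matching is not recursive only on filesystems with real directories, where listing itself is non-recursive.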

[beam] branch master updated (4014b48 -> 6782e87)

2018-04-16 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 4014b48  Merge pull request #5142 from alanmyrvold/alan-rel-failure
 add 36a2506  Normalize Filesystems.match() glob behavior.
 new 6782e87  Merge pull request #5024: [BEAM-4011] Unify Python IO glob 
implementation

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/io/filesystem.py   | 103 +++-
 sdks/python/apache_beam/io/filesystem_test.py  | 156 ++
 sdks/python/apache_beam/io/filesystems.py  |   5 +
 sdks/python/apache_beam/io/gcp/gcsfilesystem.py|  65 
 .../apache_beam/io/gcp/gcsfilesystem_test.py   |  23 ++-
 sdks/python/apache_beam/io/gcp/gcsio.py|  66 ++--
 sdks/python/apache_beam/io/gcp/gcsio_test.py   | 179 +++--
 sdks/python/apache_beam/io/hadoopfilesystem.py |  53 ++
 .../python/apache_beam/io/hadoopfilesystem_test.py |  19 ++-
 .../io/hdfs_integration_test/Dockerfile|   2 +-
 sdks/python/apache_beam/io/localfilesystem.py  |  68 
 sdks/python/apache_beam/io/localfilesystem_test.py |  10 ++
 12 files changed, 415 insertions(+), 334 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] 01/01: Merge pull request #5024: [BEAM-4011] Unify Python IO glob implementation

2018-04-16 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 6782e87c1f7d1d13f224df8369a7a14315556e4b
Merge: 4014b48 36a2506
Author: Chamikara Jayalath 
AuthorDate: Mon Apr 16 19:20:33 2018 -0700

Merge pull request #5024: [BEAM-4011] Unify Python IO glob implementation

 sdks/python/apache_beam/io/filesystem.py   | 103 +++-
 sdks/python/apache_beam/io/filesystem_test.py  | 156 ++
 sdks/python/apache_beam/io/filesystems.py  |   5 +
 sdks/python/apache_beam/io/gcp/gcsfilesystem.py|  65 
 .../apache_beam/io/gcp/gcsfilesystem_test.py   |  23 ++-
 sdks/python/apache_beam/io/gcp/gcsio.py|  66 ++--
 sdks/python/apache_beam/io/gcp/gcsio_test.py   | 179 +++--
 sdks/python/apache_beam/io/hadoopfilesystem.py |  53 ++
 .../python/apache_beam/io/hadoopfilesystem_test.py |  19 ++-
 .../io/hdfs_integration_test/Dockerfile|   2 +-
 sdks/python/apache_beam/io/localfilesystem.py  |  68 
 sdks/python/apache_beam/io/localfilesystem_test.py |  10 ++
 12 files changed, 415 insertions(+), 334 deletions(-)


-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #115

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[amyrvold] Fix failing nightly release build

--
[...truncated 27.71 MB...]
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 1 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 2 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 3 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 4 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 5 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 6 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 7 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 8 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 9 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 12 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 13 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 14 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 15 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 16 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 17 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 18 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 19 sending EndOfStream
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [7]
Apr 17, 2018 2:02:12 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 2:02:12 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@5fa2ae1cidentifier=tcp://localhost:55133/7.output.7, 
upstream=7.output.7, group=stream11/8.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@524521f3{da=com.datatorrent.bufferserver.internal.DataList$Block@6682fcbd{identifier=7.output.7,
 data=1048576, readingOffset=0, writingOffset=237, 
starting_window=5ad555a10001, ending_window=5ad555a10007, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@30799191 {7.output.7}
Apr 17, 2018 2:02:13 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 2:02:13 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [10]
Apr 17, 2018 2:02:13 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 2:02:13 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@2fb72713identifier=tcp://localhost:55133/10.output.9, 
upstream=10.output.9, group=stream0/11.inputPort, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@26e0a732{da=com.datatorrent.bufferserver.internal.DataList$Block@5d5150a4{identifier=10.output.9,
 data=1048576, readingOffset=0, writingOffset=77, 
starting_window=5ad555a10001, ending_window=5ad555a10002, refCount=2, 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #113

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[amyrvold] Fix failing nightly release build

--
[...truncated 1.25 MB...]
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testFirstElementLate(CreateStreamTest.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 
org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at 
org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at 
org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 

[jira] [Work logged] (BEAM-3942) Update performance testing framework to use Gradle.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3942?focusedWorklogId=91568&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91568
 ]

ASF GitHub Bot logged work on BEAM-3942:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:44
Start Date: 17/Apr/18 01:44
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on a change in pull request 
#5003: [BEAM-3942] Update performance testing framework to use Gradle
URL: https://github.com/apache/beam/pull/5003#discussion_r181932094
 
 

 ##
 File path: build_rules.gradle
 ##
 @@ -837,6 +837,179 @@ ext.applyJavaNature = {
   }
 }
 
+// Reads and contains all necessary performance test parameters.
+class JavaPerformanceTestConfiguration {
+
+  /* Optional properties (set only if needed in your case): */
+
+  // Path to the PerfKitBenchmarker application (pkb.py).
+  // It is only required when running performance tests with PerfKitBenchmarker.
+  String pkbLocation = System.getProperty('pkbLocation')
+
+  // Data processing backend's log level.
+  String logLevel = System.getProperty('logLevel', 'INFO')
+
+  // Path to the gradle binary.
+  String gradleBinary = System.getProperty('gradleBinary', './gradlew')
+
+  // Whether or not the benchmark is official.
+  // Official benchmark results are meant to be displayed on PerfKitExplorer dashboards.
+  String isOfficial = System.getProperty('official', 'false')
+
+  // Names of the benchmarks to be run by PerfKitBenchmarker.
+  String benchmarks = System.getProperty('benchmarks', 'beam_integration_benchmark')
+
+  // If Beam is not "prebuilt", PerfKitBenchmarker runs the build task before running the tests.
+  String beamPrebuilt = System.getProperty('beamPrebuilt', 'true')
+
+  // Beam SDK to be used by PerfKitBenchmarker.
+  String beamSdk = System.getProperty('beamSdk', 'java')
+
+  // Timeout (in seconds) after which PerfKitBenchmarker will stop executing the benchmark (and will fail).
+  String timeout = System.getProperty('itTimeout', '1200')
+
+  // Path to the kubernetes configuration file.
+  String kubeconfig = System.getProperty('kubeconfig', System.getProperty('user.home') + '/.kube/config')
+
+  // Path to the kubernetes executable.
+  String kubectl = System.getProperty('kubectl', 'kubectl')
+
+  // Paths to files with kubernetes infrastructure to set up before the test runs.
+  // PerfKitBenchmarker will have trouble reading a 'null' path; it expects an empty string when no scripts are provided.
+  String kubernetesScripts = System.getProperty('kubernetesScripts', '')
+
+  // Path to a file with 'dynamic' and 'static' pipeline options
+  // that will be appended by PerfKitBenchmarker to the test running command.
+  // PerfKitBenchmarker will have trouble reading a 'null' path; it expects an empty string when no config file is provided.
+  String optionsConfigFile = System.getProperty('beamITOptions', '')
+
+  // Any additional properties to be appended to the benchmark execution command.
+  String extraProperties = System.getProperty('beamExtraProperties', '')
+
+  // Runner which will be used for running the tests. Possible values: dataflow/direct.
+  // PerfKitBenchmarker will have trouble reading a 'null' value; it expects an empty string when no runner is provided.
+  String runner = System.getProperty('integrationTestRunner', '')
+
+  // Filesystem which will be used for running the tests. Possible values: hdfs.
+  // If not specified, the runner's local filesystem will be used.
+  String filesystem = System.getProperty('filesystem')
+
+  /* Always required properties: */
+
+  // Pipeline options to be used by the tested pipeline.
+  String integrationTestPipelineOptions = System.getProperty('integrationTestPipelineOptions')
+
+  // Fully qualified name of the test to be run, e.g.
+  // 'org.apache.beam.sdks.java.io.jdbc.JdbcIOIT'.
+  String integrationTest = System.getProperty('integrationTest')
+
+  // Relative path to the module where the test is, e.g. 'sdks/java/io/jdbc'.
+  String itModule = System.getProperty('itModule')
+}
+
+// When applied in a module's build.gradle file, this closure provides a task for running
+// IO integration tests (manually, without PerfKitBenchmarker).
+ext.enableJavaPerformanceTesting = {
+  println "enableJavaPerformanceTesting with ${it ? "$it" : "default configuration"} for project ${project.name}"
+
+  // Use the implicit it parameter of the closure to handle zero-argument or one-argument map calls.
+  // See: http://groovy-lang.org/closures.html#implicit-it
+  JavaPerformanceTestConfiguration configuration = it ? it as JavaPerformanceTestConfiguration : new JavaPerformanceTestConfiguration()
+
+  // Task for running integration tests.
+  task integrationTest(type: Test) {
+    include "**/*IT.class"
+    systemProperties.beamTestPipelineOptions = configuration.integrationTestPipelineOptions
+  }
+}
+
+// When applied in a module's build.gradle file, 
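The configuration class in the quoted hunk leans entirely on one JVM idiom: each field reads a system property (passed to Gradle as a `-D` flag) and falls back to a default when the property is unset. A minimal standalone sketch of that pattern (illustrative only, not Beam code; the class and property names here are hypothetical):

```java
// Demonstrates the System.getProperty pattern used by
// JavaPerformanceTestConfiguration: read a JVM system property,
// falling back to a default when it is not set.
public class PerfConfigDemo {
    static String readOrDefault(String key, String fallback) {
        return System.getProperty(key, fallback);
    }

    public static void main(String[] args) {
        // Simulate passing `-DlogLevel=DEBUG` on the command line.
        System.setProperty("logLevel", "DEBUG");
        System.out.println(readOrDefault("logLevel", "INFO"));  // property set: DEBUG
        System.out.println(readOrDefault("itTimeout", "1200")); // unset: default 1200
    }
}
```

Invoked through Gradle, the same properties would arrive as `-D` flags on the command line, e.g. `./gradlew integrationTest -DintegrationTestPipelineOptions=...`.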

[jira] [Work logged] (BEAM-3942) Update performance testing framework to use Gradle.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3942?focusedWorklogId=91569&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91569
 ]

ASF GitHub Bot logged work on BEAM-3942:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:44
Start Date: 17/Apr/18 01:44
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on a change in pull request 
#5003: [BEAM-3942] Update performance testing framework to use Gradle
URL: https://github.com/apache/beam/pull/5003#discussion_r181932097
 
 

 ##
 File path: build_rules.gradle
 ##
 @@ -837,6 +837,179 @@ ext.applyJavaNature = {
   }
 }
 

[jira] [Work logged] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?focusedWorklogId=91566&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91566
 ]

ASF GitHub Bot logged work on BEAM-4038:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:42
Start Date: 17/Apr/18 01:42
Worklog Time Spent: 10m 
  Work Description: rangadi commented on a change in pull request #5111: 
[BEAM-4038] Support Kafka Headers in KafkaIO
URL: https://github.com/apache/beam/pull/5111#discussion_r181931824
 
 

 ##
 File path: 
sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaHeader.java
 ##
 @@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.kafka;
+
+/**
+ * This is a copy of Kafka's {@link org.apache.kafka.common.header.Header}. 
Included here in order
+ * to support older Kafka versions (0.9.x).
+ */
+public interface KafkaHeader {
 
 Review comment:
   As noted in the previous two comments, we don't need to support compiling with 
0.9. Even master would not compile with 0.9. We can require 0.11.x 
(whichever version first included headers) for compilation and to run tests.
   If you get the code working with the default version of kafka-clients in Beam 
(I think 1.0.x), that is good enough. During the review we can check whether it 
would work with older versions of kafka-clients at runtime.
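Since record headers only appeared in kafka-clients 0.11.0 (KIP-82), runtime compatibility with older clients typically comes down to probing the classpath. A hypothetical sketch of such a probe (illustrative only; not how KafkaIO actually handles this):

```java
// Standalone sketch: detect at runtime whether the kafka-clients jar on the
// classpath provides record headers (org.apache.kafka.common.header.Headers,
// introduced in 0.11.0 via KIP-82). Compiles and runs without kafka-clients
// present, since the class is only referenced by name.
public class HeadersSupportCheck {
    static boolean headersAvailable() {
        try {
            Class.forName("org.apache.kafka.common.header.Headers");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // With kafka-clients 0.9.x/0.10.x (or none at all) on the classpath,
        // this reports that headers are unsupported.
        System.out.println(headersAvailable()
                ? "headers supported"
                : "headers unsupported");
    }
}
```

A caller could use such a check to degrade gracefully (e.g. return an empty headers view) instead of throwing `NoClassDefFoundError` on older clients.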


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91566)
Time Spent: 1h 40m  (was: 1.5h)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Raghu Angadi
>Priority: Minor
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?focusedWorklogId=91564&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91564
 ]

ASF GitHub Bot logged work on BEAM-4038:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:27
Start Date: 17/Apr/18 01:27
Worklog Time Spent: 10m 
  Work Description: gkumar7 commented on a change in pull request #5111: 
[BEAM-4038] Support Kafka Headers in KafkaIO
URL: https://github.com/apache/beam/pull/5111#discussion_r181930163
 
 

 ##
 File path: 
sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaHeader.java
 ##
 @@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.kafka;
+
+/**
+ * This is a copy of Kafka's {@link org.apache.kafka.common.header.Header}. 
Included here in order
+ * to support older Kafka versions (0.9.x).
+ */
+public interface KafkaHeader {
 
 Review comment:
   Hmm, when adding ```Headers``` to ```KafkaRecord```, I was not able to 
instantiate ```KafkaRecord``` in my test program. Compilation failed with:
   
   ```
   Error:(8, 5) java: cannot access org.apache.kafka.common.header.Headers
 class file for org.apache.kafka.common.header.Headers not found
   ```
   Tried with both JDK 1.7 and 1.8. This is using kafka-clients 0.9.0.0.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91564)
Time Spent: 1.5h  (was: 1h 20m)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Raghu Angadi
>Priority: Minor
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1374

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[amyrvold] Fix failing nightly release build

--
[...truncated 4.39 KB...]
alias pydoc 2>/dev/null >/dev/null && unalias pydoc

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4712

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[amyrvold] Fix failing nightly release build

--
Started by GitHub push by aaltay
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4014b48f5aed660aee57d013e7b8b47208b28a6f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4014b48f5aed660aee57d013e7b8b47208b28a6f
Commit message: "Merge pull request #5142 from alanmyrvold/alan-rel-failure"
 > git rev-list --no-walk e1c526d7f88add0aa44635a56223a4907b81e86b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins3848557500709690464.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 
1.4.0 which is incompatible.
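The conflict pip reports above can be checked mechanically. A minimal sketch (not part of the build; the range comes from the `grpcio<2,>=1.8` requirement in the error line, and the simplified comparison looks at major.minor only, ignoring pre-release tags):

```python
def satisfies_grpcio_range(version: str) -> bool:
    """Return True when `version` falls inside apache-beam's declared
    grpcio range [1.8, 2.0); compares major.minor components only."""
    major, minor = (int(p) for p in version.split(".")[:2])
    return (1, 8) <= (major, minor) < (2, 0)

print(satisfies_grpcio_range("1.4.0"))  # False: the incompatibility reported above
print(satisfies_grpcio_range("1.8.0"))  # True
```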

# Tox runs unit tests in a virtual environment
${LOCAL_PATH}/tox -e ALL -c sdks/python/tox.ini
GLOB sdist-make: 

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #91

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 18.95 MB...]
INFO: Uploading 
/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/paranamer-2.7-VweilzYySf_-OOgYnNb5yw.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/http-client/google-http-client/1.22.0/google-http-client-1.22.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/google-http-client-1.22.0-q8VPJBnjIy5tDHYvqlr2Jw.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/commons-logging-1.2-BAtLTY6siG9rSio70vMbAA.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.16.1/commons-compress-1.16.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/commons-compress-1.16.1-NAljjWtr0jBC7qxc2X4lbA.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-context/1.6.1/9c52ae577c2dd4b8c6ac6e49c1154e1dc37d98ee/grpc-context-1.6.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/grpc-context-1.6.1-JoAKXOzNRjLi-eq0okbqJw.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/jackson-mapper-asl-1.9.13-F1D5wzk1L8S3KNYbVxcWEw.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/snappy-java-1.1.4-SFNwbMuGq13aaoKVzeS1Tw.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/objenesis/objenesis/2.6/objenesis-2.6.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/objenesis-2.6-X_rD9RQFypspFZcKIks-jw.jar
Apr 17, 2018 1:08:57 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar 
to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/kryo-2.21-olkSUBNMYGe-WXkdlwcytQ.jar
Apr 17, 2018 1:08:58 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/api/grpc/proto-google-cloud-spanner-admin-instance-v1/0.1.11/proto-google-cloud-spanner-admin-instance-v1-0.1.11.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/proto-google-cloud-spanner-admin-instance-v1-0.1.11-XJS_u8898MWkoK2CW-gz_A.jar
Apr 17, 2018 1:08:58 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/json/json/20160810/json-20160810.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/json-20160810-L3-Jnwdm5lAXdEpMT8FNRg.jar
Apr 17, 2018 1:08:58 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/squareup/okio/okio/1.6.0/okio-1.6.0.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/okio-1.6.0-Fk0cKMMjz24qkX1gN0xXGA.jar
Apr 17, 2018 1:08:58 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/protobuf/nano/protobuf-javanano/3.0.0-alpha-5/protobuf-javanano-3.0.0-alpha-5.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0417010853-9831a24a/output/results/staging/protobuf-javanano-3.0.0-alpha-5-SsS7fZe7brffHbAGUgRrnw.jar
Apr 17, 2018 1:08:58 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage

[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=91563&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91563
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:10
Start Date: 17/Apr/18 01:10
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #5053: [BEAM-3981] 
Futurize coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-381797547
 
 
   There is a proto generation error in the ValidatesRunner tests, which seems 
to be unrelated to this PR; I also see that error in other PostCommit test 
suites.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91563)
Time Spent: 12h  (was: 11h 50m)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 12h
>  Remaining Estimate: 0h
>
> Run automatic conversion with the futurize tool on the coders subpackage and 
> fix Python 2 compatibility. This prepares the subpackage for Python 3 support.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-4069) Empty pipeline options can be gracefully serialized/deserialized

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4069?focusedWorklogId=91562&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91562
 ]

ASF GitHub Bot logged work on BEAM-4069:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:04
Start Date: 17/Apr/18 01:04
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on a change in pull request #5126: 
[BEAM-4069] Gracefully deserialize empty options structs
URL: https://github.com/apache/beam/pull/5126#discussion_r181926271
 
 

 ##
 File path: 
sdks/java/core/src/main/java/org/apache/beam/sdk/options/ProxyInvocationHandler.java
 ##
 @@ -732,14 +732,22 @@ private void ensureSerializable(
 @Override
 public PipelineOptions deserialize(JsonParser jp, DeserializationContext 
ctxt)
 throws IOException, JsonProcessingException {
-  ObjectNode objectNode = (ObjectNode) jp.readValueAsTree();
-  ObjectNode optionsNode = (ObjectNode) objectNode.get("options");
+  ObjectNode objectNode = jp.readValueAsTree();
+  JsonNode rawOptionsNode = objectNode.get("options");
+  ObjectNode optionsNode;
+  if (rawOptionsNode == null || rawOptionsNode.isNull()) {
 
 Review comment:
   As it turns out, the second if statement is completely redundant. ;)
   
   I wasn't confident about the equivalence between an empty 
`ProxyInvocationHandler` reconstruction and `PipelineOptionsFactory.create()`, 
so I've left that as it was.
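   The defensive read in the diff above can be sketched outside Jackson. A 
minimal Python analogue (hypothetical illustration, not the Beam code; 
`dict.get` returns `None` for both an absent key and JSON `null`, mirroring 
the `rawOptionsNode == null || rawOptionsNode.isNull()` check):

```python
import json

def options_node(payload: str) -> dict:
    """Read the "options" object from a serialized pipeline-options payload,
    treating a missing or explicitly-null field as an empty object."""
    raw = json.loads(payload).get("options")
    return {} if raw is None else raw

print(options_node('{}'))                     # {}
print(options_node('{"options": null}'))      # {}
print(options_node('{"options": {"a": 1}}'))  # {'a': 1}
```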




Issue Time Tracking
---

Worklog Id: (was: 91562)
Time Spent: 1h  (was: 50m)

> Empty pipeline options can be gracefully serialized/deserialized
> 
>
> Key: BEAM-4069
> URL: https://issues.apache.org/jira/browse/BEAM-4069
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Ben Sidhom
>Assignee: Ben Sidhom
>Priority: Minor
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> PipelineOptionsTranslation.fromProto currently crashes with a 
> NullPointerException when passed an empty options Struct. This is due to 
> ProxyInvocationHandler.Deserializer expecting a non-empty enclosing Struct.
> Empty pipeline options may be passed by SDKs interacting with a job server, 
> so this case needs to be handled. Note that testing a round-trip of an 
> effectively-empty Java PipelineOptions object is not sufficient to catch this 
> because "empty" Java options still contain default fields not defined in 
> other SDKs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=91561&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91561
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 17/Apr/18 01:02
Start Date: 17/Apr/18 01:02
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #5053: [BEAM-3981] 
Futurize coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-381796316
 
 
   Actually, looking at the Jenkins logs I see that Jenkins already merges this 
PR with the latest commit on master when we run a PostCommit suite:  
   > Checking out Revision 1b8df077a60fc0188d786a146d5c5edb9eb2732f 
(refs/remotes/origin/pr/5053/merge)
   >  > git config core.sparsecheckout # timeout=10
   >  > git checkout -f 1b8df077a60fc0188d786a146d5c5edb9eb2732f
   > Commit message: "Merge 6909ff3d2b1802c0712d8236692c0c7110d9b250 into 
e1c526d7f88add0aa44635a56223a4907b81e86b"
   > ...
   > 
   




Issue Time Tracking
---

Worklog Id: (was: 91561)
Time Spent: 11h 50m  (was: 11h 40m)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 11h 50m
>  Remaining Estimate: 0h
>
> Run automatic conversion with the futurize tool on the coders subpackage and 
> fix Python 2 compatibility. This prepares the subpackage for Python 3 support.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #114

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 27.84 MB...]
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 1 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 2 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 3 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 4 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 5 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 6 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 7 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 8 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 9 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 12 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 13 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 14 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 15 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 16 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 17 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 18 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 19 sending EndOfStream
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [14]
Apr 17, 2018 1:03:03 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 1:03:03 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@48eec632identifier=tcp://localhost:46879/14.output.14, 
upstream=14.output.14, group=stream17/15.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@59e2638{da=com.datatorrent.bufferserver.internal.DataList$Block@5bf54420{identifier=14.output.14,
 data=1048576, readingOffset=0, writingOffset=245, 
starting_window=5ad547c40001, ending_window=5ad547c40007, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@7e6efa4b {14.output.14}
Apr 17, 2018 1:03:04 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 1:03:04 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [1]
Apr 17, 2018 1:03:04 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 1:03:04 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@1dbc1ff0identifier=tcp://localhost:46879/1.output.1, 
upstream=1.output.1, group=stream6/2.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@50b54d66{da=com.datatorrent.bufferserver.internal.DataList$Block@24ee1eae{identifier=1.output.1,
 data=1048576, readingOffset=0, writingOffset=291, 
starting_window=5ad547c40001, ending_window=5ad547c40007, refCount=2, 

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #90

2018-04-16 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-4098) Handle WindowInto in the Java SDK Harness

2018-04-16 Thread Robert Bradshaw (JIRA)
Robert Bradshaw created BEAM-4098:
-

 Summary: Handle WindowInto in the Java SDK Harness
 Key: BEAM-4098
 URL: https://issues.apache.org/jira/browse/BEAM-4098
 Project: Beam
  Issue Type: Task
  Components: sdk-java-core
Reporter: Robert Bradshaw
Assignee: Kenneth Knowles






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440217#comment-16440217
 ] 

Chamikara Jayalath commented on BEAM-4096:
--

Looks like you haven't been given the Jira contributor role yet. [~kenn] might 
be able to add you.

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-4097) Python SDK should set the environment in the job submission protos

2018-04-16 Thread Robert Bradshaw (JIRA)
Robert Bradshaw created BEAM-4097:
-

 Summary: Python SDK should set the environment in the job 
submission protos
 Key: BEAM-4097
 URL: https://issues.apache.org/jira/browse/BEAM-4097
 Project: Beam
  Issue Type: Task
  Components: sdk-py-core
Reporter: Robert Bradshaw
Assignee: Ahmet Altay






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Ryan McDowell (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440215#comment-16440215
 ] 

Ryan McDowell commented on BEAM-4096:
-

I have a member on my team who is interested in taking it on this week, but I 
don't believe they're on the Jira yet. Feel free to assign it to me and I'll 
update once they're set up.

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Chamikara Jayalath (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chamikara Jayalath reassigned BEAM-4096:


Assignee: (was: Chamikara Jayalath)

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Chamikara Jayalath (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16440214#comment-16440214
 ] 

Chamikara Jayalath commented on BEAM-4096:
--

Ryan, thanks for filing the JIRA. Is this something you hope to work on?

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Assignee: Chamikara Jayalath
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #112

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 1.24 MB...]
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testFirstElementLate(CreateStreamTest.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 
org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at 
org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at 
org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 

[jira] [Updated] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Ryan McDowell (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan McDowell updated BEAM-4096:

Priority: Minor  (was: Major)

> BigQueryIO ValueProvider support for Method and Triggering Frequency
> 
>
> Key: BEAM-4096
> URL: https://issues.apache.org/jira/browse/BEAM-4096
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Ryan McDowell
>Assignee: Chamikara Jayalath
>Priority: Minor
> Fix For: 2.5.0
>
>
> Enhance BigQueryIO to accept ValueProviders for:
>  * withMethod(..)
>  * withTriggeringFrequency(..)
> It would allow Dataflow templates to accept these parameters at runtime 
> instead of being hardcoded. This opens up the ability to create Dataflow 
> templates which allow users to flip back-and-forth between batch and 
> streaming inserts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-4096) BigQueryIO ValueProvider support for Method and Triggering Frequency

2018-04-16 Thread Ryan McDowell (JIRA)
Ryan McDowell created BEAM-4096:
---

 Summary: BigQueryIO ValueProvider support for Method and 
Triggering Frequency
 Key: BEAM-4096
 URL: https://issues.apache.org/jira/browse/BEAM-4096
 Project: Beam
  Issue Type: Improvement
  Components: io-java-gcp
Affects Versions: 2.4.0
Reporter: Ryan McDowell
Assignee: Chamikara Jayalath
 Fix For: 2.5.0


Enhance BigQueryIO to accept ValueProviders for:
 * withMethod(..)
 * withTriggeringFrequency(..)

It would allow Dataflow templates to accept these parameters at runtime instead 
of being hardcoded. This opens up the ability to create Dataflow templates 
which allow users to flip back-and-forth between batch and streaming inserts.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Spark #1599

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 70.49 KB...]
2018-04-17 00:22:35,029 b84ca22d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-17 00:22:52,739 b84ca22d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-17 00:22:56,275 b84ca22d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r6cc77394b09c8ebe_0162d0fb9c9f_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r6cc77394b09c8ebe_0162d0fb9c9f_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r6cc77394b09c8ebe_0162d0fb9c9f_1 ... (0s) Current status: DONE   
2018-04-17 00:22:56,276 b84ca22d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.
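
The schema error above ("Field timestamp has changed type from TIMESTAMP to 
FLOAT") arises because `bq load --autodetect` infers FLOAT for epoch-seconds 
values. One workaround sketch (a hypothetical helper, not part of PerfKit 
Benchmarker): render the field as an ISO-8601 string before loading, which 
schema autodetection should read as TIMESTAMP rather than FLOAT:

```python
from datetime import datetime, timezone

def to_bq_timestamp(epoch_seconds: float) -> str:
    """Format an epoch-seconds value as an ISO-8601 UTC string so that
    BigQuery schema autodetection infers TIMESTAMP instead of FLOAT."""
    return datetime.fromtimestamp(epoch_seconds, tz=timezone.utc).isoformat()

print(to_bq_timestamp(1523924540.0))  # 2018-04-17T00:22:20+00:00
```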

2018-04-17 00:23:23,771 b84ca22d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-17 00:23:27,617 b84ca22d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r57681aa55f838664_0162d0fc163d_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_r57681aa55f838664_0162d0fc163d_1 
... (0s) Current status: RUNNING
  Waiting on 
bqjob_r57681aa55f838664_0162d0fc163d_1 ... (0s) Current status: DONE   
2018-04-17 00:23:27,618 b84ca22d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-17 00:23:47,803 b84ca22d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-17 00:23:52,652 b84ca22d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_rd3172a513f32d26_0162d0fc780b_1': Invalid schema
update. Field timestamp has changed type from TIMESTAMP to FLOAT

STDERR: Upload complete.Waiting on bqjob_rd3172a513f32d26_0162d0fc780b_1 
... (0s) Current status: RUNNING
 Waiting on 
bqjob_rd3172a513f32d26_0162d0fc780b_1 ... (0s) Current status: DONE   
2018-04-17 00:23:52,652 b84ca22d MainThread INFO Retrying exception running 
IssueRetryableCommand: Command returned a non-zero exit code.

2018-04-17 00:24:20,883 b84ca22d MainThread INFO Running: bq load 
--autodetect --source_format=NEWLINE_DELIMITED_JSON 
beam_performance.pkb_results 

2018-04-17 00:24:25,667 b84ca22d MainThread INFO Ran: {bq load --autodetect 
--source_format=NEWLINE_DELIMITED_JSON beam_performance.pkb_results 

  ReturnCode:1
STDOUT: 

BigQuery error in load operation: 

[jira] [Work logged] (BEAM-3952) GreedyStageFuserTest broken

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3952?focusedWorklogId=91559=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91559
 ]

ASF GitHub Bot logged work on BEAM-3952:


Author: ASF GitHub Bot
Created on: 17/Apr/18 00:28
Start Date: 17/Apr/18 00:28
Worklog Time Spent: 10m 
  Work Description: robertwb commented on issue #4995: 
[BEAM-3952][BEAM-3988] Fix GreedyPipelineFuser test
URL: https://github.com/apache/beam/pull/4995#issuecomment-381791260
 
 
   Doesn't this break PTransforms like the "switch" transform that takes two 
inputs and returns exactly one of them (without any subtransforms)?
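The concern in the comment is about composites whose outputs are not produced by any of their subtransforms. A plain-Python sketch of such a "switch" (hypothetical, not Beam's API) makes the shape concrete:

```python
def switch(first, second, take_first):
    """Hypothetical 'switch' composite: forwards one INPUT as its output,
    introducing no subtransforms and producing no new collection."""
    return first if take_first else second

a, b = ["pcoll-A"], ["pcoll-B"]
out = switch(a, b, take_first=False)
assert out is b  # the composite's output IS one of its inputs
```

A fuser that assumes every composite output is produced inside the composite would mis-handle this case.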




Issue Time Tracking
---

Worklog Id: (was: 91559)
Time Spent: 50m  (was: 40m)

> GreedyStageFuserTest broken
> ---
>
> Key: BEAM-3952
> URL: https://issues.apache.org/jira/browse/BEAM-3952
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Ben Sidhom
>Assignee: Thomas Groh
>Priority: Minor
> Fix For: Not applicable
>
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> The materializesWithDifferentEnvConsumer test is currently failing due to a 
> bad assertion. The fused subgraph contains the parDo.out PCollection but the 
> test expects an empty output.





[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91558=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91558
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 17/Apr/18 00:26
Start Date: 17/Apr/18 00:26
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #5144: [BEAM-3327] Rename 
EnvironmentManager to EnvironmentFactory
URL: https://github.com/apache/beam/pull/5144#issuecomment-381790930
 
 
   FYI, the `:beam-sdks-python:setupVirtualenv` task has been failing for a 
while now and appears to be unrelated.




Issue Time Tracking
---

Worklog Id: (was: 91558)
Time Spent: 6h  (was: 5h 50m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 6h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments





[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91557=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91557
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 17/Apr/18 00:26
Start Date: 17/Apr/18 00:26
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #5144: [BEAM-3327] Rename 
EnvironmentManager to EnvironmentFactory
URL: https://github.com/apache/beam/pull/5144#issuecomment-381790930
 
 
   FYI, the `:beam-sdks-python:setupVirtualenv` task has been failing for a while 
now and appears to be unrelated.




Issue Time Tracking
---

Worklog Id: (was: 91557)
Time Spent: 5h 50m  (was: 5h 40m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 5h 50m
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #113

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[github] Update containers at master to newly released beam-master-20180413.

--
[...truncated 28.10 MB...]
INFO: 13 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 14 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 15 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 16 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 17 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 18 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.engine.Node emitEndStream
INFO: 19 sending EndOfStream
Apr 17, 2018 12:23:15 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
[... 18 more identical Journal WAL warnings truncated ...]
Apr 17, 2018 12:23:16 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [1]
Apr 17, 2018 12:23:16 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 12:23:16 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@3ea275faidentifier=tcp://localhost:42634/1.output.1, 
upstream=1.output.1, group=stream14/2.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1e0ac652{da=com.datatorrent.bufferserver.internal.DataList$Block@344824b9{identifier=1.output.1,
 data=1048576, readingOffset=0, writingOffset=306, 
starting_window=5ad53e71, ending_window=5ad53e77, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@24557c08 {1.output.1}
Apr 17, 2018 12:23:16 AM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 17, 2018 12:23:16 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [10]
Apr 17, 2018 12:23:16 AM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 17, 2018 12:23:16 AM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 

[jira] [Work logged] (BEAM-4071) Portable Runner Job API shim

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4071?focusedWorklogId=91556=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91556
 ]

ASF GitHub Bot logged work on BEAM-4071:


Author: ASF GitHub Bot
Created on: 17/Apr/18 00:22
Start Date: 17/Apr/18 00:22
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #5150:  [BEAM-4071] Add 
Portable Runner Job API shim
URL: https://github.com/apache/beam/pull/5150#issuecomment-381790425
 
 
   R: @tgroh 
   CC: @jkff @angoenka




Issue Time Tracking
---

Worklog Id: (was: 91556)
Time Spent: 20m  (was: 10m)

> Portable Runner Job API shim
> 
>
> Key: BEAM-4071
> URL: https://issues.apache.org/jira/browse/BEAM-4071
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Ben Sidhom
>Assignee: Ben Sidhom
>Priority: Minor
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> There needs to be a way to execute Java-SDK pipelines against a portable job 
> server. The job server itself is expected to be started up out-of-band. The 
> "PortableRunner" should take an option indicating the Job API endpoint and 
> defer other runner configurations to the backend itself.
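The description above implies a deliberately thin client: the only runner-side knob is the Job API endpoint, with all other configuration deferred to the backend. A minimal sketch of that option surface (all names hypothetical, not Beam's actual classes):

```python
class PortableOptions:
    """Hypothetical options holder for a portable-runner shim."""
    def __init__(self, job_endpoint, **deferred):
        self.job_endpoint = job_endpoint  # e.g. "localhost:8099" (assumed port)
        self.deferred = deferred          # passed through to the job server as-is

opts = PortableOptions("localhost:8099", parallelism=4)
assert opts.job_endpoint == "localhost:8099"
assert opts.deferred == {"parallelism": 4}  # the shim never interprets these
```

The design choice is that the shim stays agnostic: it validates nothing beyond the endpoint, so any backend-specific option survives unmodified.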





Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #57

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 171.71 KB...]
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy61.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:235)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy60.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy61.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1373

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 4.79 KB...]
basename "$VIRTUAL_ENV"

# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting httplib2<0.10,>=0.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For 

[jira] [Work logged] (BEAM-4071) Portable Runner Job API shim

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4071?focusedWorklogId=91555=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91555
 ]

ASF GitHub Bot logged work on BEAM-4071:


Author: ASF GitHub Bot
Created on: 17/Apr/18 00:18
Start Date: 17/Apr/18 00:18
Worklog Time Spent: 10m 
  Work Description: bsidhom opened a new pull request #5150:  [BEAM-4071] 
Add Portable Runner Job API shim
URL: https://github.com/apache/beam/pull/5150
 
 
   This runner allows clients of the Java SDK to run pipelines against an 
external, portable Job Service at a provided endpoint.
   
   
   




Issue Time Tracking
---

Worklog Id: (was: 91555)
Time Spent: 10m
Remaining Estimate: 0h

> Portable Runner Job API shim
> 
>
> Key: BEAM-4071
> URL: https://issues.apache.org/jira/browse/BEAM-4071
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Ben Sidhom
>Assignee: Ben Sidhom
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> There needs to be a way to execute Java-SDK pipelines against a portable job 
> server. The job server itself is expected to be started up out-of-band. The 
> "PortableRunner" should take an option indicating the Job API endpoint and 
> defer other runner configurations to the backend itself.





Build failed in Jenkins: beam_PostCommit_Python_Verify #4711

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[github] Update containers at master to newly released beam-master-20180413.

--
Started by GitHub push by aaltay
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e1c526d7f88add0aa44635a56223a4907b81e86b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e1c526d7f88add0aa44635a56223a4907b81e86b
Commit message: "Merge pull request #5131 from tvalentyn/patch-7"
 > git rev-list --no-walk e5a17db4f4ccf30548bb18c96277e7afe1b8ae56 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins1458079275236088013.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 
1.4.0 which is incompatible.
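The grpcio conflict above is a plain range check: installed 1.4.0 falls outside the requirement `grpcio<2,>=1.8`. A toy checker makes this concrete (real tools use pip's resolver or the `packaging` library; this hand-rolled parse handles only dotted-integer versions):

```python
def satisfies(version, lower, upper):
    """Check lower <= version < upper for dotted-integer version strings."""
    parse = lambda v: tuple(int(p) for p in v.split("."))
    return parse(lower) <= parse(version) < parse(upper)

assert not satisfies("1.4.0", "1.8", "2")   # the conflict reported in the log
assert satisfies("1.11.0", "1.8", "2")      # the grpcio wheel fetched earlier
```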

# Tox runs unit tests in a virtual environment
${LOCAL_PATH}/tox -e ALL -c sdks/python/tox.ini
GLOB sdist-make: 

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #59

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 55.18 KB...]
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
at 
com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #64

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 64.02 KB...]
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-hadoop-file-system:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-hdfs:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-core:jar:1.9 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding commons-cli:commons-cli:jar:1.2 from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.4 from the shaded jar.
[INFO] Excluding commons-io:commons-io:jar:2.4 from the shaded jar.
[INFO] Excluding commons-lang:commons-lang:jar:2.6 from the shaded jar.
[INFO] Excluding commons-logging:commons-logging:jar:1.1.3 from the shaded jar.
[INFO] Excluding commons-daemon:commons-daemon:jar:1.0.13 from the shaded jar.
[INFO] Excluding log4j:log4j:jar:1.2.17 from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding javax.servlet:servlet-api:jar:2.5 from the shaded jar.
[INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
[INFO] Excluding io.netty:netty-all:jar:4.0.23.Final from the shaded jar.
[INFO] Excluding xerces:xercesImpl:jar:2.9.1 from the shaded jar.
[INFO] Excluding xml-apis:xml-apis:jar:1.3.04 from the shaded jar.
[INFO] Excluding org.apache.htrace:htrace-core:jar:3.1.0-incubating from the 
shaded jar.
[INFO] Excluding org.fusesource.leveldbjni:leveldbjni-all:jar:1.8 from the 
shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-client:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-common:jar:2.7.3 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-math3:jar:3.1.1 from the shaded jar.
[INFO] Excluding commons-httpclient:commons-httpclient:jar:3.1 from the shaded 
jar.
[INFO] Excluding commons-net:commons-net:jar:3.1 from the shaded jar.
[INFO] Excluding commons-collections:commons-collections:jar:3.2.2 from the 
shaded jar.
[INFO] Excluding javax.servlet.jsp:jsp-api:jar:2.1 from the shaded jar.
[INFO] Excluding commons-configuration:commons-configuration:jar:1.6 from the 
shaded jar.
[INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded 
jar.
[INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the 
shaded jar.
[INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.7.10 from the shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.2.4 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-auth:jar:2.7.3 from the shaded jar.
[INFO] Excluding 
org.apache.directory.server:apacheds-kerberos-codec:jar:2.0.0-M15 from the 
shaded jar.
[INFO] Excluding org.apache.directory.server:apacheds-i18n:jar:2.0.0-M15 from 
the shaded jar.
[INFO] Excluding org.apache.directory.api:api-asn1-api:jar:1.0.0-M20 from the 
shaded jar.
[INFO] Excluding org.apache.directory.api:api-util:jar:1.0.0-M20 from the 
shaded jar.
[INFO] Excluding org.apache.curator:curator-framework:jar:2.7.1 from the shaded 
jar.
[INFO] Excluding org.apache.curator:curator-client:jar:2.7.1 from the shaded 
jar.
[INFO] Excluding org.apache.curator:curator-recipes:jar:2.7.1 from the shaded 
jar.
[INFO] Excluding org.apache.zookeeper:zookeeper:jar:3.4.6 from the shaded jar.
[INFO] Excluding io.netty:netty:jar:3.7.0.Final from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.7.1 from 
the 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1372

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[github] Update containers at master to newly released beam-master-20180413.

--
[...truncated 4.37 KB...]
# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc

pydoc () {
    python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
    hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4710

2018-04-16 Thread Apache Jenkins Server
See 


--
GitHub pull request #5146 of commit 07ad9f192ca5b61e38ce30cce98541420a1c924f, 
no merge conflicts.
Setting status of 07ad9f192ca5b61e38ce30cce98541420a1c924f to PENDING with url 
https://builds.apache.org/job/beam_PostCommit_Python_Verify/4710/ and message: 
'Build started sha1 is merged.'
Using context: Jenkins: Python SDK PostCommit Tests
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/5146/*:refs/remotes/origin/pr/5146/*
 > git rev-parse refs/remotes/origin/pr/5146/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/5146/merge^{commit} # timeout=10
Checking out Revision 7e0a50df0c8d831accbeb424771a78243ef3844b 
(refs/remotes/origin/pr/5146/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7e0a50df0c8d831accbeb424771a78243ef3844b
Commit message: "Merge 07ad9f192ca5b61e38ce30cce98541420a1c924f into 
e1c526d7f88add0aa44635a56223a4907b81e86b"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins8726216730349696739.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# Run tests on the service.

# Where to store integration test outputs.
GCS_LOCATION=gs://temp-storage-for-end-to-end-tests

PROJECT=apache-beam-testing

# Create a tarball
python setup.py sdist
python: can't open file 'setup.py': [Errno 2] No such file or directory
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user sid...@google.com


Build failed in Jenkins: beam_PerformanceTests_Python #1157

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 633.16 KB...]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests SUCCESS [  
0.843 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests :: Common 
SUCCESS [  0.599 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests :: 2.x SUCCESS 
[  2.567 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests :: 5.x SUCCESS 
[  4.310 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: XML ... SUCCESS [  2.260 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Protobuf SUCCESS [  1.963 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Google Cloud Platform SUCCESS [  
5.559 s]
[INFO] Apache Beam :: Runners :: Google Cloud Dataflow  SUCCESS [  9.089 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: File-based-io-tests SUCCESS [  
2.452 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop Common . SUCCESS [  4.017 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop File System SUCCESS [  3.636 
s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: JDBC .. SUCCESS [  4.123 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop Input Format SUCCESS [ 
14.388 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: HBase . SUCCESS [  6.751 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: HCatalog .. SUCCESS [  9.639 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: JMS ... SUCCESS [  2.102 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Kafka . SUCCESS [  2.476 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Kinesis ... SUCCESS [  2.042 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: MongoDB ... SUCCESS [  2.745 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: MQTT .. SUCCESS [  2.142 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Redis . SUCCESS [  1.957 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Solr .. SUCCESS [  5.055 s]
[INFO] Apache Beam :: SDKs :: Java :: IO :: Tika .. SUCCESS [  4.904 s]
[INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes  SUCCESS [  0.062 s]
[INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes :: Starter SUCCESS [  
8.377 s]
[INFO] Apache Beam :: Examples  SUCCESS [  0.165 s]
[INFO] Apache Beam :: Examples :: Java  SUCCESS [  3.289 s]
[INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes :: Examples SUCCESS [ 
31.063 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Jackson SUCCESS [  1.458 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Join library SUCCESS [  
1.746 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Sketching SUCCESS [  2.510 
s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Sorter  SUCCESS [  2.208 s]
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: SQL ... SUCCESS [ 16.797 s]
[INFO] Apache Beam :: SDKs :: Java :: Nexmark . SUCCESS [ 15.173 s]
[INFO] Apache Beam :: SDKs :: Python .. FAILURE [ 11.287 s]
[INFO] Apache Beam :: SDKs :: Python :: Container . SKIPPED
[INFO] Apache Beam :: Runners :: Java Fn Execution  SKIPPED
[INFO] Apache Beam :: Runners :: Java Local Artifact Service SKIPPED
[INFO] Apache Beam :: Runners :: Reference  SKIPPED
[INFO] Apache Beam :: Runners :: Reference :: Java  SKIPPED
[INFO] Apache Beam :: Runners :: Reference :: Job Orchestrator SKIPPED
[INFO] Apache Beam :: Runners :: Flink  SKIPPED
[INFO] Apache Beam :: Runners :: Gearpump . SKIPPED
[INFO] Apache Beam :: Runners :: Spark  SKIPPED
[INFO] Apache Beam :: Runners :: Apex . SKIPPED
[INFO] Apache Beam :: Runners :: Google Cloud Platform  SKIPPED
[INFO] Apache Beam :: Runners :: Google Cloud Platform :: GCE metadata 
provisioning SKIPPED
[INFO] Apache Beam :: Runners :: Google Cloud Platform :: GCS artifact proxy 
SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Aggregated Javadoc .. SKIPPED
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 07:00 min
[INFO] Finished at: 2018-04-17T00:13:59Z
[INFO] Final Memory: 339M/2155M
[INFO] 

Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT_HDFS #58

2018-04-16 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #111

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[github] Update containers at master to newly released beam-master-20180413.

--
[...truncated 1.23 MB...]
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testFirstElementLate(CreateStreamTest.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 
org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at 
org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at 
org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #150

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 1.23 MB...]
[INFO] Excluding com.google.cloud.bigdataoss:gcsio:jar:1.4.5 from the shaded 
jar.
[INFO] Excluding 
com.google.apis:google-api-services-cloudresourcemanager:jar:v1-rev6-1.22.0 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-io-google-cloud-platform:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-sdks-java-extensions-protobuf:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-bigquery:jar:v2-rev374-1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.api:gax-grpc:jar:0.20.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the 
shaded jar.
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding 

Build failed in Jenkins: beam_PerformanceTests_JDBC #461

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

--
[...truncated 104.04 KB...]
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 
from the shaded jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 
from the shaded jar.
[INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding org.apache.beam:beam-model-job-management:jar:2.5.0-SNAPSHOT 
from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.22.0 from the 
shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.22.0 from the 
shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded 
jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the 
shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded 
jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from 
the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 
from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 
from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] Dependency-reduced POM written at: 

[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:integration-test (default) @ 
beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 

[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, 
useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, 
threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0 s <<< 
FAILURE! - in org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] org.apache.beam.sdk.io.jdbc.JdbcIOIT  Time elapsed: 0 s  <<< ERROR!
org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at 

Build failed in Jenkins: beam_PostRelease_NightlySnapshot #199

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 101.23 KB...]
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-common-artifact-filters/3.0.0/maven-common-artifact-filters-3.0.0.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-common-artifact-filters/3.0.0/maven-common-artifact-filters-3.0.0.pom
 (4.8 kB at 115 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-components/22/maven-shared-components-22.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-components/22/maven-shared-components-22.pom
 (5.1 kB at 146 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/maven-parent/27/maven-parent-27.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/maven-parent/27/maven-parent-27.pom
 (41 kB at 1.4 MB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/17/apache-17.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/17/apache-17.pom (16 kB 
at 554 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-utils/3.0.0/maven-shared-utils-3.0.0.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-utils/3.0.0/maven-shared-utils-3.0.0.pom
 (5.6 kB at 207 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/commons-io/commons-io/2.2/commons-io-2.2.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/commons-io/commons-io/2.2/commons-io-2.2.pom
 (11 kB at 263 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/24/commons-parent-24.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/24/commons-parent-24.pom
 (47 kB at 1.2 MB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/9/apache-9.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/9/apache-9.pom (15 kB at 
370 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/2.0.1/jsr305-2.0.1.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/2.0.1/jsr305-2.0.1.pom
 (965 B at 32 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/commons-codec/commons-codec/1.6/commons-codec-1.6.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/commons-codec/commons-codec/1.6/commons-codec-1.6.pom
 (11 kB at 398 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/22/commons-parent-22.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/22/commons-parent-22.pom
 (42 kB at 1.2 MB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.pom
 (2.7 kB at 71 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-parent/1.7.5/slf4j-parent-1.7.5.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-parent/1.7.5/slf4j-parent-1.7.5.pom
 (12 kB at 437 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-velocity/1.1.8/plexus-velocity-1.1.8.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-velocity/1.1.8/plexus-velocity-1.1.8.pom
 (1.9 kB at 69 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-components/1.1.15/plexus-components-1.1.15.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-components/1.1.15/plexus-components-1.1.15.pom
 (2.8 kB at 102 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus/2.0.3/plexus-2.0.3.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus/2.0.3/plexus-2.0.3.pom
 (15 kB at 533 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.pom
 (13 kB at 403 kB/s)
[INFO] Downloading from central: 

[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=91552&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91552
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 16/Apr/18 23:47
Start Date: 16/Apr/18 23:47
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #5053: [BEAM-3981] 
Futurize coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-381784452
 
 
   With  #5131 merged, we probably need to rebase this PR off the current 
master, for the ValidatesRunner tests to pass. @RobbeSneyders can we do that 
please?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91552)
Time Spent: 11h 40m  (was: 11.5h)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 11h 40m
>  Remaining Estimate: 0h
>
> Run automatic conversion with futurize tool on coders subpackage and fix 
> python 2 compatibility. This prepares the subpackage for python 3 support.
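For context, the kind of change the futurize tool applies in its first stage can be sketched as follows. This is an illustrative example of adding `__future__` imports for py2/py3-consistent semantics, not an actual diff from the Beam coders package:

```python
# Stage-1 futurize output typically begins modules with __future__ imports
# so the code behaves identically on Python 2 and Python 3.
from __future__ import absolute_import, division, print_function


def mean(values):
    """True division: with `division` imported from __future__,
    int / int yields a float on both Python 2 and Python 3."""
    total = 0
    for v in values:
        total += v
    return total / len(values)  # always true (float) division


print(mean([1, 2]))  # 1.5 on both interpreters
```

Stage 2 of futurize goes further (e.g. rewriting renamed stdlib modules), which is where most of the manual compatibility fixes in a subpackage like coders come in.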



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=91551&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91551
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 16/Apr/18 23:46
Start Date: 16/Apr/18 23:46
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #5053: [BEAM-3981] 
Futurize coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-381784452
 
 
   With #5131 merged, we probably need to rebase this PR off the current 
master for the ValidatesRunner tests to pass. @RobbeSneyders can we do that 
please?




Issue Time Tracking
---

Worklog Id: (was: 91551)
Time Spent: 11.5h  (was: 11h 20m)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> Run automatic conversion with futurize tool on coders subpackage and fix 
> python 2 compatibility. This prepares the subpackage for python 3 support.





[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=91550&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91550
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 16/Apr/18 23:43
Start Date: 16/Apr/18 23:43
Worklog Time Spent: 10m 
  Work Description: aaltay commented on issue #5053: [BEAM-3981] Futurize 
coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-381783927
 
 
   Run Python Dataflow ValidatesRunner




Issue Time Tracking
---

Worklog Id: (was: 91550)
Time Spent: 11h 20m  (was: 11h 10m)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Ahmet Altay
>Priority: Major
>  Time Spent: 11h 20m
>  Remaining Estimate: 0h
>
> Run automatic conversion with futurize tool on coders subpackage and fix 
> python 2 compatibility. This prepares the subpackage for python 3 support.





[beam] 01/01: Merge pull request #5131 from tvalentyn/patch-7

2018-04-16 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit e1c526d7f88add0aa44635a56223a4907b81e86b
Merge: e5a17db e26ba68
Author: Ahmet Altay 
AuthorDate: Mon Apr 16 16:43:20 2018 -0700

Merge pull request #5131 from tvalentyn/patch-7

Update containers at master to newly released beam-master-20180413.

 sdks/python/apache_beam/runners/dataflow/internal/dependency.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #112

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

--
[...truncated 27.92 MB...]
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 13 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 14 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 15 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 16 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 17 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 18 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.engine.Node emitEndStream
INFO: 19 sending EndOfStream
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:12 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:13 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:13 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [7]
Apr 16, 2018 11:43:13 PM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 16, 2018 11:43:13 PM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 
LogicalNode@2a23ef78{identifier=tcp://localhost:41654/7.output.7, 
upstream=7.output.7, group=stream2/8.input, partitions=[], 
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@38d195f8{da=com.datatorrent.bufferserver.internal.DataList$Block@67bcb877{identifier=7.output.7,
 data=1048576, readingOffset=0, writingOffset=237, 
starting_window=5ad5350d0001, ending_window=5ad5350d0007, refCount=2, 
uniqueIdentifier=0, next=null, future=null}}} from dl 
com.datatorrent.bufferserver.internal.DataList@52d1d3d7 {7.output.7}
Apr 16, 2018 11:43:13 PM com.datatorrent.stram.Journal write
WARNING: Journal output stream is null. Skipping write to the WAL.
Apr 16, 2018 11:43:13 PM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Undeploy request: [14]
Apr 16, 2018 11:43:13 PM com.datatorrent.stram.engine.StreamingContainer 
undeploy
INFO: Undeploy complete.
Apr 16, 2018 11:43:13 PM com.datatorrent.bufferserver.server.Server$3 run
INFO: Removing ln 

[beam] branch master updated (e5a17db -> e1c526d)

2018-04-16 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from e5a17db  Merge pull request #5114: Fix a typo in gradle task group
 add e26ba68  Update containers at master to newly released 
beam-master-20180413.
 new e1c526d  Merge pull request #5131 from tvalentyn/patch-7

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/runners/dataflow/internal/dependency.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
al...@apache.org.


[jira] [Work logged] (BEAM-4069) Empty pipeline options can be gracefully serialized/deserialized

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4069?focusedWorklogId=91547&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91547
 ]

ASF GitHub Bot logged work on BEAM-4069:


Author: ASF GitHub Bot
Created on: 16/Apr/18 23:35
Start Date: 16/Apr/18 23:35
Worklog Time Spent: 10m 
  Work Description: tgroh commented on a change in pull request #5126: 
[BEAM-4069] Gracefully deserialize empty options structs
URL: https://github.com/apache/beam/pull/5126#discussion_r181915277
 
 

 ##
 File path: 
sdks/java/core/src/main/java/org/apache/beam/sdk/options/ProxyInvocationHandler.java
 ##
 @@ -732,14 +732,22 @@ private void ensureSerializable(
 @Override
 public PipelineOptions deserialize(JsonParser jp, DeserializationContext 
ctxt)
 throws IOException, JsonProcessingException {
-  ObjectNode objectNode = (ObjectNode) jp.readValueAsTree();
-  ObjectNode optionsNode = (ObjectNode) objectNode.get("options");
+  ObjectNode objectNode = jp.readValueAsTree();
+  JsonNode rawOptionsNode = objectNode.get("options");
+  ObjectNode optionsNode;
+  if (rawOptionsNode == null || rawOptionsNode.isNull()) {
 
 Review comment:
   Can this not be pulled up some? e.g. 
   
   ```
   PipelineOptions options;
   if (rawOptionsNode == null || rawOptionsNode.isNull()) {
 options = PipelineOptionsFactory.create(); // or new 
ProxyInvocationHandler..., but I believe these to be equivalent
   } else {
 // code
 options = ProxyInvocationHandler...
   }
   ValueProvider.RuntimeValueProvider.setRuntimeOptions(options)
   return options;
   ```
   
   Because the two if statements aren't independent, we can just glom them 
together and have a more obvious path through the method.
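The restructuring suggested here (hoist the null check, build a single `options` value, then one shared set-and-return path) can be sketched with an analogous, simplified deserializer in plain Python. The field name `"options"` matches the snippet above, but the `{}` default and the dict-based handling are illustrative, not Beam's actual code:

```python
import json


def deserialize_options(payload):
    """Tolerate a missing or null "options" node instead of raising,
    with a single exit path as suggested in the review."""
    root = json.loads(payload)
    raw_options = root.get("options")
    if raw_options is None:
        # Analogous to falling back to PipelineOptionsFactory.create().
        options = {}
    else:
        options = dict(raw_options)
    # One shared path: any post-processing (e.g. setting runtime
    # options) happens here, for every branch.
    return options


print(deserialize_options('{}'))                  # {}
print(deserialize_options('{"options": null}'))   # {}
print(deserialize_options('{"options": {"runner": "DirectRunner"}}'))
```

Note that `json.loads` maps a JSON `null` to Python `None`, so one `is None` check covers both the absent-key and explicit-null cases, mirroring the `rawOptionsNode == null || rawOptionsNode.isNull()` test in the Java code.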




Issue Time Tracking
---

Worklog Id: (was: 91547)
Time Spent: 50m  (was: 40m)

> Empty pipeline options can be gracefully serialized/deserialized
> 
>
> Key: BEAM-4069
> URL: https://issues.apache.org/jira/browse/BEAM-4069
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Ben Sidhom
>Assignee: Ben Sidhom
>Priority: Minor
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> PipelineOptionsTranslation.fromProto currently crashes with a 
> NullPointerException when passed an empty options Struct. This is due to 
> ProxyInvocationHandler.Deserializer expecting a non-empty enclosing Struct.
> Empty pipeline options may be passed by SDKs interacting with a job server, 
> so this case needs to be handled. Note that testing a round-trip of an 
> effectively-empty Java PipelineOptions object is not sufficient to catch this 
> because "empty" Java options still contain default fields not defined in 
> other SDKs.





[jira] [Created] (BEAM-4095) Add abstractions for runners to provide artifacts to ArtifactRetrievalService

2018-04-16 Thread Axel Magnuson (JIRA)
Axel Magnuson created BEAM-4095:
---

 Summary: Add abstractions for runners to provide artifacts to 
ArtifactRetrievalService
 Key: BEAM-4095
 URL: https://issues.apache.org/jira/browse/BEAM-4095
 Project: Beam
  Issue Type: Improvement
  Components: runner-core
Reporter: Axel Magnuson
Assignee: Axel Magnuson


In the case of runners on cluster engines, the responsibility of storing and 
propagating artifacts can be left to the runner. Abstractions are therefore 
needed so that the runner can supply these artifacts to the 
ArtifactRetrievalService.





[jira] [Work logged] (BEAM-4093) Support Python ValidatesRunner test against TestDataflowRunner in streaming

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4093?focusedWorklogId=91545&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91545
 ]

ASF GitHub Bot logged work on BEAM-4093:


Author: ASF GitHub Bot
Created on: 16/Apr/18 23:21
Start Date: 16/Apr/18 23:21
Worklog Time Spent: 10m 
  Work Description: markflyhigh opened a new pull request #5147: 
[BEAM-4093] Support Python ValidatesRunner test in streaming
URL: https://github.com/apache/beam/pull/5147
 
 
   Improved `TestDataflowRunner` so that ValidatesRunner tests can run against 
it in streaming mode by specifying `--wait_until_finish_duration` on the 
command line. Since a streaming pipeline cannot terminate itself, the 
framework will cancel the job after `wait_until_finish_duration` elapses.
   
   Note: This change only enables a basic level of verification for streaming 
jobs. Any failure that leads to an unsuccessful termination state (such as 
FAILED) will be caught and cause the test to fail. However, `assert_that` 
failures cannot be caught at this time.
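The cancel-after-duration behavior described above can be sketched as follows. The `job` interface (`state()`, `cancel()`) is hypothetical and not the actual TestDataflowRunner API; it only illustrates the poll-then-cancel control flow:

```python
import time


def wait_until_finish_then_cancel(job, duration_ms, poll_secs=0.0):
    """Poll a job until it reaches a terminal state or the duration
    elapses; streaming jobs never terminate on their own, so cancel
    them once the deadline passes."""
    deadline = time.time() + duration_ms / 1000.0
    while time.time() < deadline:
        if job.state() in ("DONE", "FAILED", "CANCELLED"):
            return job.state()
        time.sleep(poll_secs)
    job.cancel()  # deadline reached: force termination
    return "CANCELLED"


class FakeJob:
    """Stand-in for a streaming job that runs forever."""
    cancelled = False

    def state(self):
        return "RUNNING"

    def cancel(self):
        self.cancelled = True


job = FakeJob()
print(wait_until_finish_then_cancel(job, duration_ms=10))  # CANCELLED
```

A job that fails before the deadline would return `FAILED` from the loop, which matches the PR's note that unsuccessful termination states are still caught.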
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [ ] Write a pull request description that is detailed enough to 
understand:
  - [ ] What the pull request does
  - [ ] Why it does it
  - [ ] How it does it
  - [ ] Why this approach
- [ ] Each commit in the pull request should have a meaningful subject line 
and body.
- [ ] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   




Issue Time Tracking
---

Worklog Id: (was: 91545)
Time Spent: 10m
Remaining Estimate: 0h

> Support Python ValidatesRunner test against TestDataflowRunner in streaming
> ---
>
> Key: BEAM-4093
> URL: https://issues.apache.org/jira/browse/BEAM-4093
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-py-core, testing
>Reporter: Mark Liu
>Assignee: Mark Liu
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>






[jira] [Work logged] (BEAM-4069) Empty pipeline options can be gracefully serialized/deserialized

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4069?focusedWorklogId=91540&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91540
 ]

ASF GitHub Bot logged work on BEAM-4069:


Author: ASF GitHub Bot
Created on: 16/Apr/18 23:12
Start Date: 16/Apr/18 23:12
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #5126: [BEAM-4069] 
Gracefully deserialize empty options structs
URL: https://github.com/apache/beam/pull/5126#issuecomment-381778576
 
 
   R: @tgroh 




Issue Time Tracking
---

Worklog Id: (was: 91540)
Time Spent: 40m  (was: 0.5h)

> Empty pipeline options can be gracefully serialized/deserialized
> 
>
> Key: BEAM-4069
> URL: https://issues.apache.org/jira/browse/BEAM-4069
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Ben Sidhom
>Assignee: Ben Sidhom
>Priority: Minor
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> PipelineOptionsTranslation.fromProto currently crashes with a 
> NullPointerException when passed an empty options Struct. This is due to 
> ProxyInvocationHandler.Deserializer expecting a non-empty enclosing Struct.
> Empty pipeline options may be passed by SDKs interacting with a job server, 
> so this case needs to be handled. Note that testing a round-trip of an 
> effectively-empty Java PipelineOptions object is not sufficient to catch this 
> because "empty" Java options still contain default fields not defined in 
> other SDKs.





Build failed in Jenkins: beam_PostRelease_NightlySnapshot #198

2018-04-16 Thread Apache Jenkins Server
See 


--
[...truncated 101.11 KB...]
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-common-artifact-filters/3.0.0/maven-common-artifact-filters-3.0.0.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-common-artifact-filters/3.0.0/maven-common-artifact-filters-3.0.0.pom
 (4.8 kB at 178 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-components/22/maven-shared-components-22.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-components/22/maven-shared-components-22.pom
 (5.1 kB at 189 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/maven-parent/27/maven-parent-27.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/maven-parent/27/maven-parent-27.pom
 (41 kB at 1.3 MB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/17/apache-17.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/17/apache-17.pom (16 kB 
at 595 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-utils/3.0.0/maven-shared-utils-3.0.0.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-utils/3.0.0/maven-shared-utils-3.0.0.pom
 (5.6 kB at 223 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/commons-io/commons-io/2.2/commons-io-2.2.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/commons-io/commons-io/2.2/commons-io-2.2.pom
 (11 kB at 368 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/24/commons-parent-24.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/24/commons-parent-24.pom
 (47 kB at 1.4 MB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/9/apache-9.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/apache/9/apache-9.pom (15 kB at 
561 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/2.0.1/jsr305-2.0.1.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/com/google/code/findbugs/jsr305/2.0.1/jsr305-2.0.1.pom
 (965 B at 37 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/commons-codec/commons-codec/1.6/commons-codec-1.6.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/commons-codec/commons-codec/1.6/commons-codec-1.6.pom
 (11 kB at 446 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/22/commons-parent-22.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/apache/commons/commons-parent/22/commons-parent-22.pom
 (42 kB at 1.2 MB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.pom
 (2.7 kB at 100 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-parent/1.7.5/slf4j-parent-1.7.5.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/slf4j/slf4j-parent/1.7.5/slf4j-parent-1.7.5.pom
 (12 kB at 422 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-velocity/1.1.8/plexus-velocity-1.1.8.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-velocity/1.1.8/plexus-velocity-1.1.8.pom
 (1.9 kB at 74 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-components/1.1.15/plexus-components-1.1.15.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-components/1.1.15/plexus-components-1.1.15.pom
 (2.8 kB at 84 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus/2.0.3/plexus-2.0.3.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus/2.0.3/plexus-2.0.3.pom
 (15 kB at 533 kB/s)
[INFO] Downloading from central: 
https://repo.maven.apache.org/maven2/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.pom
[INFO] Downloaded from central: 
https://repo.maven.apache.org/maven2/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.pom
 (13 kB at 284 kB/s)
[INFO] Downloading from central: 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4709

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

--
Started by GitHub push by kennknowles
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5a17db4f4ccf30548bb18c96277e7afe1b8ae56 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5a17db4f4ccf30548bb18c96277e7afe1b8ae56
Commit message: "Merge pull request #5114: Fix a typo in gradle task group"
 > git rev-list --no-walk 3b3f944d4b6aad10a20bc466f75da2e9210192ff # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins8925365282813352451.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 
1.4.0 which is incompatible.

# Tox runs unit tests in a virtual environment
${LOCAL_PATH}/tox -e ALL -c sdks/python/tox.ini
GLOB sdist-make: 

ERROR: invocation failed (exit code 1), logfile: 

ERROR: actionid: tox
msg: packaging
cmdargs: ['/usr/bin/python', 
local('
 'sdist', '--formats=zip', '--dist-dir', 

Build failed in Jenkins: beam_PostCommit_Python_Verify #4708

2018-04-16 Thread Apache Jenkins Server
See 


--
GitHub pull request #4387 of commit 8e9cde6d599287fb9aaf3a4a6af5a231e46fa803, 
no merge conflicts.
Setting status of 8e9cde6d599287fb9aaf3a4a6af5a231e46fa803 to PENDING with url 
https://builds.apache.org/job/beam_PostCommit_Python_Verify/4708/ and message: 
'Build started sha1 is merged.'
Using context: Jenkins: Python SDK PostCommit Tests
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/4387/*:refs/remotes/origin/pr/4387/*
 > git rev-parse refs/remotes/origin/pr/4387/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/4387/merge^{commit} # timeout=10
Checking out Revision abf183a6844e884da47bf0698c876c29a05a6b73 
(refs/remotes/origin/pr/4387/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f abf183a6844e884da47bf0698c876c29a05a6b73
Commit message: "Merge 4d70fecf59ae39d4ae60faf0d6113d24a0100204 into 
e5a17db4f4ccf30548bb18c96277e7afe1b8ae56"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Verify] $ /bin/bash -xe 
/tmp/jenkins6812130094958090413.sh
+ cd src
+ bash sdks/python/run_postcommit.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# Remove any tox cache from previous workspace
# TODO(udim): Remove this line and add '-r' to tox invocation instead.
rm -rf sdks/python/target/.tox

# INFRA does not install these packages
pip install --user --upgrade virtualenv tox
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:339: 
SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: virtualenv in 
/home/jenkins/.local/lib/python2.7/site-packages (15.2.0)
/usr/local/lib/python2.7/dist-packages/pip/_vendor/urllib3/util/ssl_.py:137: 
InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Requirement already up-to-date: tox in 
/home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Requirement not upgraded as not directly required: py>=1.4.17 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.5.3)
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (0.6.0)
Requirement not upgraded as not directly required: six in 
/home/jenkins/.local/lib/python2.7/site-packages (from tox) (1.11.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1371

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

--
[...truncated 4.39 KB...]
# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc

pydoc () {
python -m pydoc "$@"
}

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
cd sdks/python
pip install -e .[gcp,test]
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
:339:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting dill==0.2.6 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 
https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
  Using cached 
https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
:137:
 InsecurePlatformWarning: A true SSLContext object is not available. This 
prevents urllib3 from configuring SSL appropriately and may cause certain SSL 
connections to fail. You can upgrade to a newer version of Python to solve 
this. For more information, see 

[jira] [Work logged] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?focusedWorklogId=91533&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91533
 ]

ASF GitHub Bot logged work on BEAM-4038:


Author: ASF GitHub Bot
Created on: 16/Apr/18 22:56
Start Date: 16/Apr/18 22:56
Worklog Time Spent: 10m 
  Work Description: rangadi commented on a change in pull request #5111: 
[BEAM-4038] Support Kafka Headers in KafkaIO
URL: https://github.com/apache/beam/pull/5111#discussion_r181909253
 
 

 ##
 File path: 
sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaHeader.java
 ##
 @@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.kafka;
+
+/**
+ * This is a copy of Kafka's {@link org.apache.kafka.common.header.Header}. 
Included here in order
+ * to support older Kafka versions (0.9.x).
+ */
+public interface KafkaHeader {
 
 Review comment:
   We can require a recent Kafka version to compile and run tests, but at
runtime we want KafkaIO to work with older versions of kafka-clients. That's
why we have the runtime reflection checks. 
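The runtime check described above can be sketched with plain reflection. This is a minimal illustration of the pattern, not KafkaIO's actual code; `classHasMethod` and the class name are assumptions (`ConsumerRecord#headers()` exists only in kafka-clients 0.11+):

```java
public class ReflectionChecks {
  /** True iff {@code className} is on the classpath and declares a public no-arg {@code methodName}. */
  static boolean classHasMethod(String className, String methodName) {
    try {
      Class.forName(className).getMethod(methodName);
      return true;
    } catch (ClassNotFoundException | NoSuchMethodException e) {
      // Older kafka-clients (e.g. 0.9.x): the class or method is absent,
      // so callers fall back to header-less behavior.
      return false;
    }
  }

  public static void main(String[] args) {
    // Illustrative gate: guard header access on the method's presence at runtime.
    boolean headersSupported =
        classHasMethod("org.apache.kafka.clients.consumer.ConsumerRecord", "headers");
    System.out.println("record headers supported: " + headersSupported);
  }
}
```

This lets the module compile against a recent kafka-clients while degrading gracefully when an older jar is on the runtime classpath.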


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91533)
Time Spent: 1h 20m  (was: 1h 10m)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Raghu Angadi
>Priority: Minor
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #110

2018-04-16 Thread Apache Jenkins Server
See 


Changes:

[swegner] Fix a typo in gradle task group

--
[...truncated 1.24 MB...]
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:626)
at scala.Option.getOrElse(Option.scala:121)
at 
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:828)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:626)
at 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:169)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:123)
at 
org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:83)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
at 
org.apache.beam.runners.spark.translation.streaming.CreateStreamTest.testFirstElementLate(CreateStreamTest.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at 
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
at 
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:114)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:57)
at 
org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:66)
at 
org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at 
org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at 
org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy3.processTestClass(Unknown Source)
at 
org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at 

[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91531&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91531
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 16/Apr/18 22:52
Start Date: 16/Apr/18 22:52
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on issue #5144: [BEAM-3327] Rename 
EnvironmentManager to EnvironmentFactory
URL: https://github.com/apache/beam/pull/5144#issuecomment-381774978
 
 
   cc: @bsidhom 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91531)
Time Spent: 5h 40m  (was: 5.5h)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 5h 40m
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91530&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91530
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 16/Apr/18 22:52
Start Date: 16/Apr/18 22:52
Worklog Time Spent: 10m 
  Work Description: axelmagn commented on issue #5144: [BEAM-3327] Rename 
EnvironmentManager to EnvironmentFactory
URL: https://github.com/apache/beam/pull/5144#issuecomment-381774819
 
 
   R: @tgroh 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91530)
Time Spent: 5.5h  (was: 5h 20m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 5.5h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam-site] branch mergebot updated (b1d3ae3 -> 475bd8e)

2018-04-16 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard b1d3ae3  This closes #419
 new 475bd8e  This closes #419

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (b1d3ae3)
            \
             N -- N -- N   refs/heads/mergebot (475bd8e)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
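The divergence the notice describes can be reproduced locally with a throwaway repository (illustrative only; `origin.git`, `work`, and the `demo` branch are made-up names):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q --bare origin.git 2>/dev/null       # stand-in for the remote
git clone -q origin.git work 2>/dev/null
cd work
git checkout -q -b demo
git -c user.name=a -c user.email=a@b commit -q --allow-empty -m B
git -c user.name=a -c user.email=a@b commit -q --allow-empty -m O
git push -q origin demo                         # remote history: B -- O
git reset -q --hard HEAD~1                      # undo O locally
git -c user.name=a -c user.email=a@b commit -q --allow-empty -m N
git push -q --force origin demo                 # rewrites remote history: B -- N
git log --format=%s origin/demo                 # N, then B; O is now unreferenced
```

As the notice says, the old `O` commit is not deleted by the force push; it merely becomes unreachable from the branch ref until other references to it are gone.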


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91529&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91529
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 16/Apr/18 22:51
Start Date: 16/Apr/18 22:51
Worklog Time Spent: 10m 
  Work Description: axelmagn opened a new pull request #5144: [BEAM-3327] 
Rename EnvironmentManager to EnvironmentFactory
URL: https://github.com/apache/beam/pull/5144
 
 
   EnvironmentManager has no planned functionality beyond creating
   new RemoteEnvironment instances via getRemoteEnvironment.
   Due to some confusion about its purpose, I am renaming it
   to EnvironmentFactory in coordination with the developers who consume
   the interface.
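For illustration, the relationship described above can be sketched as follows. All shapes here are assumptions inferred from this PR description, not Beam's actual interfaces:

```java
public class EnvironmentFactoryDemo {
  // Hypothetical shape: an environment owns its own cleanup via AutoCloseable.
  interface RemoteEnvironment extends AutoCloseable {
    String environmentId();
    @Override void close();  // narrowed: no checked exception for callers
  }

  // Hypothetical shape: a factory's only job is to create environments,
  // which is why "Factory" fits better than "Manager".
  @FunctionalInterface
  interface EnvironmentFactory {
    RemoteEnvironment createEnvironment(String spec);
  }

  /** Trivial in-process factory standing in for, e.g., a Docker-backed one. */
  static EnvironmentFactory inProcessFactory() {
    return spec -> new RemoteEnvironment() {
      @Override public String environmentId() { return "in-process:" + spec; }
      @Override public void close() { /* nothing to tear down in-process */ }
    };
  }

  public static void main(String[] args) {
    // The factory creates; the caller owns the environment's lifecycle.
    try (RemoteEnvironment env = inProcessFactory().createEnvironment("python-sdk")) {
      System.out.println(env.environmentId());  // prints in-process:python-sdk
    }
  }
}
```

Under this split, a Docker-backed factory would start a container in `createEnvironment` and the returned environment's `close()` would stop it.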
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [x] Make sure there is a [JIRA 
issue](https://issues.apache.org/jira/projects/BEAM/issues/) filed for the 
change (usually before you start working on it).  Trivial changes like typos do 
not require a JIRA issue.  Your pull request should address just this issue, 
without pulling in other changes.
- [x] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue.
- [x] Write a pull request description that is detailed enough to 
understand:
  - [x] What the pull request does
  - [x] Why it does it
  - [x] How it does it
  - [x] Why this approach
- [x] Each commit in the pull request should have a meaningful subject line 
and body.
- [x] Run `mvn clean verify` to make sure basic checks pass. A more 
thorough check will be performed on your pull request automatically.
- [x] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91529)
Time Spent: 5h 20m  (was: 5h 10m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam-site] 01/01: This closes #419

2018-04-16 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 475bd8e33bfcfdbabe458008f734bdc494c74a66
Merge: 6cfd3ba 07dd232
Author: Mergebot 
AuthorDate: Mon Apr 16 15:51:33 2018 -0700

This closes #419

 content/contribute/eclipse/index.html | 75 +--
 src/contribute/eclipse.md | 66 +-
 2 files changed, 84 insertions(+), 57 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91528&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91528
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 16/Apr/18 22:50
Start Date: 16/Apr/18 22:50
Worklog Time Spent: 10m 
  Work Description: bsidhom closed pull request #4751: [BEAM-3327] 
Implement simple Docker container manager
URL: https://github.com/apache/beam/pull/4751
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/runners/java-fn-execution/build.gradle 
b/runners/java-fn-execution/build.gradle
index b1aa9e8c0ca..c50ef34a1a2 100644
--- a/runners/java-fn-execution/build.gradle
+++ b/runners/java-fn-execution/build.gradle
@@ -48,3 +48,18 @@ dependencies {
   testCompile library.java.mockito_core
   testCompile library.java.slf4j_simple
 }
+
+test {
+  useJUnit {
+// Exclude tests that need Docker.
+excludeCategories 
"org.apache.beam.runners.fnexecution.environment.testing.NeedsDocker"
+  }
+}
+
+task testDocker(type: Test) {
+  group = "Verification"
+  description = "Runs Docker tests"
+  useJUnit {
+includeCategories 
"org.apache.beam.runners.fnexecution.environment.testing.NeedsDocker"
+  }
+}
diff --git a/runners/java-fn-execution/pom.xml 
b/runners/java-fn-execution/pom.xml
index 5096b299719..7f09a48fb9a 100644
--- a/runners/java-fn-execution/pom.xml
+++ b/runners/java-fn-execution/pom.xml
@@ -32,6 +32,53 @@
 
   jar
 
+  
+
+  
+org.apache.maven.plugins
+maven-surefire-plugin
+
+  
+  
+
org.apache.beam.runners.fnexecution.environment.testing.NeedsDocker
+  
+
+  
+
+  
+
+  
+
+  docker-tests
+  false
+  
+
+  
+org.apache.maven.plugins
+maven-surefire-plugin
+
+  
+docker-tests
+integration-test
+
+  test
+
+
+  
+
org.apache.beam.runners.fnexecution.environment.testing.NeedsDocker
+  
+  
+
+  
+
+  
+
+  
+
+  
+
   
 
   org.apache.beam
diff --git 
a/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/environment/DockerContainerEnvironment.java
 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/environment/DockerContainerEnvironment.java
new file mode 100644
index 000..1f95d8c8922
--- /dev/null
+++ 
b/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/environment/DockerContainerEnvironment.java
@@ -0,0 +1,65 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.fnexecution.environment;
+
+import org.apache.beam.model.pipeline.v1.RunnerApi.Environment;
+import org.apache.beam.runners.fnexecution.control.SdkHarnessClient;
+
+/**
+ * A {@link RemoteEnvironment} that talks to a Docker container. Accessors are 
thread-compatible.
+ */
+class DockerContainerEnvironment implements RemoteEnvironment {
+
+  static DockerContainerEnvironment create(DockerWrapper docker,
+  Environment environment, String containerId, SdkHarnessClient client) {
+return new DockerContainerEnvironment(docker, environment, containerId, 
client);
+  }
+
+  private final DockerWrapper docker;
+  private final Environment environment;
+  private final String containerId;
+  private final SdkHarnessClient client;
+
+  private DockerContainerEnvironment(DockerWrapper docker, Environment 
environment,
+  String containerId, SdkHarnessClient client) {
+this.docker = docker;
+this.environment = environment;
+this.containerId = containerId;
+this.client = client;
+  }
+
+  @Override
+  

[jira] [Work logged] (BEAM-3327) Add abstractions to manage Environment Instance lifecycles.

2018-04-16 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3327?focusedWorklogId=91527&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-91527
 ]

ASF GitHub Bot logged work on BEAM-3327:


Author: ASF GitHub Bot
Created on: 16/Apr/18 22:50
Start Date: 16/Apr/18 22:50
Worklog Time Spent: 10m 
  Work Description: bsidhom commented on issue #4751: [BEAM-3327] Implement 
simple Docker container manager
URL: https://github.com/apache/beam/pull/4751#issuecomment-381774510
 
 
   We're refactoring the RemoteEnvironment and EnvironmentManager interfaces a 
bit to better capture the semantics we need. I'm going to close this PR for now 
and open a new one that uses the new interface.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 91527)
Time Spent: 5h  (was: 4h 50m)

> Add abstractions to manage Environment Instance lifecycles.
> ---
>
> Key: BEAM-3327
> URL: https://issues.apache.org/jira/browse/BEAM-3327
> Project: Beam
>  Issue Type: New Feature
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Axel Magnuson
>Priority: Major
>  Labels: portability
>  Time Spent: 5h
>  Remaining Estimate: 0h
>
> This permits remote stage execution for arbitrary environments



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

