Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #3

2018-10-30 Thread Apache Jenkins Server


Changes:

[migryz] Add script to sync data for Beam GitHub community metrics.

[migryz] Address PR comments.

[swegner] Add GitHub data to source freshness dashboard

[swegner] Set y axis limits to 0-100%

[migryz] Fix unittests

[aaltay] [BEAM-5617] Use bytes consistently for pcollection ids. (#6844)

[github] Merge pull request #6883: [BEAM-4461] Resubmit PR to switch SQL over to

[migryz] Address PR changes

[amyrvold] [BEAM-5735] Contributor guide improvements

[lcwik] [BEAM-5299] Define max timestamp for global window in proto (#6381)

[lcwik] Remove unintended comment in BeamFnLoggingClient (#6796)

[migryz] Add Code Velocity dashboard json

--
[...truncated 18.16 MB...]
INFO: 2018-10-31T03:07:57.261Z: Expanding CoGroupByKey operations into optimizable parts.
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.597Z: Expanding GroupByKey operations into optimizable parts.
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.726Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.783Z: Elided trivial flatten
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.832Z: Elided trivial flatten
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.879Z: Elided trivial flatten
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.930Z: Unzipping flatten s43 for input s31.org.apache.beam.sdk.values.PCollection.:402#536e37dc9c742e20
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:57.977Z: Fusing unzipped copy of PAssert$27/GroupGlobally/GroupDummyAndContents/Reify, through flatten PAssert$27/GroupGlobally/FlattenDummyAndContents, into producer PAssert$27/GroupGlobally/KeyForDummy/AddKeys/Map
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.028Z: Fusing consumer PAssert$27/GroupGlobally/GroupDummyAndContents/GroupByWindow into PAssert$27/GroupGlobally/GroupDummyAndContents/Read
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.069Z: Fusing consumer PAssert$27/GetPane/Map into PAssert$27/GroupGlobally/ParDo(Concat)
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.113Z: Fusing consumer PAssert$27/VerifyAssertions/ParDo(DefaultConclude) into PAssert$27/RunChecks
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.159Z: Fusing consumer PAssert$27/RunChecks into PAssert$27/GetPane/Map
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.208Z: Fusing consumer PAssert$27/GroupGlobally/Values/Values/Map into PAssert$27/GroupGlobally/GroupDummyAndContents/GroupByWindow
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.255Z: Fusing consumer PAssert$27/GroupGlobally/ParDo(Concat) into PAssert$27/GroupGlobally/Values/Values/Map
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.301Z: Unzipping flatten s43-u80 for input s45-reify-value58-c78
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.352Z: Fusing unzipped copy of PAssert$27/GroupGlobally/GroupDummyAndContents/Write, through flatten s43-u80, into producer PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.404Z: Fusing consumer PAssert$27/GroupGlobally/GroupDummyAndContents/Reify into PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign
Oct 31, 2018 3:08:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-31T03:07:58.446Z: Fusing consumer PAssert$27/GroupGlobally/GroupDummyAndContents/Write into PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
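The optimizer messages above ("Fusing consumer X into producer Y", "Fusing adjacent ParDo, Read, Write, and Flatten operations") refer to graph fusion: Dataflow combines adjacent transforms into a single worker stage so the intermediate PCollection between them is never materialized. A hypothetical, Beam-free sketch of the idea (all names below are illustrative, not Beam or Dataflow API):

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Conceptual sketch of stage fusion: composing two ParDo-like steps into one
// pass means each element flows straight through both functions, with no
// intermediate collection buffered between them.
public class FusionSketch {
    static <A, B, C> Function<A, C> fuse(Function<A, B> producer, Function<B, C> consumer) {
        return producer.andThen(consumer); // one fused stage, no intermediate buffer
    }

    public static void main(String[] args) {
        Function<String, Integer> parse = Integer::parseInt; // stand-in for one ParDo
        Function<Integer, Integer> square = x -> x * x;      // stand-in for the next
        Function<String, Integer> fused = fuse(parse, square);

        List<Integer> out = Stream.of("1", "2", "3").map(fused).collect(Collectors.toList());
        System.out.println(out); // prints [1, 4, 9]
    }
}
```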

Build failed in Jenkins: beam_PerformanceTests_ParquetIOIT_HDFS #561

2018-10-30 Thread Apache Jenkins Server


--
[...truncated 362.07 KB...]
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:262)
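The failure repeated across these performance-test builds is the InvalidClassException above: the Dataflow worker deserializes a WindowingStrategy whose stream serialVersionUID (-6607512772692666907) does not match the one computed for the class on the worker's classpath (-3616600070988263902), which is what happens when the worker and the submitted pipeline are built from different SDK revisions. A minimal, hypothetical sketch of the mechanism (the Strategy class below is illustrative, not Beam's): declaring an explicit serialVersionUID keeps Java's default serialization compatible across class revisions, whereas an implicit one is derived from the class shape, so any field change breaks previously serialized payloads.

```java
import java.io.*;

// Hypothetical DTO. If serialVersionUID were omitted, the JVM would derive
// one from the class's fields and methods, and changing the class (as in the
// SDK/worker version skew above) would make old payloads fail to deserialize
// with java.io.InvalidClassException: local class incompatible.
class Strategy implements Serializable {
    private static final long serialVersionUID = 1L; // explicit, stable stream ID
    final String windowFn;
    Strategy(String windowFn) { this.windowFn = windowFn; }
}

public class SerialVersionDemo {
    static byte[] toBytes(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o); // serialize, embedding serialVersionUID in the stream
        }
        return bos.toByteArray();
    }

    static Object fromBytes(byte[] b) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(b))) {
            return ois.readObject(); // checks stream UID against the local class's UID
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] payload = toBytes(new Strategy("FixedWindows"));
        Strategy back = (Strategy) fromBytes(payload);
        System.out.println(back.windowFn); // prints FixedWindows
    }
}
```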

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #1164

2018-10-30 Thread Apache Jenkins Server


--
[...truncated 349.65 KB...]

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #843

2018-10-30 Thread Apache Jenkins Server


--
[...truncated 366.69 KB...]

Build failed in Jenkins: beam_PerformanceTests_ParquetIOIT #668

2018-10-30 Thread Apache Jenkins Server


--
[...truncated 350.73 KB...]

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #859

2018-10-30 Thread Apache Jenkins Server


--
[...truncated 361.26 KB...]

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #844

2018-10-30 Thread Apache Jenkins Server


--
[...truncated 363.76 KB...]

Build failed in Jenkins: beam_PerformanceTests_TextIOIT #1197

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 346.81 KB...]
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 
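The `InvalidClassException` above is Java serialization refusing a stream whose recorded `serialVersionUID` for `org.apache.beam.sdk.values.WindowingStrategy` differs from the UID computed for the class on the worker, i.e. the submitting SDK jar and the worker jar were built from different versions of the class. A minimal sketch (hypothetical `ImplicitUid`/`ExplicitUid` classes, not Beam code) of how the JVM derives the UID when none is declared, and how declaring one pins it:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class UidDemo {
    // No declared serialVersionUID: the JVM hashes the class shape (name,
    // fields, methods), so any structural change yields a new UID and
    // breaks deserialization of old streams.
    static class ImplicitUid implements Serializable {
        int width;
    }

    // Declared serialVersionUID: the value is fixed, so compatible class
    // changes keep old streams readable.
    static class ExplicitUid implements Serializable {
        private static final long serialVersionUID = 1L;
        int width;
    }

    public static void main(String[] args) {
        long computed = ObjectStreamClass.lookup(ImplicitUid.class).getSerialVersionUID();
        long declared = ObjectStreamClass.lookup(ExplicitUid.class).getSerialVersionUID();
        System.out.println("computed UID: " + computed);
        System.out.println("declared UID: " + declared); // 1, by declaration
    }
}
```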

Build failed in Jenkins: beam_PerformanceTests_TFRecordIOIT #1167

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 310.73 KB...]
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 
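The `SerializableUtils.deserializeFromByteArray` frame in these traces is plain Java serialization of the `DoFnInfo` payload. A self-contained sketch of that round trip (hypothetical helper names, not Beam's actual utility): the `InvalidClassException` is raised inside `readObject()` when the stream's class descriptor disagrees with the local class, and re-wrapping it is what produces the "IllegalArgumentException: unable to deserialize" message seen in the log.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class RoundTrip {
    // Serialize a value to bytes, roughly what happens to a DoFnInfo
    // before it is shipped to the worker.
    static byte[] toBytes(Serializable value) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(value);
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
        return bos.toByteArray();
    }

    // Deserialize; a serialVersionUID mismatch surfaces inside readObject()
    // as InvalidClassException, re-wrapped here the way the log shows it.
    static Object fromBytes(byte[] bytes) {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalArgumentException("unable to deserialize", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(fromBytes(toBytes("DoFnInfo stand-in")));
    }
}
```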

Jenkins build is back to normal : beam_SeedJob #2906

2018-10-30 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #843

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 364.31 KB...]
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:262)
at 

Build failed in Jenkins: beam_PerformanceTests_JDBC #1277

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 258.65 KB...]
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:262)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.access$000(IntrinsicMapTaskExecutorFactory.java:84)
at 

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #945

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 341.98 KB...]
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:262)
at 

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT #939

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 351.96 KB...]
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 21 more
Caused by: java.io.InvalidClassException: org.apache.beam.sdk.values.WindowingStrategy; local class incompatible: stream classdesc serialVersionUID = -6607512772692666907, local class serialVersionUID = -3616600070988263902
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
... 28 more
java.lang.RuntimeException: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:192)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:163)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:123)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:336)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: unable to deserialize Serialized DoFnInfo
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:90)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at 

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1568

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[migryz] Add script to sync data for Beam GitHub community metrics.

[migryz] Address PR comments.

[swegner] Add GitHub data to source freshness dashboard

[swegner] Set y axis limits to 0-100%

[migryz] Fix unittests

[migryz] Address PR changes

[amyrvold] [BEAM-5735] Contributor guide improvements

[lcwik] [BEAM-5299] Define max timestamp for global window in proto (#6381)

[lcwik] Remove unintended comment in BeamFnLoggingClient (#6796)

[migryz] Add Code Velocity dashboard json

--
[...truncated 176.20 KB...]
Caching disabled for task ':beam-runners-google-cloud-dataflow-java-fn-api-worker:shadowJar': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java-fn-api-worker:shadowJar' is not up-to-date because:
  No history is available.
***
GRADLE SHADOW STATS

Total Jars: 130 (includes project)
Total Time: 60.072s [60072ms]
Average Time/Jar: 0.4620923076923s [462.0923076923ms]
***
:beam-runners-google-cloud-dataflow-java-fn-api-worker:shadowJar (Thread[Task worker for ':' Thread 7,5,main]) completed. Took 1 mins 1.717 secs.

> Task :beam-sdks-python:validatesRunnerBatchTests
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... FAIL
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

==
FAIL: test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
--
Traceback (most recent call last):
  File "
 line 182, in test_iterable_side_input
pipeline.run()
  File "
 line 111, in run
"Pipeline execution failed."
AssertionError: Pipeline execution failed.
 >> begin captured stdout << -
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-30_17_18_15-10525923181538746610?project=apache-beam-testing.

- >> end captured stdout << --

--
XML: 

--
Ran 16 tests in 919.600s

FAILED (failures=1)

> Task :beam-sdks-python:validatesRunnerBatchTests FAILED
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for ':',5,main]) completed. Took 15 mins 23.486 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for ':' Thread 6,5,main]) started.

> Task :beam-sdks-python:validatesRunnerStreamingTests
Caching disabled for task ':beam-sdks-python:validatesRunnerStreamingTests': Caching has not been enabled for the task
Task ':beam-sdks-python:validatesRunnerStreamingTests' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: 
 Command: sh -c . 

Build failed in Jenkins: beam_SeedJob #2905

2018-10-30 Thread Apache Jenkins Server
See 

--
GitHub pull request #6896 of commit f6f88a4d0b9140468d10c91e8ffa788edbc30453, 
no merge conflicts.
Setting status of f6f88a4d0b9140468d10c91e8ffa788edbc30453 to PENDING with url 
https://builds.apache.org/job/beam_SeedJob/2905/ and message: 'Build started 
for merge commit.'
Using context: Jenkins: Seed Job
[EnvInject] - Loading node environment variables.
Building remotely on beam13 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/6896/*:refs/remotes/origin/pr/6896/*
 > git rev-parse refs/remotes/origin/pr/6896/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/6896/merge^{commit} # timeout=10
Checking out Revision bfcce535b8885a2c02425770b48963ce2730ff7e 
(refs/remotes/origin/pr/6896/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bfcce535b8885a2c02425770b48963ce2730ff7e
Commit message: "Merge f6f88a4d0b9140468d10c91e8ffa788edbc30453 into 
3030f79f588ff76715a57f6048e853266c7a11ce"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
Processing DSL script job_00_seed.groovy
Processing DSL script job_Dependency_Check.groovy
Processing DSL script job_Inventory.groovy
Processing DSL script job_PerformanceTests_Dataflow.groovy
Processing DSL script job_PerformanceTests_FileBasedIO_IT.groovy
Processing DSL script job_PerformanceTests_FileBasedIO_IT_HDFS.groovy
Processing DSL script job_PerformanceTests_HadoopInputFormat.groovy
Processing DSL script job_PerformanceTests_JDBC.groovy
Processing DSL script job_PerformanceTests_MongoDBIO_IT.groovy
Processing DSL script job_PerformanceTests_Python.groovy
Processing DSL script job_PerformanceTests_Spark.groovy
Processing DSL script job_PostCommit_CommunityMetrics.groovy
Processing DSL script job_PostCommit_Go_GradleBuild.groovy
Processing DSL script job_PostCommit_Java_GradleBuild.groovy
Processing DSL script job_PostCommit_Java_Nexmark_Dataflow.groovy
Processing DSL script job_PostCommit_Java_Nexmark_Direct.groovy
Processing DSL script job_PostCommit_Java_Nexmark_Flink.groovy
Processing DSL script job_PostCommit_Java_Nexmark_Spark.groovy
Processing DSL script job_PostCommit_Java_PortabilityApi_GradleBuild.groovy
Processing DSL script job_PostCommit_Java_PortableValidatesRunner_Flink.groovy
Processing DSL script job_PostCommit_Java_ValidatesRunner_Apex.groovy
Processing DSL script job_PostCommit_Java_ValidatesRunner_Dataflow.groovy
Processing DSL script job_PostCommit_Java_ValidatesRunner_Flink.groovy
Processing DSL script job_PostCommit_Java_ValidatesRunner_Gearpump.groovy
Processing DSL script 
job_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow.groovy
Processing DSL script job_PostCommit_Java_ValidatesRunner_Samza.groovy
Processing DSL script job_PostCommit_Java_ValidatesRunner_Spark.groovy
Processing DSL script job_PostCommit_Python_ValidatesContainer_Dataflow.groovy
Processing DSL script job_PostCommit_Python_ValidatesRunner_Dataflow.groovy
Processing DSL script job_PostCommit_Python_ValidatesRunner_Flink.groovy
Processing DSL script job_PostCommit_Python_Verify.groovy
Processing DSL script job_PostCommit_Website_Publish.groovy
Processing DSL script job_PostRelease_NightlySnapshot.groovy
Processing DSL script job_PreCommit_CommunityMetrics.groovy
Processing DSL script job_PreCommit_Go.groovy
Processing DSL script job_PreCommit_Java.groovy
ERROR: (PrecommitJobBuilder.groovy, line 110) No such property: $it for class: 
javaposse.jobdsl.dsl.helpers.step.GradleContext
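In Groovy, `$it` only interpolates inside a double-quoted GString; used as a bare expression (or inside a single-quoted string) within a `GradleContext` closure it resolves as a property lookup on the context and fails. A hedged sketch of the likely shape of the bug and fix, since the actual `PrecommitJobBuilder.groovy` contents are not shown in this log:

```groovy
// Hypothetical Job DSL fragment -- names are illustrative, not Beam's real script.
steps {
  gradle {
    // Broken: '$it' in a single-quoted string is a literal property lookup,
    // which GradleContext does not define:
    //   gradleTasks.each { tasks('$it') }

    // Working alternatives:
    gradleTasks.each { task ->
      tasks(task)                // explicit closure parameter
    }
    // or interpolate in a double-quoted GString:
    //   gradleTasks.each { tasks("$it") }
  }
}
```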

-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #873

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[migryz] Add script to sync data for Beam GitHub community metrics.

[migryz] Address PR comments.

[swegner] Add GitHub data to source freshness dashboard

[swegner] Set y axis limits to 0-100%

[migryz] Fix unittests

[migryz] Address PR changes

[amyrvold] [BEAM-5735] Contributor guide improvements

[lcwik] [BEAM-5299] Define max timestamp for global window in proto (#6381)

[lcwik] Remove unintended comment in BeamFnLoggingClient (#6796)

[migryz] Add Code Velocity dashboard json

--
[...truncated 887.33 KB...]

Performance:
  Conf  Runtime(sec) (Baseline)  Events(/sec) (Baseline)  Results (Baseline)
        -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0001  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0002  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0003  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0004  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0005  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0006  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0007  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0008  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0009  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0010  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0011  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
  0012  -1.0                     -1.0                     -1
  *** Job was unexpectedly updated ***
======================================================================

Oct 31, 2018 12:19:49 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 31, 2018 12:19:49 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 129 files. Enable logging at DEBUG level to see which 
files will be staged.
Oct 31, 2018 12:19:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Oct 31, 2018 12:19:49 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 129 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Oct 31, 2018 12:19:50 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 129 files cached, 0 files newly uploaded
Oct 31, 2018 12:19:50 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Oct 31, 2018 12:19:50 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/PrepareWrite/ParDo(Anonymous) as step s2
Oct 31, 2018 12:19:50 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
savePerfsToBigQuery/BatchLoads/JobIdCreationRoot/Read(CreateSource) as step s3
Oct 31, 2018 12:19:50 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding savePerfsToBigQuery/BatchLoads/CreateJobId as step s4
Oct 31, 2018 12:19:50 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
 as step s5
Oct 31, 2018 12:19:50 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
savePerfsToBigQuery/BatchLoads/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey
 as step s6
Oct 31, 2018 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #575

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[migryz] Add script to sync data for Beam GitHub community metrics.

[migryz] Address PR comments.

[swegner] Add GitHub data to source freshness dashboard

[swegner] Set y axis limits to 0-100%

[migryz] Fix unittests

[migryz] Address PR changes

[amyrvold] [BEAM-5735] Contributor guide improvements

[lcwik] [BEAM-5299] Define max timestamp for global window in proto (#6381)

[lcwik] Remove unintended comment in BeamFnLoggingClient (#6796)

[migryz] Add Code Velocity dashboard json

--
[...truncated 4.46 MB...]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:32943 was granted leadership with 
leaderSessionID=f9412a51-bcf8-471e-9484-e573da0724e5
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:32943 , 
session=f9412a51-bcf8-471e-9484-e573da0724e5
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManger
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcherc0be571a-1aff-4f55-b1b0-f09937a35d5f .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@659fbcb7 @ 
akka://flink/user/dispatcherc0be571a-1aff-4f55-b1b0-f09937a35d5f
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcherc0be571a-1aff-4f55-b1b0-f09937a35d5f was granted 
leadership with fencing token cf4883e2-989e-44c5-8261-a03801e06ff3
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcherc0be571a-1aff-4f55-b1b0-f09937a35d5f , 
session=cf4883e2-989e-44c5-8261-a03801e06ff3
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
2b30bb4548b9cf3ee8873592adf93288 (test_windowing_1540945183.99).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1540945183.99 (2b30bb4548b9cf3ee8873592adf93288).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1540945183.99 
(2b30bb4548b9cf3ee8873592adf93288).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/40011289-5723-4006-9c63-448eaf0e093a .
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1540945183.99 (2b30bb4548b9cf3ee8873592adf93288).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@4e239c8d @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1540945183.99 (2b30bb4548b9cf3ee8873592adf93288) was granted 
leadership with session id ebca540a-dd6b-4849-a103-d5adbab21fae at 
akka://flink/user/jobmanager_41.

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1790

2018-10-30 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Java_PortabilityApi_GradleBuild #9

2018-10-30 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_PortabilityApi_GradleBuild #8

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[github] Fix Community Metrics Prober test (#6876)

--
[...truncated 1.58 MB...]
INFO: 2018-10-30T19:10:14.940Z: Fusing consumer 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
 into 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:14.990Z: Fusing consumer 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.037Z: Fusing consumer 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
 into 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.124Z: Fusing consumer 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
 into 
BigQueryIO.Write/BatchLoads/JobIdCreationRoot/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.171Z: Fusing consumer 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/ParDo(SplitBoundedSource)
 into 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Impulse
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.208Z: Fusing consumer 
BigQueryIO.Write/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+BigQueryIO.Write/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
 into 
BigQueryIO.Write/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.257Z: Fusing consumer 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
 with random key into 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/ParDo(SplitBoundedSource)
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.295Z: Fusing consumer 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
 into 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.342Z: Fusing consumer 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/ParDo(ReadFromBoundedSource)
 into 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:15.389Z: Fusing consumer 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
 into 
BigQueryIO.Read/BigQueryIO.TypedRead/TriggerIdCreation/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:16.032Z: Executing operation 
BigQueryIO.Write/BatchLoads/SinglePartitionsReshuffle/GroupByKey/Create
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:16.080Z: Executing operation 
BigQueryIO.Read/BigQueryIO.TypedRead/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
Oct 30, 2018 7:10:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-10-30T19:10:16.131Z: Starting 1 workers in 

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1789

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-4783] Fix invalid parameter to set the partitioner in Spark GbK

[timrobertson100] [BEAM-5036] Optimize the FileBasedSink 
WriteOperation.moveToOutput()

[scott] [BEAM-5913] Re-enable parallel builds for Java Jenkins jobs (#6882)

[scott] [BEAM-308] Print warning about using non-public PipelineOptions

--
[...truncated 57.11 MB...]
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.objenesis/objenesis/2.6/639033469776fd37c08358c6b92a4761feb2af4b/objenesis-2.6.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/objenesis-2.6-X_rD9RQFypspFZcKIks-jw.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-transport/4.1.25.Final/19a6f1f649894b6705aa9d8cbcced188dff133b0/netty-transport-4.1.25.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/netty-transport-4.1.25.Final-3-E9kW1IVWJlpjxjfs5xBA.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpcore/4.4.6/e3fd8ced1f52c7574af952e2e6da0df8df08eb82/httpcore-4.4.6.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/httpcore-4.4.6-qfvVA-CAJQfv7q_7VrvfUg.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.esotericsoftware.kryo/kryo/2.21/9a4e69cff8d225729656f7e97e40893b23bffef/kryo-2.21.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/kryo-2.21-olkSUBNMYGe-WXkdlwcytQ.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.4/d94ae6d7d27242eaa4b6c323f881edbb98e48da6/snappy-java-1.1.4.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/snappy-java-1.1.4-SFNwbMuGq13aaoKVzeS1Tw.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.9/9ce04e34240f674bc72680f8b843b1457383161a/commons-codec-1.9.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/commons-codec-1.9-dWFTVmBcgSgBPanjrGKiSQ.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.esotericsoftware.minlog/minlog/1.2/59bfcd171d82f9981a5e242b9e840191f650e209/minlog-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/minlog-1.2-98-99jtn3wu_pMfLJgiFvA.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.guava/guava/20.0/89507701249388e1ed5ddcf8c41f4ce1be7831ef/guava-20.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/guava-20.0-8yqKJSRiDb7Mn2v2ogwpPw.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.squareup.okhttp/okhttp/2.5.0/4de2b4ed3445c37ec1720a7d214712e845a24636/okhttp-2.5.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/okhttp-2.5.0-64v0X4G_nxfR_PsuymOqpg.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.beust/jcommander/1.27/58c9cbf0f1fa296f93c712f2cf46de50471920f9/jcommander-1.27.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1030191350-316c1f76/output/results/staging/jcommander-1.27-YxYbyOYD5gurYr1hw0nzcA.jar
Oct 30, 2018 7:13:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1565

2018-10-30 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Java_PVR_Flink #166

2018-10-30 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1564

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[github] Fix Community Metrics Prober test (#6876)

--
[...truncated 165.96 KB...]
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ERROR
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

======================================================================
ERROR: test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File 
"
 line 149, in test_empty_singleton_side_input
pipeline.run()
  File 
"
 line 107, in run
else test_runner_api))
  File 
"
 line 405, in run
self._options).run(False)
  File 
"
 line 418, in run
return self.runner.run_pipeline(self)
  File 
"
 line 52, in run_pipeline
self.result = super(TestDataflowRunner, self).run_pipeline(pipeline)
  File 
"
 line 394, in run_pipeline
self.dataflow_client.create_job(self.job), self)
  File 
"
 line 184, in wrapper
return fun(*args, **kwargs)
  File 
"
 line 496, in create_job
self.create_job_description(job)
  File 
"
 line 525, in create_job_description
resources = self._stage_resources(job.options)
  File 
"
 line 458, in _stage_resources
staging_location=google_cloud_options.staging_location)
  File 
"
 line 268, in stage_job_resources
shutil.rmtree(temp_dir)
  File "/usr/lib/python2.7/shutil.py", line 239, in rmtree
onerror(os.listdir, path, sys.exc_info())
  File "/usr/lib/python2.7/shutil.py", line 237, in rmtree
names = os.listdir(path)
OSError: [Errno 2] No such file or directory: '/tmp/tmpldn_KL'
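The `OSError` above is the familiar cleanup race: `stage_job_resources` calls `shutil.rmtree` on a temp directory that has already disappeared (for example, removed by a parallel test). A hedged sketch of the defensive pattern, not Beam's actual code:

```python
import os
import shutil
import tempfile

def cleanup_temp_dir(temp_dir):
    """Remove temp_dir, tolerating the case where it is already gone.

    A bare shutil.rmtree raises OSError (errno 2) if the directory
    vanished between creation and cleanup; ignore_errors=True makes
    the cleanup idempotent.
    """
    shutil.rmtree(temp_dir, ignore_errors=True)

# Usage: cleanup succeeds even after the directory was removed elsewhere.
d = tempfile.mkdtemp()
os.rmdir(d)           # simulate another process deleting it first
cleanup_temp_dir(d)   # no OSError, unlike a bare shutil.rmtree(d)
```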

----------------------------------------------------------------------
XML: 

----------------------------------------------------------------------
Ran 16 tests in 1038.419s

FAILED (errors=1)

> Task :beam-sdks-python:validatesRunnerBatchTests FAILED
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for 
':',5,main]) completed. Took 17 mins 20.653 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for 
':',5,main]) started.

> Task :beam-sdks-python:validatesRunnerStreamingTests
Caching disabled for task ':beam-sdks-python:validatesRunnerStreamingTests': 
Caching has not been enabled for the task
Task ':beam-sdks-python:validatesRunnerStreamingTests' is not up-to-date 
because:
  Task has not declared any outputs despite executing actions.
Starting process 

Jenkins build is back to normal : beam_Prober_CommunityMetrics #26

2018-10-30 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #570

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[github] Fix Community Metrics Prober test (#6876)

--
[...truncated 4.45 MB...]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:36149 was granted leadership with 
leaderSessionID=6c6e8c8d-2eda-4065-bf3f-6621c8de7b90
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:36149 , 
session=6c6e8c8d-2eda-4065-bf3f-6621c8de7b90
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcherb987afdf-707f-4268-b3c1-6c2719a83b49 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@175d8918 @ 
akka://flink/user/dispatcherb987afdf-707f-4268-b3c1-6c2719a83b49
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcherb987afdf-707f-4268-b3c1-6c2719a83b49 was granted 
leadership with fencing token e570a36f-8f9e-4b73-bf26-eb330bfa4fd0
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcherb987afdf-707f-4268-b3c1-6c2719a83b49 , 
session=e570a36f-8f9e-4b73-bf26-eb330bfa4fd0
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
916c8a307714742bd5758684322a4998 (test_windowing_1540921068.75).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1540921068.75 (916c8a307714742bd5758684322a4998).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1540921068.75 
(916c8a307714742bd5758684322a4998).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/e0471779-df18-43d2-b7a3-58bd10a1bb72 .
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1540921068.75 (916c8a307714742bd5758684322a4998).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@558980e0 @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1540921068.75 (916c8a307714742bd5758684322a4998) was granted 
leadership with session id 60ea2427-e0b8-4828-81b7-520245596e38 at 
akka://flink/user/jobmanager_41.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1540921068.75 (916c8a307714742bd5758684322a4998)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1540921068.75 (916c8a307714742bd5758684322a4998) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 14Create/Impulse.None/beam:env:docker:v1:0 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1563

2018-10-30 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink #165

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-4783] Fix invalid parameter to set the partitioner in Spark GbK

[timrobertson100] [BEAM-5036] Optimize the FileBasedSink 
WriteOperation.moveToOutput()

[scott] [BEAM-5913] Re-enable parallel builds for Java Jenkins jobs (#6882)

[scott] [BEAM-308] Print warning about using non-public PipelineOptions

--
[...truncated 499.12 MB...]
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(3/16) (a3ff4fef6929568ea4c10f73ed627534).
[DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) 
(12/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all 
FileSystem streams are closed for task DataSink 
(org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) (12/16) 
(89fb22303f1b20e405236ff2b266f6ec) [FINISHED]
[jobmanager-future-thread-9] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink 
(org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) (3/16) 
(e08c2c0eb566cf8fe7a9f6cfc59dcc28) switched from CREATED to SCHEDULED.
[DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) 
(16/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink 
(org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) (16/16) 
(2cf68514a1f4ece6090116c78168e4c2) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(3/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all 
FileSystem streams are closed for task MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(3/16) (a3ff4fef6929568ea4c10f73ed627534) [FINISHED]
[DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) 
(16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem 
stream leak safety net for task DataSink 
(org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) (16/16) 
(2cf68514a1f4ece6090116c78168e4c2) [DEPLOYING]
[DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) 
(16/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for 
task DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@33a46ad9) 
(16/16) (2cf68514a1f4ece6090116c78168e4c2) [DEPLOYING].
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(13/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition 
(MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(13/16) (ac656bbd3d4ea9fdfceffdd3b5c8ac0d) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task 
resources for MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(13/16) (ac656bbd3d4ea9fdfceffdd3b5c8ac0d).
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(2/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition 
(MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(2/16) (79c6e81d8fadb4f4df1de9bf6bb702cc) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(2/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources 
for MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(2/16) (79c6e81d8fadb4f4df1de9bf6bb702cc).
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(13/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all 
FileSystem streams are closed for task MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(13/16) (ac656bbd3d4ea9fdfceffdd3b5c8ac0d) [FINISHED]
[MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(2/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all 
FileSystem streams are closed for task MapPartition (MapPartition at 
PAssert$148/GroupGlobally/GroupDummyAndContents.out/beam:env:docker:v1:0) 
(2/16) (79c6e81d8fadb4f4df1de9bf6bb702cc) [FINISHED]
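The "switched from CREATED to SCHEDULED / DEPLOYING / RUNNING to FINISHED" lines above trace Flink's per-task state machine. Below is a minimal sketch of only the transitions visible in this log (the ExecutionGraph logs CREATED to SCHEDULED, while the TaskManager-side Task logs CREATED to DEPLOYING directly); real Flink has additional states such as CANCELING and FAILED, so this is an illustration, not Flink's actual implementation.

```python
# Simplified subset of the task lifecycle visible in the log lines above.
# Real Flink (org.apache.flink.runtime.execution.ExecutionState) has more
# states and transitions; this only models what this log excerpt shows.
ALLOWED = {
    "CREATED": {"SCHEDULED", "DEPLOYING"},
    "SCHEDULED": {"DEPLOYING"},
    "DEPLOYING": {"RUNNING"},
    "RUNNING": {"FINISHED"},
    "FINISHED": set(),  # terminal
}

def switch(state, new_state):
    """Validate a 'switched from X to Y' transition like those logged."""
    if new_state not in ALLOWED[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state
```

Each log line of the form "switched from X to Y" corresponds to one `switch(X, Y)` call succeeding.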
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition 
(MapPartition at 

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1562

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[iemejia] [BEAM-4783] Fix invalid parameter to set the partitioner in Spark GbK

--
[...truncated 169.95 KB...]
  File 
"
 line 1208, in cancel
return self.state
  File 
"
 line 1148, in state
self._update_job()
  File 
"
 line 1104, in _update_job
self._job = self._runner.dataflow_client.get_job(self.job_id())
  File 
"
 line 184, in wrapper
return fun(*args, **kwargs)
  File 
"
 line 635, in get_job
response = self._client.projects_locations_jobs.Get(request)
  File 
"
 line 604, in Get
config, request, global_params=global_params)
  File 
"
 line 720, in _RunMethod
http, http_request, **opts)
  File 
"
 line 346, in MakeRequest
check_response_func=check_response_func)
  File 
"
 line 396, in _MakeRequestNoRetry
redirections=redirections, connection_type=connection_type)
  File 
"
 line 175, in new_request
redirections, connection_type)
  File 
"
 line 282, in request
connection_type=connection_type)
  File 
"
 line 1694, in request
(response, content) = self._request(conn, authority, uri, request_uri, 
method, body, headers, redirections, cachekey)
  File 
"
 line 1434, in _request
(response, content) = self._conn_request(conn, request_uri, method, body, 
headers)
  File 
"
 line 1390, in _conn_request
response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1136, in getresponse
response.begin()
  File "/usr/lib/python2.7/httplib.py", line 453, in begin
version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 409, in _read_status
line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
v = self._sslobj.read(len)
  File 
"
 line 276, in signalhandler
raise TimedOutException()
TimedOutException: 'test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest)'

--
XML: 

--
Ran 16 tests in 3498.662s

FAILED (errors=1)

> Task :beam-sdks-python:validatesRunnerBatchTests FAILED
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for 
':',5,main]) completed. Took 58 mins 21.087 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker 

Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #568

2018-10-30 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PerformanceTests_HadoopInputFormat #943

2018-10-30 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_Prober_CommunityMetrics #25

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 3.16 KB...]
Selected primary task 'build' from project :
file or directory 
'
 not found
:buildSrc:compileJava (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
started.
Using local directory build cache for build ':buildSrc' (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).

> Task :buildSrc:compileJava NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileJava' as it has no source files and no previous 
output files.
:buildSrc:compileJava (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.087 secs.
:buildSrc:compileGroovy (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
started.

> Task :buildSrc:compileGroovy FROM-CACHE
Build cache key for task ':buildSrc:compileGroovy' is 
3e10f9fa451895a42bbc626745f0364b
Task ':buildSrc:compileGroovy' is not up-to-date because:
  No history is available.
Origin for task ':buildSrc:compileGroovy': {executionTime=2216, 
hostName=apache-beam-jenkins-slave-group-t4pj, operatingSystem=Linux, 
buildInvocationId=4hwjrypmjvdedi4fwbsznogq4e, creationTime=1540789472558, 
type=org.gradle.api.tasks.compile.GroovyCompile_Decorated, userName=jenkins, 
gradleVersion=4.10.2, 
rootPath=/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Java_Examples_Dataflow_Commit/src/buildSrc,
 path=:compileGroovy}
Unpacked output for task ':buildSrc:compileGroovy' from cache.
:buildSrc:compileGroovy (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.106 secs.
:buildSrc:processResources (Thread[Task worker for ':buildSrc' Thread 
3,5,main]) started.

> Task :buildSrc:processResources NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:processResources' as it has no source files and no 
previous output files.
:buildSrc:processResources (Thread[Task worker for ':buildSrc' Thread 
3,5,main]) completed. Took 0.005 secs.
:buildSrc:classes (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:classes UP-TO-DATE
Skipping task ':buildSrc:classes' as it has no actions.
:buildSrc:classes (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.001 secs.
:buildSrc:jar (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:jar
Build cache key for task ':buildSrc:jar' is 238bb4cf199eab9785d8eff2b7f77208
Caching disabled for task ':buildSrc:jar': Caching has not been enabled for the 
task
Task ':buildSrc:jar' is not up-to-date because:
  No history is available.
:buildSrc:jar (Thread[Task worker for ':buildSrc' Thread 3,5,main]) completed. 
Took 0.15 secs.
:buildSrc:assemble (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
started.

> Task :buildSrc:assemble
Skipping task ':buildSrc:assemble' as it has no actions.
:buildSrc:assemble (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.0 secs.
:buildSrc:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
started.

> Task :buildSrc:spotlessGroovy
file or directory 
'
 not found
file or directory 
'
 not found
file or directory 
'
 not found
Caching disabled for task ':buildSrc:spotlessGroovy': Caching has not been 
enabled for the task
Task ':buildSrc:spotlessGroovy' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovy'.
file or directory 
'
 not found
:buildSrc:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 1.806 secs.
:buildSrc:spotlessGroovyCheck (Thread[Task worker for ':buildSrc' Thread 
3,5,main]) started.

> Task :buildSrc:spotlessGroovyCheck
Skipping task ':buildSrc:spotlessGroovyCheck' as it has no actions.
:buildSrc:spotlessGroovyCheck (Thread[Task worker for ':buildSrc' Thread 
3,5,main]) completed. Took 0.001 secs.
:buildSrc:spotlessGroovyGradle (Thread[Task worker for ':buildSrc' Thread 
3,5,main]) started.

> Task :buildSrc:spotlessGroovyGradle
Caching disabled for task ':buildSrc:spotlessGroovyGradle': Caching has not 
been enabled for the task
Task ':buildSrc:spotlessGroovyGradle' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #567

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 4.45 MB...]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:38557 , 
session=441e9891-5daf-4560-9986-ddfc496ed015
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcher5338b2ff-0202-45aa-8459-04ce0c292719 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@38147226 @ 
akka://flink/user/dispatcher5338b2ff-0202-45aa-8459-04ce0c292719
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcher5338b2ff-0202-45aa-8459-04ce0c292719 was granted 
leadership with fencing token 084b0904-2727-458b-a296-ff9e4f3d15cc
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcher5338b2ff-0202-45aa-8459-04ce0c292719 , 
session=084b0904-2727-458b-a296-ff9e4f3d15cc
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
e8d97fe61b43ff2d6bac92dc6cd8cd1e (test_windowing_1540907126.62).
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1540907126.62 (e8d97fe61b43ff2d6bac92dc6cd8cd1e).
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1540907126.62 
(e8d97fe61b43ff2d6bac92dc6cd8cd1e).
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/b5d3da76-8fe9-45e4-aced-af4effe77269 .
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1540907126.62 (e8d97fe61b43ff2d6bac92dc6cd8cd1e).
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@7148c464 @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1540907126.62 (e8d97fe61b43ff2d6bac92dc6cd8cd1e) was granted 
leadership with session id 46319b19-79f2-443c-ba84-927324202acb at 
akka://flink/user/jobmanager_41.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1540907126.62 (e8d97fe61b43ff2d6bac92dc6cd8cd1e)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1540907126.62 (e8d97fe61b43ff2d6bac92dc6cd8cd1e) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 14Create/Impulse.None/beam:env:docker:v1:0 -> ToKeyedWorkItem (1/1) 
(757afa4f2ba0d88b76cca1e1a9693083) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 26assert_that/Create/Impulse.None/beam:env:docker:v1:0 (1/1) 

Build failed in Jenkins: beam_Release_Gradle_NightlySnapshot #223

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[klk] Fix IWYU error in examples/java test deps

[klk] [BEAM-3573] Move GCPIO test utilities to main jar

[scott] Remove space from Community Metrics pre-commit job name

[kenn] [BEAM-5817] Use an explicit enum for naming Nexmark benchmarks

[lukasz.gajowy] Revert "Switch SQL over to use the new Beam Group transform."

[kenn] Revert "Re-enable parallel build of Java PreCommit"

[kenn] Revert "[BEAM-5887] Fix classifier for unshaded tests"

[kenn] Rollforward "[BEAM-5887] Fix classifier for unshaded tests"

[mxm] [BEAM-5909] Filter timer PCollections in QueryablePipelines#getUserState

[github] Link to Timely Processing blog post in Programming Guide

[github] use site.baseurl

[thw] Fix IntelliJ Wiki link, remove Eclipse.

[github] Fit and finish changes for pre- and post-commit dashboards (#6854)

[mwylde] [BEAM-5724] Generalize flink executable context to allow more than 1

[lcwik] [BEAM-5800] Support WordCountIT using fn-api worker (#6760)

[scott] Flesh out the roll-forward case in postcommit policy (#6873)

[25622840+adude3141] add countdownlatch to df worker test

[lcwik] [BEAM-5801] Support postcommit ITs using fn-api worker (#6762)

[github] Blog post for 2.8.0 release (#6852)

--
[...truncated 40.55 MB...]
:beam-vendor-sdks-java-extensions-protobuf:test (Thread[Task worker for ':' 
Thread 5,5,main]) completed. Took 0.001 secs.
:beam-vendor-sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
 (Thread[Task worker for ':' Thread 5,5,main]) started.

> Task 
> :beam-vendor-sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
Caching disabled for task 
':beam-vendor-sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonOrgApacheBeamClasses':
 Caching has not been enabled for the task
Task 
':beam-vendor-sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonOrgApacheBeamClasses'
 is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-vendor-sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
 (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.032 secs.
:beam-vendor-sdks-java-extensions-protobuf:check (Thread[Task worker for ':' 
Thread 5,5,main]) started.

> Task :beam-vendor-sdks-java-extensions-protobuf:check
Skipping task ':beam-vendor-sdks-java-extensions-protobuf:check' as it has no 
actions.
:beam-vendor-sdks-java-extensions-protobuf:check (Thread[Task worker for ':' 
Thread 5,5,main]) completed. Took 0.0 secs.
:beam-vendor-sdks-java-extensions-protobuf:build (Thread[Task worker for ':' 
Thread 7,5,main]) started.

> Task :beam-vendor-sdks-java-extensions-protobuf:build
Skipping task ':beam-vendor-sdks-java-extensions-protobuf:build' as it has no 
actions.
:beam-vendor-sdks-java-extensions-protobuf:build (Thread[Task worker for ':' 
Thread 7,5,main]) completed. Took 0.0 secs.
:beam-website:assemble (Thread[Task worker for ':' Thread 7,5,main]) started.

> Task :beam-website:assemble UP-TO-DATE
Skipping task ':beam-website:assemble' as it has no actions.
:beam-website:assemble (Thread[Task worker for ':' Thread 7,5,main]) completed. 
Took 0.0 secs.
:beam-website:setupBuildDir (Thread[Task worker for ':' Thread 7,5,main]) 
started.

> Task :beam-website:setupBuildDir
Build cache key for task ':beam-website:setupBuildDir' is 
e17a1504b1e4eb6e1d3dfd5f3dd82a55
Caching disabled for task ':beam-website:setupBuildDir': Caching has not been 
enabled for the task
Task ':beam-website:setupBuildDir' is not up-to-date because:
  No history is available.
:beam-website:setupBuildDir (Thread[Task worker for ':' Thread 7,5,main]) 
completed. Took 0.039 secs.
:beam-website:buildDockerImage (Thread[Task worker for ':' Thread 7,5,main]) 
started.

> Task :beam-website:buildDockerImage
Caching disabled for task ':beam-website:buildDockerImage': Caching has not 
been enabled for the task
Task ':beam-website:buildDockerImage' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: 

 Command: docker build -t beam-website .
Successfully started process 'command 'docker''
Sending build context to Docker daemon   26.3MB
Step 1/7 : FROM ruby:2.5
 ---> ef118940ccc5
Step 2/7 : WORKDIR /ruby
 ---> Using cache
 ---> fc91fe574469
Step 3/7 : RUN gem install bundler
 ---> Using cache
 ---> 16fc5328ccb6
Step 4/7 : ADD Gemfile Gemfile.lock /ruby/
 ---> Using cache
 ---> 682e85ec7e99
Step 5/7 : RUN bundle install --deployment --path $GEM_HOME
 ---> Using cache
 ---> 5b1b365d1a4d
Step 6/7 : ENV LC_ALL C.UTF-8
 ---> Using cache
 ---> 9cee326336a3
Step 7/7 : CMD sleep 3600
 ---> Using cache
 ---> f3216fa9b8ea
Successfully built f3216fa9b8ea
Successfully tagged 
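The seven cached build steps above reconstruct to the following Dockerfile for the `beam-website` image, as recoverable from this log alone (the file actually checked into the repository may differ in comments or ordering):

```dockerfile
FROM ruby:2.5
WORKDIR /ruby
RUN gem install bundler
ADD Gemfile Gemfile.lock /ruby/
RUN bundle install --deployment --path $GEM_HOME
ENV LC_ALL C.UTF-8
CMD sleep 3600
```

Every layer hits the cache ("---> Using cache"), which is why the image builds in seconds and ends at the same ID, f3216fa9b8ea, as previous runs.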

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #942

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[25622840+adude3141] add countdownlatch to df worker test

[github] Blog post for 2.8.0 release (#6852)

--
[...truncated 456.16 KB...]
at 
org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
at 
org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at 
org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:609)
at 
org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:84)
at 
org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at 
com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:181)
... 21 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:257)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.&lt;init&gt;(PgConnection.java:195)
at org.postgresql.Driver.makeConnection(Driver.java:452)
at org.postgresql.Driver.connect(Driver.java:254)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:94)
at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:79)
at 
org.apache.commons.dbcp2.DataSourceConnectionFactory.createConnection(DataSourceConnectionFactory.java:44)
at 
org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
at 
org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:868)
at 
org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
at 
org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
at 
org.apache.commons.dbcp2.PoolingDataSource.getConnection(PoolingDataSource.java:134)
at 
org.apache.beam.sdk.io.jdbc.JdbcIO$Write$WriteFn.startBundle(JdbcIO.java:791)
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at 
java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at 
java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at 
java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.&lt;init&gt;(PGStream.java:69)
at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:156)
at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.&lt;init&gt;(PgConnection.java:195)
at org.postgresql.Driver.makeConnection(Driver.java:452)
at org.postgresql.Driver.connect(Driver.java:254)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at 
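The root cause in the trace above is a plain TCP connect that never completes (`java.net.SocketTimeoutException: connect timed out`) while `JdbcIO$Write$WriteFn.startBundle` is opening a pooled Postgres connection. The same failure mode, and the bounded-timeout handling that makes it fail fast instead of hanging a bundle, can be sketched in a few lines of Python (hypothetical helper name; this is an illustration of the mechanism, not Beam's code):

```python
import socket

def try_connect(host, port, timeout_s=2.0):
    """Attempt a TCP connect with a hard deadline; return True on success.

    Mirrors the connect-timed-out path in the Java trace: a bounded
    connect either succeeds or raises quickly, rather than blocking.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:  # covers socket.timeout and unreachable-network errors
        return False
```

In the JDBC world the equivalent knobs are the driver's connect/login timeout settings; without them, an unreachable database host leaves the worker blocked until the OS-level TCP timeout fires, as happened here.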

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #566

2018-10-30 Thread Apache Jenkins Server
See 


--
[...truncated 4.45 MB...]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:46733 was granted leadership with 
leaderSessionID=13f28da4-0a03-49dd-bebe-9a689d93aa81
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:46733 , 
session=13f28da4-0a03-49dd-bebe-9a689d93aa81
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcher7ad245a6-d158-4dad-b14a-b0bb2f2a86dd .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@5c5547b0 @ 
akka://flink/user/dispatcher7ad245a6-d158-4dad-b14a-b0bb2f2a86dd
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcher7ad245a6-d158-4dad-b14a-b0bb2f2a86dd was granted 
leadership with fencing token 8174f6e2-dc02-4c77-8adc-405fa6e278fb
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcher7ad245a6-d158-4dad-b14a-b0bb2f2a86dd , 
session=8174f6e2-dc02-4c77-8adc-405fa6e278fb
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
fd6f0cbf16c1f247050721c7e4196556 (test_windowing_1540879928.49).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1540879928.49 (fd6f0cbf16c1f247050721c7e4196556).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1540879928.49 
(fd6f0cbf16c1f247050721c7e4196556).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/9f8e7f60-e399-49fc-ad4e-f947e3922787 .
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1540879928.49 (fd6f0cbf16c1f247050721c7e4196556).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@49a54f1d @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1540879928.49 (fd6f0cbf16c1f247050721c7e4196556) was granted 
leadership with session id 31e74ae3-2773-42c9-80e6-b852b9923423 at 
akka://flink/user/jobmanager_41.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1540879928.49 (fd6f0cbf16c1f247050721c7e4196556)
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1540879928.49 (fd6f0cbf16c1f247050721c7e4196556) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 14Create/Impulse.None/beam:env:docker:v1:0 -> ToKeyedWorkItem (1/1) 
(5134a76cd2770564fc50d6a2aca5) switched from 

Build failed in Jenkins: beam_Prober_CommunityMetrics #24

2018-10-30 Thread Apache Jenkins Server
See 


Changes:

[25622840+adude3141] add countdownlatch to df worker test

[github] Blog post for 2.8.0 release (#6852)

--
[...truncated 3.37 KB...]
file or directory 
'
 not found
:buildSrc:compileJava (Thread[Task worker for ':buildSrc' Thread 2,5,main]) 
started.
Using local directory build cache for build ':buildSrc' (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).

> Task :buildSrc:compileJava NO-SOURCE
file or directory ' not found
Skipping task ':buildSrc:compileJava' as it has no source files and no previous 
output files.
:buildSrc:compileJava (Thread[Task worker for ':buildSrc' Thread 2,5,main]) 
completed. Took 0.126 secs.
:buildSrc:compileGroovy (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:compileGroovy FROM-CACHE
Build cache key for task ':buildSrc:compileGroovy' is 
3e10f9fa451895a42bbc626745f0364b
Task ':buildSrc:compileGroovy' is not up-to-date because:
  No history is available.
Origin for task ':buildSrc:compileGroovy': {executionTime=3382, 
hostName=apache-beam-jenkins-slave-group-rjdl, operatingSystem=Linux, 
buildInvocationId=tojr7di4vze4dkkydfs5r4sbba, creationTime=1540758955733, 
type=org.gradle.api.tasks.compile.GroovyCompile_Decorated, userName=jenkins, 
gradleVersion=4.10.2, 
rootPath=/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Java_Commit/src/buildSrc,
 path=:compileGroovy}
Unpacked output for task ':buildSrc:compileGroovy' from cache.
:buildSrc:compileGroovy (Thread[Task worker for ':buildSrc',5,main]) completed. 
Took 0.159 secs.
:buildSrc:processResources (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:processResources NO-SOURCE
file or directory ' not found
Skipping task ':buildSrc:processResources' as it has no source files and no 
previous output files.
:buildSrc:processResources (Thread[Task worker for ':buildSrc',5,main]) 
completed. Took 0.005 secs.
:buildSrc:classes (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:classes UP-TO-DATE
Skipping task ':buildSrc:classes' as it has no actions.
:buildSrc:classes (Thread[Task worker for ':buildSrc',5,main]) completed. Took 
0.0 secs.
:buildSrc:jar (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:jar
Build cache key for task ':buildSrc:jar' is 238bb4cf199eab9785d8eff2b7f77208
Caching disabled for task ':buildSrc:jar': Caching has not been enabled for the 
task
Task ':buildSrc:jar' is not up-to-date because:
  No history is available.
:buildSrc:jar (Thread[Task worker for ':buildSrc',5,main]) completed. Took 
0.167 secs.
:buildSrc:assemble (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
started.

> Task :buildSrc:assemble
Skipping task ':buildSrc:assemble' as it has no actions.
:buildSrc:assemble (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.0 secs.
:buildSrc:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
started.

> Task :buildSrc:spotlessGroovy
file or directory ' not found
file or directory ' not found
file or directory ' not found
Caching disabled for task ':buildSrc:spotlessGroovy': Caching has not been 
enabled for the task
Task ':buildSrc:spotlessGroovy' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovy'.
file or directory ' not found
:buildSrc:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 2.732 secs.
:buildSrc:spotlessGroovyCheck (Thread[Task worker for ':buildSrc',5,main]) 
started.

> Task :buildSrc:spotlessGroovyCheck
Skipping task ':buildSrc:spotlessGroovyCheck' as it has no actions.
:buildSrc:spotlessGroovyCheck (Thread[Task worker for ':buildSrc',5,main]) 
completed. Took 0.001 secs.
:buildSrc:spotlessGroovyGradle (Thread[Task worker for ':buildSrc',5,main]) 
started.

> Task :buildSrc:spotlessGroovyGradle
Caching disabled for task ':buildSrc:spotlessGroovyGradle': Caching has not 
been enabled for the task
Task ':buildSrc:spotlessGroovyGradle' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task