Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1455

2018-11-14 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1890

2018-11-14 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_Release_Gradle_NightlySnapshot #238

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Add tests for assert_that with cases that cover negative numbers and

[kedin] [SQL] Add extra cast tests for string->date parsing

[github] Update flink.md

[robertwb] Add coder to timer spec, populate in SDKs.

[robertwb] [BEAM-5999] Generate input and output collections from timer specs.

[markliu] [BEAM-5788] Update storage_v1 API

[ehudm] Properly clean up docker-compose artifacts.

[chamikara] Adds a roadmap section for connectors

[robertwb] Heap-based global top.

[coheigea] Remove some unnecessary type definitions

[vaclav.plajt] [BEAM-6054] Euphoria translation providers refactored.

[vaclav.plajt] [BEAM-6054] Review fixes.

[robertwb] [BEAM-6008] Propagate errors through portable runner.

[robertwb] Enable inherited error checking tests.

[robertwb] [BEAM-5692] Send current state to callback on registration.

[ehudm] [BEAM-6047] Try to fix HDFS DNS issues

[supercclank] Add getBigtableProject to BigtableTestOptions and change 
BigtableReadIt

[thw] [BEAM-6034] Flink auto watermark interval pipeline option

[github] [BEAM-6056] Migrate gRPC to use org.apache.beam.vendor.grpc.v1_13_1 as

[Ankur] Revert "Remove unused is_pair_like hack."

[supercclank] Set to empty string instead.

[supercclank] Fix space.

--
[...truncated 43.62 MB...]
file or directory 
'
 not found
Build cache key for task ':beam-vendor-grpc-v1_13_1:sourcesJar' is 
2b50c5905885674a697a325a077c7fa7
Caching disabled for task ':beam-vendor-grpc-v1_13_1:sourcesJar': Caching has 
not been enabled for the task
Task ':beam-vendor-grpc-v1_13_1:sourcesJar' is not up-to-date because:
  No history is available.
file or directory 
'
 not found
file or directory 
'
 not found
:beam-vendor-grpc-v1_13_1:sourcesJar (Thread[Task worker for ':' Thread 
8,5,main]) completed. Took 0.007 secs.
:beam-vendor-grpc-v1_13_1:testSourcesJar (Thread[Task worker for ':' Thread 
8,5,main]) started.

> Task :beam-vendor-grpc-v1_13_1:testSourcesJar
file or directory 
'
 not found
file or directory 
'
 not found
Build cache key for task ':beam-vendor-grpc-v1_13_1:testSourcesJar' is 
2b50c5905885674a697a325a077c7fa7
Caching disabled for task ':beam-vendor-grpc-v1_13_1:testSourcesJar': Caching 
has not been enabled for the task
Task ':beam-vendor-grpc-v1_13_1:testSourcesJar' is not up-to-date because:
  No history is available.
file or directory 
'
 not found
file or directory 
'
 not found
:beam-vendor-grpc-v1_13_1:testSourcesJar (Thread[Task worker for ':' Thread 
8,5,main]) completed. Took 0.003 secs.
:beam-vendor-grpc-v1_13_1:publishMavenJavaPublicationToMavenRepository 
(Thread[Task worker for ':' Thread 8,5,main]) started.

> Task :beam-vendor-grpc-v1_13_1:publishMavenJavaPublicationToMavenRepository
Caching disabled for task 
':beam-vendor-grpc-v1_13_1:publishMavenJavaPublicationToMavenRepository': 
Caching has not been enabled for the task
Task ':beam-vendor-grpc-v1_13_1:publishMavenJavaPublicationToMavenRepository' 
is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Publishing to repository 'maven' 
(https://repository.apache.org/content/repositories/snapshots)
Deploying to https://repository.apache.org/content/repositories/snapshots
Downloading: 
org/apache/beam/beam-vendor-grpc-v1_13_1/2.9.0-SNAPSHOT/maven-metadata.xml from 
repository remote at 
https://repository.apache.org/content/repositories/snapshots/
Resource missing. [HTTP GET: 
https://repository.apache.org/content/repositories/snapshots/org/apache/beam/beam-vendor-grpc-v1_13_1/2.9.0-SNAPSHOT/maven-metadata.xml]
Could not find metadata 
org.apache.beam:beam-vendor-grpc-v1_13_1:2.9.0-SNAPSHOT/maven-metadata.xml in 
remote (https://repository.apache.org/content/repositories/snapshots)
Uploading: 
org/apache/beam/beam-vendor-grpc-v1_13_1/2.9.0-SNAPSHOT/beam-vendor-grpc-v1_13_1-2.9.0-20181114.083426-1.jar
 to repository remote at 
https://repository.apache.org/content/repositories/snapshots/
Uploading: 
org/apache/beam/beam-vendor-grpc-v1_13_1/2.9.0-SNAPSHOT/beam-vendor-grpc-v1_13_1-2.9.0-20181114.083426-1.pom
 to 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #747

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 4.05 MB...]
[flink-runner-job-server] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint 
listening at localhost:33103
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@1e95acea @ 
http://localhost:33103
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManager
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:33103 was granted leadership with 
leaderSessionID=3894623e-24de-4dc3-9e0a-31e0fe1aab82
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:33103 , 
session=3894623e-24de-4dc3-9e0a-31e0fe1aab82
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcher73c64c47-de46-47cc-9db4-d114d42aa34e .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@56be0e6e @ 
akka://flink/user/dispatcher73c64c47-de46-47cc-9db4-d114d42aa34e
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcher73c64c47-de46-47cc-9db4-d114d42aa34e was granted 
leadership with fencing token 93e31a5e-960e-4f07-83f9-bc331f183e09
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcher73c64c47-de46-47cc-9db4-d114d42aa34e , 
session=93e31a5e-960e-4f07-83f9-bc331f183e09
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
c90a93085ebc2c8edcb18eb2c8225348 (test_windowing_1542197522.23).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_39 
.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542197522.23 (c90a93085ebc2c8edcb18eb2c8225348).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542197522.23 
(c90a93085ebc2c8edcb18eb2c8225348).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/d2a1a7ac-fbdc-44bb-a9c7-8c464967653b .
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542197522.23 (c90a93085ebc2c8edcb18eb2c8225348).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@6bfd2c83 @ 
akka://flink/user/jobmanager_39
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1542197522.23 (c90a93085ebc2c8edcb18eb2c8225348) was granted 
leadership with session id e700a5d5-025e-47c3-a108-3a9560bef159 at 
akka://flink/user/jobmanager_39.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1026

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 257.46 KB...]
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-PM_r0wNgIUYjnsw3STy5Nw.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
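The "exceeds the rate limit" failure above is Google Cloud Storage rejecting too-frequent updates to the same staged object; as the subsequent log lines show, staging succeeds on a later attempt. A minimal, hypothetical sketch of the usual mitigation, retrying with jittered exponential backoff (the exception class and callable here are stand-ins, not the actual client API):

```python
import random
import time


class RateLimitError(Exception):
    """Stand-in for the 'rate limit exceeded' error raised by the storage client."""


def upload_with_backoff(upload, max_attempts=5, base_delay=1.0):
    """Retry `upload` (a zero-argument callable) with jittered exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return upload()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # Out of attempts: surface the error to the caller.
            # Wait base_delay * 2^attempt plus jitter so concurrent stagers spread out.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

With a delay of zero the helper simply retries immediately, which is convenient for exercising the logic in tests.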

Nov 14, 2018 12:04:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-PM_r0wNgIUYjnsw3STy5Nw.jar
Nov 14, 2018 12:04:18 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 28 files newly uploaded
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 14, 2018 12:04:19 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #746

2018-11-14 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PreCommit_Java_Cron #588

2018-11-14 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Python_Verify #6558

2018-11-14 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Go_GradleBuild #1676

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[adrianwit] [BEAM-5729] added database/sql based reader and writer

--
[...truncated 669.19 KB...]
{12: int/int[varintz] GLO}
{13: int/int[varintz] GLO}
{14: int/int[varintz] GLO}
Edges: 1: Impulse [] -> [Out: []uint8 -> {1: []uint8/bytes GLO}]
2: ParDo [In(Main): []uint8 <- {1: []uint8/bytes GLO}] -> [Out: T -> {2: 
int/int[varintz] GLO}]
3: ParDo [In(Main): int <- {2: int/int[varintz] GLO}] -> [Out: int -> {3: 
int/int[varintz] GLO} Out: int -> {4: int/int[varintz] GLO} Out: int -> {5: 
int/int[varintz] GLO} Out: int -> {6: int/int[varintz] GLO} Out: int -> {7: 
int/int[varintz] GLO} Out: int -> {8: int/int[varintz] GLO} Out: int -> {9: 
int/int[varintz] GLO}]
4: Flatten [In(Main): int <- {3: int/int[varintz] GLO} In(Main): int <- {4: 
int/int[varintz] GLO} In(Main): int <- {5: int/int[varintz] GLO} In(Main): int 
<- {6: int/int[varintz] GLO} In(Main): int <- {7: int/int[varintz] GLO} 
In(Main): int <- {8: int/int[varintz] GLO} In(Main): int <- {9: 
int/int[varintz] GLO}] -> [Out: int -> {10: int/int[varintz] GLO}]
5: Impulse [] -> [Out: []uint8 -> {11: []uint8/bytes GLO}]
6: ParDo [In(Main): []uint8 <- {11: []uint8/bytes GLO} In(Iter): T <- {10: 
int/int[varintz] GLO} In(Iter): T <- {2: int/int[varintz] GLO}] -> [Out: T -> 
{12: int/int[varintz] GLO} Out: T -> {13: int/int[varintz] GLO} Out: T -> {14: 
int/int[varintz] GLO}]
7: ParDo [In(Main): X <- {12: int/int[varintz] GLO}] -> []
8: ParDo [In(Main): X <- {14: int/int[varintz] GLO}] -> []
2018/11/15 00:41:21 Plan[plan]:
12: Impulse[0]
13: Impulse[0]
1: ParDo[passert.failFn] Out:[]
2: Discard
3: ParDo[passert.failFn] Out:[]
4: ParDo[passert.diffFn] Out:[1 2 3]
5: wait[2] Out:4
6: buffer[6]. wait:5 Out:4
7: buffer[7]. wait:5 Out:4
8: Flatten[7]. Out:buffer[6]. wait:5 Out:4
9: ParDo[beam.partitionFn] Out:[8 8 8 8 8 8 8]
10: Multiplex. Out:[9 7]
11: ParDo[beam.createFn] Out:[10]
2018/11/15 00:41:21 wait[5] unblocked w/ 1 [false]
2018/11/15 00:41:21 wait[5] done
--- PASS: TestPartitionFlattenIdentity (0.00s)
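The plan above exercises a partition-then-flatten round trip: `beam.partitionFn` splits the input across seven outputs, `Flatten` merges them back, and `passert.diffFn` checks the result against the original input. The invariant being tested can be sketched in plain Python (helper names here are illustrative, not the Go SDK's API):

```python
def partition(elements, n, key=lambda x: x):
    """Split elements into n buckets by key(x) % n (mirrors a partition fn)."""
    buckets = [[] for _ in range(n)]
    for x in elements:
        buckets[key(x) % n].append(x)
    return buckets


def flatten(buckets):
    """Concatenate the buckets back into a single list."""
    return [x for bucket in buckets for x in bucket]


# Partitioning then flattening should preserve the multiset of elements.
xs = list(range(10))
assert sorted(flatten(partition(xs, 7))) == xs
```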
=== RUN   Example_metricsDeclaredAnywhere
--- PASS: Example_metricsDeclaredAnywhere (0.00s)
=== RUN   Example_metricsReusable
--- PASS: Example_metricsReusable (0.00s)
PASS
coverage: 44.8% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam 0.017s  coverage: 44.8% of 
statements
=== RUN   TestOptions
--- PASS: TestOptions (0.00s)
=== RUN   TestKey
--- PASS: TestKey (0.00s)
=== RUN   TestRegister
--- PASS: TestRegister (0.00s)
PASS
coverage: 48.5% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/runtime0.003s  
coverage: 48.5% of statements
=== RUN   TestMergeMaps
--- PASS: TestMergeMaps (0.00s)
=== RUN   TestShallowClone
--- PASS: TestShallowClone (0.00s)
=== RUN   TestShallowCloneNil
--- PASS: TestShallowCloneNil (0.00s)
PASS
coverage: 6.4% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx  0.003s  
coverage: 6.4% of statements

> Task :beam-sdks-go:test FAILED
Test for github.com/apache/beam/sdks/go/pkg/beam finished, 7 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/runtime:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/runtime finished, 3 
completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx finished, 3 
completed, 0 failed
Generating HTML test report...
Finished generating test html results (0.145 secs) into: 

Invalidating in-memory cache of 

:beam-sdks-go:test (Thread[Task worker for ':' Thread 3,5,main]) completed. 
Took 26.815 secs.
:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 3,5,main]) 
started.

> Task :beam-sdks-go-container:prepare
Caching disabled for task ':beam-sdks-go-container:prepare': Caching has not 
been enabled for the task
Task ':beam-sdks-go-container:prepare' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Use project GOPATH: 

:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 3,5,main]) 
completed. Took 0.002 secs.
:beam-sdks-go-container:resolveBuildDependencies (Thread[Task worker for ':' 
Thread 3,5,main]) started.

> Task :beam-sdks-go-container:resolveBuildDependencies UP-TO-DATE
Build cache key for task ':beam-sdks-go-container:resolveBuildDependencies' is 
ee37f5260adc7cd3ce2d7ead07ac1ecc
Caching disabled for task ':beam-sdks-go-container:resolveBuildDependencies': 
Caching has not been enabled for the task
Skipping task 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #95

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[scott] Roll-forward "Merge pull request #6979: Add helper task to print

[scott] Share docker image name along with shared task

[mxm] ParDoTest: Check expected errors inline instead of after pipeline

[altay] fix lint error in userstate.py

[mxm] Add output collection id to timer specs.

[mxm] Move doFn wrapper initialization to open()

[mxm] [BEAM-6009] Add note about default mode for PortableValidatesRunner

[mxm] [BEAM-4681] Add support for portable timers in Flink streaming mode

[altay] Convert Top combiner to a full PTransform

--
[...truncated 18.21 MB...]
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.353Z: Fusing consumer 
PAssert$27/VerifyAssertions/ParDo(DefaultConclude) into PAssert$27/RunChecks
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.377Z: Fusing consumer PAssert$27/RunChecks into 
PAssert$27/GetPane/Map
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.413Z: Fusing consumer 
PAssert$27/GroupGlobally/Values/Values/Map into 
PAssert$27/GroupGlobally/GroupDummyAndContents/GroupByWindow
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.448Z: Fusing consumer 
PAssert$27/GroupGlobally/ParDo(Concat) into 
PAssert$27/GroupGlobally/Values/Values/Map
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.482Z: Unzipping flatten s43-u80 for input 
s45-reify-value58-c78
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.515Z: Fusing unzipped copy of 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write, through flatten s43-u80, 
into producer PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.550Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify into 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.584Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write into 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.610Z: Fusing consumer 
Create123/Read(CreateSource)/ParDo(ReadFromBoundedSource) into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.632Z: Fusing consumer 
PAssert$27/GroupGlobally/Window.Into()/Window.Assign into OutputSideInputs
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.667Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.701Z: Fusing consumer 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/GroupByWindow
 into 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.737Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.762Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 14, 2018 11:49:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T23:49:41.796Z: Fusing consumer 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign into 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1035

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[adrianwit] [BEAM-5729] added database/sql based reader and writer

--
[...truncated 242.11 KB...]
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 15, 2018 12:29:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-54mVu1Dg77tix4KuPI8kng.jar
Nov 15, 2018 12:29:35 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 15, 2018 12:29:36 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 15, 2018 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6559

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 1.48 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1034

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 246.32 KB...]
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-Kohn-9hft0FIQ03bMTyfSg.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
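The stack trace above ends in a GCS "rateLimitExceeded" error caused by repeatedly rewriting the same staged jar. A minimal sketch of the usual mitigation, client-side retry with capped exponential backoff and jitter, is shown below. This is a hypothetical illustration, not Beam's actual PackageUtil retry logic; `RateLimitError` and `retry_with_backoff` are names invented for the example.

```python
import random
import time


class RateLimitError(Exception):
    """Stand-in for an HTTP 429 'rateLimitExceeded' response from GCS."""


def retry_with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Call fn(), retrying on RateLimitError with exponential backoff.

    The delay doubles each attempt (capped at 32s) and a small random
    jitter is added so concurrent workers do not retry in lockstep.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error to the caller
            delay = min(base_delay * (2 ** attempt), 32.0)
            time.sleep(delay + random.uniform(0, base_delay))
```

In a staging loop, each upload call would be wrapped as `retry_with_backoff(lambda: stage(jar))`, which spreads out the create/update requests that the error message asks callers to reduce.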

Nov 15, 2018 12:04:30 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-Kohn-9hft0FIQ03bMTyfSg.jar
Nov 15, 2018 12:04:34 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 15, 2018 12:04:34 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #96

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 18.28 MB...]
INFO: 2018-11-15T02:22:13.877Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify into 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:13.913Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write into 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:13.949Z: Fusing consumer 
Create123/Read(CreateSource)/ParDo(ReadFromBoundedSource) into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:13.986Z: Fusing consumer 
PAssert$27/GroupGlobally/Window.Into()/Window.Assign into OutputSideInputs
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.023Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.056Z: Fusing consumer 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/GroupByWindow
 into 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.092Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.129Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.153Z: Fusing consumer 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.187Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.210Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.242Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.269Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.291Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 15, 2018 2:22:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T02:22:14.329Z: Fusing consumer OutputSideInputs into 
Create123/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Nov 15, 2018 2:22:15 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1037

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 236.85 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-e6SXQewxDkFMcQsxHMZHOA.jar
Nov 15, 2018 3:35:13 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-e6SXQewxDkFMcQsxHMZHOA.jar
Nov 15, 2018 3:35:13 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-e6SXQewxDkFMcQsxHMZHOA.jar
Nov 15, 2018 3:35:13 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-e6SXQewxDkFMcQsxHMZHOA.jar
Nov 15, 2018 3:35:13 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-e6SXQewxDkFMcQsxHMZHOA.jar
Nov 15, 2018 3:35:13 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-e6SXQewxDkFMcQsxHMZHOA.jar
Nov 15, 2018 3:35:17 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 15, 2018 3:35:17 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Build failed in Jenkins: beam_PostCommit_Go_GradleBuild #1678

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 669.55 KB...]
{14: int/int[varintz] GLO}
Edges: 1: Impulse [] -> [Out: []uint8 -> {1: []uint8/bytes GLO}]
2: ParDo [In(Main): []uint8 <- {1: []uint8/bytes GLO}] -> [Out: T -> {2: 
int/int[varintz] GLO}]
3: ParDo [In(Main): int <- {2: int/int[varintz] GLO}] -> [Out: int -> {3: 
int/int[varintz] GLO} Out: int -> {4: int/int[varintz] GLO} Out: int -> {5: 
int/int[varintz] GLO} Out: int -> {6: int/int[varintz] GLO} Out: int -> {7: 
int/int[varintz] GLO} Out: int -> {8: int/int[varintz] GLO} Out: int -> {9: 
int/int[varintz] GLO}]
4: Flatten [In(Main): int <- {3: int/int[varintz] GLO} In(Main): int <- {4: 
int/int[varintz] GLO} In(Main): int <- {5: int/int[varintz] GLO} In(Main): int 
<- {6: int/int[varintz] GLO} In(Main): int <- {7: int/int[varintz] GLO} 
In(Main): int <- {8: int/int[varintz] GLO} In(Main): int <- {9: 
int/int[varintz] GLO}] -> [Out: int -> {10: int/int[varintz] GLO}]
5: Impulse [] -> [Out: []uint8 -> {11: []uint8/bytes GLO}]
6: ParDo [In(Main): []uint8 <- {11: []uint8/bytes GLO} In(Iter): T <- {10: 
int/int[varintz] GLO} In(Iter): T <- {2: int/int[varintz] GLO}] -> [Out: T -> 
{12: int/int[varintz] GLO} Out: T -> {13: int/int[varintz] GLO} Out: T -> {14: 
int/int[varintz] GLO}]
7: ParDo [In(Main): X <- {12: int/int[varintz] GLO}] -> []
8: ParDo [In(Main): X <- {14: int/int[varintz] GLO}] -> []
2018/11/15 03:48:00 Plan[plan]:
12: Impulse[0]
13: Impulse[0]
1: ParDo[passert.failFn] Out:[]
2: Discard
3: ParDo[passert.failFn] Out:[]
4: ParDo[passert.diffFn] Out:[1 2 3]
5: wait[2] Out:4
6: buffer[6]. wait:5 Out:4
7: buffer[7]. wait:5 Out:4
8: Flatten[7]. Out:buffer[6]. wait:5 Out:4
9: ParDo[beam.partitionFn] Out:[8 8 8 8 8 8 8]
10: Multiplex. Out:[9 7]
11: ParDo[beam.createFn] Out:[10]
2018/11/15 03:48:00 wait[5] unblocked w/ 1 [false]
2018/11/15 03:48:00 wait[5] done
--- PASS: TestPartitionFlattenIdentity (0.00s)
=== RUN   Example_metricsDeclaredAnywhere
--- PASS: Example_metricsDeclaredAnywhere (0.00s)
=== RUN   Example_metricsReusable
--- PASS: Example_metricsReusable (0.00s)
PASS
coverage: 44.8% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam 0.016s  coverage: 44.8% of 
statements
=== RUN   TestOptions
--- PASS: TestOptions (0.00s)
=== RUN   TestKey
--- PASS: TestKey (0.00s)
=== RUN   TestRegister
--- PASS: TestRegister (0.00s)
PASS
coverage: 48.5% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/runtime0.002s  
coverage: 48.5% of statements
=== RUN   TestMergeMaps
--- PASS: TestMergeMaps (0.00s)
=== RUN   TestShallowClone
--- PASS: TestShallowClone (0.00s)
=== RUN   TestShallowCloneNil
--- PASS: TestShallowCloneNil (0.00s)
PASS
coverage: 6.4% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx  0.003s  
coverage: 6.4% of statements

> Task :beam-sdks-go:test
Test for github.com/apache/beam/sdks/go/pkg/beam finished, 7 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/runtime:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/runtime finished, 3 
completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx finished, 3 
completed, 0 failed
Generating HTML test report...
Finished generating test html results (0.138 secs) into: 

Invalidating in-memory cache of 


> Task :beam-sdks-go:test FAILED
:beam-sdks-go:test (Thread[Task worker for ':' Thread 11,5,main]) completed. 
Took 27.115 secs.
:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 11,5,main]) 
started.

> Task :beam-sdks-go-container:prepare
Caching disabled for task ':beam-sdks-go-container:prepare': Caching has not 
been enabled for the task
Task ':beam-sdks-go-container:prepare' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Use project GOPATH: 

:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 11,5,main]) 
completed. Took 0.002 secs.
:beam-sdks-go-container:resolveBuildDependencies (Thread[Task worker for ':' 
Thread 11,5,main]) started.

> Task :beam-sdks-go-container:resolveBuildDependencies UP-TO-DATE
Build cache key for task ':beam-sdks-go-container:resolveBuildDependencies' is 
ee37f5260adc7cd3ce2d7ead07ac1ecc
Caching disabled for task ':beam-sdks-go-container:resolveBuildDependencies': 
Caching has not been enabled for the task
Skipping task 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #97

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[adrianwit] [BEAM-5729] added database/sql based reader and writer

--
[...truncated 18.27 MB...]
INFO: 2018-11-15T04:43:06.798Z: Fusing consumer 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/GroupByWindow
 into 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:06.828Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:06.853Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:06.887Z: Fusing consumer 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:06.910Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:06.946Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:06.982Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.016Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.054Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.089Z: Fusing consumer OutputSideInputs into 
Create123/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.125Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
 with random key
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.161Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
 with random key into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.194Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
 into PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Impulse
Nov 15, 2018 4:43:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T04:43:07.230Z: Fusing consumer 
PAssert$27/GroupGlobally/KeyForDummy/AddKeys/Map into 
PAssert$27/GroupGlobally/RewindowActuals/Window.Assign
Nov 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #759

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 4.05 MB...]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:46377 was granted leadership with 
leaderSessionID=48c94024-b16b-4284-bb5a-88d3a60b5e5e
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:46377 , 
session=48c94024-b16b-4284-bb5a-88d3a60b5e5e
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcher620cfdbd-1a4c-41c3-85b3-8540bc4ba1d7 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@61ac0f8e @ 
akka://flink/user/dispatcher620cfdbd-1a4c-41c3-85b3-8540bc4ba1d7
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcher620cfdbd-1a4c-41c3-85b3-8540bc4ba1d7 was granted 
leadership with fencing token 646cee62-432d-450b-ba81-c66691b40136
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcher620cfdbd-1a4c-41c3-85b3-8540bc4ba1d7 , 
session=646cee62-432d-450b-ba81-c66691b40136
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
cbd9a4f836e5fe19a1bf2b84a993f9e3 (test_windowing_1542262288.77).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_39 
.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542262288.77 (cbd9a4f836e5fe19a1bf2b84a993f9e3).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542262288.77 
(cbd9a4f836e5fe19a1bf2b84a993f9e3).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/7aca4bc3-131c-497c-bf55-7aa1efe9d5b6 .
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542262288.77 (cbd9a4f836e5fe19a1bf2b84a993f9e3).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@49ff4d10 @ 
akka://flink/user/jobmanager_39
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1542262288.77 (cbd9a4f836e5fe19a1bf2b84a993f9e3) was granted 
leadership with session id da151595-3376-4eb7-9071-3d92df0a2fe0 at 
akka://flink/user/jobmanager_39.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1542262288.77 (cbd9a4f836e5fe19a1bf2b84a993f9e3)
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1542262288.77 (cbd9a4f836e5fe19a1bf2b84a993f9e3) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 14Create/Impulse.None/beam:env:docker:v1:0 -> ToKeyedWorkItem (1/1) 
(b6f2bc061606bf681259c63e7f8942fd) switched from 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1038

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 262.36 KB...]
"message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-KfizwJMY2Z3NUmKlH8LMGQ.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-KfizwJMY2Z3NUmKlH8LMGQ.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 15, 2018 6:03:56 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-KfizwJMY2Z3NUmKlH8LMGQ.jar
Nov 15, 2018 6:03:56 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-KfizwJMY2Z3NUmKlH8LMGQ.jar
Nov 15, 2018 6:04:00 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 15, 2018 6:04:00 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Build failed in Jenkins: beam_PostCommit_Go_GradleBuild #1679

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 674.79 KB...]
{12: int/int[varintz] GLO}
{13: int/int[varintz] GLO}
{14: int/int[varintz] GLO}
Edges: 1: Impulse [] -> [Out: []uint8 -> {1: []uint8/bytes GLO}]
2: ParDo [In(Main): []uint8 <- {1: []uint8/bytes GLO}] -> [Out: T -> {2: 
int/int[varintz] GLO}]
3: ParDo [In(Main): int <- {2: int/int[varintz] GLO}] -> [Out: int -> {3: 
int/int[varintz] GLO} Out: int -> {4: int/int[varintz] GLO} Out: int -> {5: 
int/int[varintz] GLO} Out: int -> {6: int/int[varintz] GLO} Out: int -> {7: 
int/int[varintz] GLO} Out: int -> {8: int/int[varintz] GLO} Out: int -> {9: 
int/int[varintz] GLO}]
4: Flatten [In(Main): int <- {3: int/int[varintz] GLO} In(Main): int <- {4: 
int/int[varintz] GLO} In(Main): int <- {5: int/int[varintz] GLO} In(Main): int 
<- {6: int/int[varintz] GLO} In(Main): int <- {7: int/int[varintz] GLO} 
In(Main): int <- {8: int/int[varintz] GLO} In(Main): int <- {9: 
int/int[varintz] GLO}] -> [Out: int -> {10: int/int[varintz] GLO}]
5: Impulse [] -> [Out: []uint8 -> {11: []uint8/bytes GLO}]
6: ParDo [In(Main): []uint8 <- {11: []uint8/bytes GLO} In(Iter): T <- {10: 
int/int[varintz] GLO} In(Iter): T <- {2: int/int[varintz] GLO}] -> [Out: T -> 
{12: int/int[varintz] GLO} Out: T -> {13: int/int[varintz] GLO} Out: T -> {14: 
int/int[varintz] GLO}]
7: ParDo [In(Main): X <- {12: int/int[varintz] GLO}] -> []
8: ParDo [In(Main): X <- {14: int/int[varintz] GLO}] -> []
2018/11/15 06:11:24 Plan[plan]:
12: Impulse[0]
13: Impulse[0]
1: ParDo[passert.failFn] Out:[]
2: Discard
3: ParDo[passert.failFn] Out:[]
4: ParDo[passert.diffFn] Out:[1 2 3]
5: wait[2] Out:4
6: buffer[6]. wait:5 Out:4
7: buffer[7]. wait:5 Out:4
8: Flatten[7]. Out:buffer[6]. wait:5 Out:4
9: ParDo[beam.partitionFn] Out:[8 8 8 8 8 8 8]
10: Multiplex. Out:[9 7]
11: ParDo[beam.createFn] Out:[10]
2018/11/15 06:11:24 wait[5] unblocked w/ 1 [false]
2018/11/15 06:11:24 wait[5] done
--- PASS: TestPartitionFlattenIdentity (0.00s)
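The passing `TestPartitionFlattenIdentity` above exercises exactly what the plan shows: a 7-way `Partition` (via `beam.partitionFn`) followed by a `Flatten` should reproduce the input collection. A minimal plain-Python sketch of that invariant (not the Go SDK's implementation):

```python
def partition(elements, n, fn):
    """Split elements into n buckets chosen by fn, like beam.Partition."""
    buckets = [[] for _ in range(n)]
    for e in elements:
        buckets[fn(e, n)].append(e)
    return buckets


def flatten(buckets):
    """Merge the buckets back into one collection, like beam.Flatten."""
    return [e for bucket in buckets for e in bucket]


data = [5, 1, 9, 2, 7, 3, 8]
out = flatten(partition(data, 7, lambda e, n: e % n))
# Partition followed by Flatten preserves the elements (order aside).
assert sorted(out) == sorted(data)
```

Element order is not preserved (Flatten concatenates bucket by bucket), which is why the test compares contents rather than sequences.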
=== RUN   Example_metricsDeclaredAnywhere
--- PASS: Example_metricsDeclaredAnywhere (0.00s)
=== RUN   Example_metricsReusable
--- PASS: Example_metricsReusable (0.00s)
PASS
coverage: 44.8% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam 0.014s  coverage: 44.8% of 
statements
=== RUN   TestOptions
--- PASS: TestOptions (0.00s)
=== RUN   TestKey
--- PASS: TestKey (0.00s)
=== RUN   TestRegister
--- PASS: TestRegister (0.00s)
PASS
coverage: 48.5% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/runtime  0.003s  
coverage: 48.5% of statements
=== RUN   TestMergeMaps
--- PASS: TestMergeMaps (0.00s)
=== RUN   TestShallowClone
--- PASS: TestShallowClone (0.00s)
=== RUN   TestShallowCloneNil
--- PASS: TestShallowCloneNil (0.00s)
PASS
coverage: 6.4% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx  0.003s  
coverage: 6.4% of statements

> Task :beam-sdks-go:test FAILED
Test for github.com/apache/beam/sdks/go/pkg/beam finished, 7 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/runtime:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/runtime finished, 3 
completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx finished, 3 
completed, 0 failed
Generating HTML test report...
Finished generating test html results (0.136 secs) into: 

Invalidating in-memory cache of 

:beam-sdks-go:test (Thread[Task worker for ':' Thread 11,5,main]) completed. 
Took 26.719 secs.
:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 11,5,main]) 
started.

> Task :beam-sdks-go-container:prepare
Caching disabled for task ':beam-sdks-go-container:prepare': Caching has not 
been enabled for the task
Task ':beam-sdks-go-container:prepare' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Use project GOPATH: 

:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 11,5,main]) 
completed. Took 0.001 secs.
:beam-sdks-go-container:resolveBuildDependencies (Thread[Task worker for ':' 
Thread 11,5,main]) started.

> Task :beam-sdks-go-container:resolveBuildDependencies UP-TO-DATE
Build cache key for task ':beam-sdks-go-container:resolveBuildDependencies' is 
ee37f5260adc7cd3ce2d7ead07ac1ecc
Caching disabled for task ':beam-sdks-go-container:resolveBuildDependencies': 
Caching has not been enabled for the task
Skipping task ':beam-sdks-go-container:resolveBuildDependencies' as it is 
up-to-date.

Build failed in Jenkins: beam_PostCommit_Go_GradleBuild #1677

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[robinyqiu] Revert PR #6903 because it is breaking test error reporting

--
[...truncated 669.34 KB...]
3: ParDo [In(Main): int <- {2: int/int[varintz] GLO}] -> [Out: int -> {3: 
int/int[varintz] GLO} Out: int -> {4: int/int[varintz] GLO} Out: int -> {5: 
int/int[varintz] GLO} Out: int -> {6: int/int[varintz] GLO} Out: int -> {7: 
int/int[varintz] GLO} Out: int -> {8: int/int[varintz] GLO} Out: int -> {9: 
int/int[varintz] GLO}]
4: Flatten [In(Main): int <- {3: int/int[varintz] GLO} In(Main): int <- {4: 
int/int[varintz] GLO} In(Main): int <- {5: int/int[varintz] GLO} In(Main): int 
<- {6: int/int[varintz] GLO} In(Main): int <- {7: int/int[varintz] GLO} 
In(Main): int <- {8: int/int[varintz] GLO} In(Main): int <- {9: 
int/int[varintz] GLO}] -> [Out: int -> {10: int/int[varintz] GLO}]
5: Impulse [] -> [Out: []uint8 -> {11: []uint8/bytes GLO}]
6: ParDo [In(Main): []uint8 <- {11: []uint8/bytes GLO} In(Iter): T <- {10: 
int/int[varintz] GLO} In(Iter): T <- {2: int/int[varintz] GLO}] -> [Out: T -> 
{12: int/int[varintz] GLO} Out: T -> {13: int/int[varintz] GLO} Out: T -> {14: 
int/int[varintz] GLO}]
7: ParDo [In(Main): X <- {12: int/int[varintz] GLO}] -> []
8: ParDo [In(Main): X <- {14: int/int[varintz] GLO}] -> []
2018/11/15 03:35:36 Plan[plan]:
12: Impulse[0]
13: Impulse[0]
1: ParDo[passert.failFn] Out:[]
2: Discard
3: ParDo[passert.failFn] Out:[]
4: ParDo[passert.diffFn] Out:[1 2 3]
5: wait[2] Out:4
6: buffer[6]. wait:5 Out:4
7: buffer[7]. wait:5 Out:4
8: Flatten[7]. Out:buffer[6]. wait:5 Out:4
9: ParDo[beam.partitionFn] Out:[8 8 8 8 8 8 8]
10: Multiplex. Out:[9 7]
11: ParDo[beam.createFn] Out:[10]
2018/11/15 03:35:36 wait[5] unblocked w/ 1 [false]
2018/11/15 03:35:36 wait[5] done
--- PASS: TestPartitionFlattenIdentity (0.00s)
=== RUN   Example_metricsDeclaredAnywhere
--- PASS: Example_metricsDeclaredAnywhere (0.00s)
=== RUN   Example_metricsReusable
--- PASS: Example_metricsReusable (0.00s)
PASS
coverage: 44.8% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam 0.016s  coverage: 44.8% of 
statements
=== RUN   TestOptions
--- PASS: TestOptions (0.00s)
=== RUN   TestKey
--- PASS: TestKey (0.00s)
=== RUN   TestRegister
--- PASS: TestRegister (0.00s)
PASS
coverage: 48.5% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/runtime  0.003s  
coverage: 48.5% of statements

> Task :beam-sdks-go:test
Test for github.com/apache/beam/sdks/go/pkg/beam/core/runtime/harness finished, 
1 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam:
Test for github.com/apache/beam/sdks/go/pkg/beam finished, 7 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/runtime:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/runtime finished, 3 
completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx:

=== RUN   TestMergeMaps
--- PASS: TestMergeMaps (0.00s)
=== RUN   TestShallowClone
--- PASS: TestShallowClone (0.00s)
=== RUN   TestShallowCloneNil
--- PASS: TestShallowCloneNil (0.00s)
PASS
coverage: 6.4% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx  0.004s  
coverage: 6.4% of statements

> Task :beam-sdks-go:test FAILED
Test for github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx finished, 3 
completed, 0 failed
Generating HTML test report...
Finished generating test html results (0.16 secs) into: 

Invalidating in-memory cache of 

:beam-sdks-go:test (Thread[Task worker for ':' Thread 10,5,main]) completed. 
Took 29.414 secs.
:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 10,5,main]) 
started.

> Task :beam-sdks-go-container:prepare
Caching disabled for task ':beam-sdks-go-container:prepare': Caching has not 
been enabled for the task
Task ':beam-sdks-go-container:prepare' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Use project GOPATH: 

:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 10,5,main]) 
completed. Took 0.003 secs.
:beam-sdks-go-container:resolveBuildDependencies (Thread[Task worker for ':' 
Thread 10,5,main]) started.

> Task :beam-sdks-go-container:resolveBuildDependencies UP-TO-DATE
Build cache key for task ':beam-sdks-go-container:resolveBuildDependencies' is 
ee37f5260adc7cd3ce2d7ead07ac1ecc
Caching disabled for task ':beam-sdks-go-container:resolveBuildDependencies': 
Caching has not been enabled for the task
Skipping task ':beam-sdks-go-container:resolveBuildDependencies' as it is 
up-to-date.

Build failed in Jenkins: beam_PreCommit_Java_Cron #589

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[adrianwit] [BEAM-5729] added database/sql based reader and writer

[robinyqiu] Revert PR #6903 because it is breaking test error reporting

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 52.92 MB...]
  BeamAggregationRel(group=[{0, 1}], num=[COUNT()], 
window=[SlidingWindows($1, PT5S, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], $f1=[$t3])
  BeamIOSourceRel(table=[[beam, Bid]])
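The aggregation plan above uses `SlidingWindows($1, PT5S, PT10S, PT0S)`: windows of 10-second size starting every 5 seconds, so each element lands in two overlapping windows. A minimal sketch of that window assignment, assuming integer-second timestamps (not the runner's actual implementation):

```python
def sliding_windows(ts, size=10, period=5):
    """Return the starts of all sliding windows containing timestamp ts.

    Mirrors SlidingWindows with size PT10S and period PT5S: a window
    [start, start + size) contains ts iff start <= ts < start + size.
    """
    latest_start = (ts // period) * period  # last window starting at or before ts
    starts = []
    s = latest_start
    while s > ts - size:  # walk back through every window still covering ts
        starts.append(s)
        s -= period
    return sorted(starts)


# A bid at t=7s falls in windows [0,10) and [5,15).
print(sliding_windows(7))  # prints [0, 5]
```

With size twice the period, every element is counted in exactly two windows, which is what makes the `COUNT()` aggregation above a sliding (overlapping) count.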


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery3Test > 
testJoinsPeopleWithAuctions STANDARD_ERROR
Nov 15, 2018 6:16:30 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `P`.`name`, `P`.`city`, `P`.`state`, `A`.`id`
FROM `beam`.`Auction` AS `A`
INNER JOIN `beam`.`Person` AS `P` ON `A`.`seller` = `P`.`id`
WHERE `A`.`category` = 10 AND (`P`.`state` = 'OR' OR `P`.`state` = 'ID' OR 
`P`.`state` = 'CA')
Nov 15, 2018 6:16:30 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(name=[$11], city=[$14], state=[$15], id=[$0])
  LogicalFilter(condition=[AND(=($8, 10), OR(=($15, 'OR'), =($15, 'ID'), 
=($15, 'CA')))])
LogicalJoin(condition=[=($7, $10)], joinType=[inner])
  BeamIOSourceRel(table=[[beam, Auction]])
  BeamIOSourceRel(table=[[beam, Person]])

Nov 15, 2018 6:16:30 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..17=[{inputs}], name=[$t11], city=[$t14], state=[$t15], 
id=[$t0])
  BeamJoinRel(condition=[=($7, $10)], joinType=[inner])
BeamCalcRel(expr#0..9=[{inputs}], expr#10=[10], expr#11=[=($t8, $t10)], 
proj#0..9=[{exprs}], $condition=[$t11])
  BeamIOSourceRel(table=[[beam, Auction]])
BeamCalcRel(expr#0..7=[{inputs}], expr#8=['OR'], expr#9=[=($t5, $t8)], 
expr#10=['ID'], expr#11=[=($t5, $t10)], expr#12=['CA'], expr#13=[=($t5, $t12)], 
expr#14=[OR($t9, $t11, $t13)], proj#0..7=[{exprs}], $condition=[$t14])
  BeamIOSourceRel(table=[[beam, Person]])
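The plan above pushes both filters below the join: category 10 is applied to `Auction` and the state predicate to `Person` before `BeamJoinRel` runs. A plain-Python sketch of the same NEXMark Query3 semantics (dict-based records are assumptions for illustration; the real query is the Beam SQL shown above):

```python
def query3(auctions, people):
    """Join category-10 auctions to sellers in OR/ID/CA.

    auctions: dicts with id, seller, category.
    people:   dicts with id, name, city, state.
    """
    people_by_id = {p["id"]: p for p in people}
    results = []
    for a in auctions:
        if a["category"] != 10:          # filter pushed to the Auction side
            continue
        p = people_by_id.get(a["seller"])  # inner join on seller = person id
        if p and p["state"] in ("OR", "ID", "CA"):  # filter on the Person side
            results.append((p["name"], p["city"], p["state"], a["id"]))
    return results
```

Filtering each input before joining, as the optimizer did here, shrinks the join inputs; the hash-map lookup plays the role of the equi-join condition `A.seller = P.id`.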


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery7Test > testBids STANDARD_ERROR
Nov 15, 2018 6:16:30 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`
FROM (SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE_START(`B`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B`
GROUP BY `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE(`B`.`dateTime`, INTERVAL '10' SECOND)) AS `B`
INNER JOIN (SELECT MAX(`B1`.`price`) AS `maxprice`, 
TUMBLE_START(`B1`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B1`
GROUP BY TUMBLE(`B1`.`dateTime`, INTERVAL '10' SECOND)) AS `B1` ON 
`B`.`starttime` = `B1`.`starttime` AND `B`.`price` = `B1`.`maxprice`
Nov 15, 2018 6:16:30 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], 
extra=[$4])
  LogicalJoin(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], 
extra=[$4], starttime=[$5])
  LogicalAggregate(group=[{0, 1, 2, 3, 4, 5}])
LogicalProject(auction=[$0], price=[$2], bidder=[$1], 
dateTime=[$3], extra=[$4], $f5=[TUMBLE($3, 1)])
  BeamIOSourceRel(table=[[beam, Bid]])
LogicalProject(maxprice=[$1], starttime=[$0])
  LogicalAggregate(group=[{0}], maxprice=[MAX($1)])
LogicalProject($f0=[TUMBLE($3, 1)], price=[$2])
  BeamIOSourceRel(table=[[beam, Bid]])

Nov 15, 2018 6:16:30 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..7=[{inputs}], proj#0..4=[{exprs}])
  BeamJoinRel(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
BeamCalcRel(expr#0..5=[{inputs}], proj#0..5=[{exprs}])
  BeamAggregationRel(group=[{0, 1, 2, 3, 4, 5}], 
window=[FixedWindows($5, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], price=[$t2], 
bidder=[$t1], dateTime=[$t3], extra=[$t4], $f5=[$t3])
  BeamIOSourceRel(table=[[beam, Bid]])
BeamCalcRel(expr#0..1=[{inputs}], maxprice=[$t1], starttime=[$t0])
  BeamAggregationRel(group=[{0}], maxprice=[MAX($1)], 
window=[FixedWindows($0, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], $f0=[$t3], price=[$t2])
  BeamIOSourceRel(table=[[beam, Bid]])
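The Query7 plan above computes the maximum bid price per 10-second tumbling window, then joins that maximum back to the bids to recover the winning bid rows. A compact sketch of those semantics, using `(auction, price, timestamp)` tuples as an assumed record shape (the real query runs as a windowed aggregation plus join in Beam SQL):

```python
from collections import defaultdict


def tumbling_window_start(ts, size=10):
    """Map a timestamp to the start of its fixed (tumbling) window."""
    return (ts // size) * size


def highest_bids(bids, size=10):
    """Return the bids whose price equals the max in their window.

    bids: list of (auction, price, timestamp) tuples.
    """
    # First pass: per-window maximum price (the MAX aggregation side).
    max_price = defaultdict(int)
    for _, price, ts in bids:
        w = tumbling_window_start(ts, size)
        max_price[w] = max(max_price[w], price)
    # Second pass: join bids back against their window's maximum.
    return [b for b in bids
            if b[1] == max_price[tumbling_window_start(b[2], size)]]


bids = [(1, 30, 2), (2, 50, 7), (3, 20, 12), (4, 25, 15)]
# Window [0,10): max price 50 (bid 2); window [10,20): max 25 (bid 4).
print(highest_bids(bids))  # prints [(2, 50, 7), (4, 25, 15)]
```

The two passes correspond to the two `BeamAggregationRel`/`BeamJoinRel` branches in the plan: one branch aggregates, the other carries the full bid rows to be matched on `(starttime, price)`.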


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery2Test > 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #758

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 4.04 MB...]
[flink-runner-job-server] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Failed to load web 
based job submission extension. Probable reason: flink-runtime-web is not in 
the classpath.
[flink-runner-job-server] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint 
listening at localhost:46769
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@3fa34199 @ 
http://localhost:46769
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:46769 was granted leadership with 
leaderSessionID=df769912-5af6-4133-a9be-2be613c3942a
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:46769 , 
session=df769912-5af6-4133-a9be-2be613c3942a
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManager
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcher4904aca1-120a-44be-9ea4-133d9b8ca63f .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@3cc2753 @ 
akka://flink/user/dispatcher4904aca1-120a-44be-9ea4-133d9b8ca63f
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcher4904aca1-120a-44be-9ea4-133d9b8ca63f was granted 
leadership with fencing token 8c8b7384-302b-4f21-a641-0cd430fbb23c
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcher4904aca1-120a-44be-9ea4-133d9b8ca63f , 
session=8c8b7384-302b-4f21-a641-0cd430fbb23c
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
6407cad1d2313ed84a919690b5d0d6c2 (test_windowing_1542253455.87).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_39 
.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542253455.87 (6407cad1d2313ed84a919690b5d0d6c2).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542253455.87 
(6407cad1d2313ed84a919690b5d0d6c2).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/3f609833-cbcc-4c80-b211-94df5a6ac690 .
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542253455.87 (6407cad1d2313ed84a919690b5d0d6c2).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@5c942927 @ 
akka://flink/user/jobmanager_39
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6560

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[adrianwit] [BEAM-5729] added database/sql based reader and writer

--
[...truncated 1.63 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1036

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[robinyqiu] Revert PR #6903 because it is breaking test error reporting

--
[...truncated 235.36 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:29 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-ykGg69D06i06URRVGBDDfQ.jar
Nov 15, 2018 3:25:33 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 15, 2018 3:25:33 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 15, 2018 3:25:33 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 15, 2018 3:25:33 AM 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6562

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 1.47 MB...]
test_checksum (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_directory (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_copy_directory_overwrite_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_copy_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_file_overwrite_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_create_success (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_create_write_read_compressed 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_delete_dir (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
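The "SKIP: GCP dependencies are not installed" lines above come from tests that guard an optional import and skip themselves when it is absent. A minimal sketch of that pattern (the class and test names here are illustrative, not Beam's exact code):

```python
import unittest

# Guarded optional import: when the GCP extra is not installed, record the
# absence instead of failing at import time.
try:
    from google.cloud import bigquery  # optional dependency
except ImportError:
    bigquery = None


# The skip reason string is what shows up after "SKIP:" in the test log.
@unittest.skipIf(bigquery is None, 'GCP dependencies are not installed')
class OptionalDepTest(unittest.TestCase):
    def test_reader_requires_gcp(self):
        # Only runs when the optional dependency imported successfully.
        self.assertIsNotNone(bigquery)
```

Either way the run succeeds: the test passes when the dependency is present and is reported as skipped (not failed) when it is not.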

Build failed in Jenkins: beam_PreCommit_Python_Cron #591

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[adrianwit] [BEAM-5729] added database/sql based reader and writer

[robinyqiu] Revert PR #6903 because it is breaking test error reporting

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 1.16 MB...]
test_checksum (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_directory (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_copy_directory_overwrite_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_copy_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_file_overwrite_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_create_success (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_create_write_read_compressed 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_delete_dir (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #98

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[robinyqiu] Revert PR #6903 because it is breaking test error reporting

[lcwik] [BEAM-6067] Specify pipeline_coder_id property in non-Beam-standard

--
[...truncated 18.23 MB...]
INFO: 2018-11-15T07:01:04.642Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify into 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.700Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write into 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.755Z: Fusing consumer 
Create123/Read(CreateSource)/ParDo(ReadFromBoundedSource) into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.799Z: Fusing consumer 
PAssert$27/GroupGlobally/Window.Into()/Window.Assign into OutputSideInputs
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.841Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.881Z: Fusing consumer 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/GroupByWindow
 into 
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.923Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.959Z: Fusing consumer 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write 
into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:04.990Z: Fusing consumer 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:05.028Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:05.060Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:05.103Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:05.172Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 15, 2018 7:01:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-15T07:01:05.244Z: Fusing consumer 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
 into 
PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 15, 2018 7:01:09 AM 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6561

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[robinyqiu] Revert PR #6903 because it is breaking test error reporting

--
[...truncated 1.57 MB...]
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_project_table_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_simple_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_table_spec_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_date_partitioned_table_name 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed

Build failed in Jenkins: beam_PostCommit_Python_Verify #6563

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 1.47 MB...]
test_checksum (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_directory (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_copy_directory_overwrite_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_copy_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_copy_file_overwrite_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_create_success (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_create_write_read_compressed 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_delete_dir (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1734

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 297.85 KB...]
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: 
root: INFO: Created job with id: [2018-11-14_22_01_13-4155478926902841098]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_22_01_13-4155478926902841098?project=apache-beam-testing
root: INFO: Job 2018-11-14_22_01_13-4155478926902841098 is in state 
JOB_STATE_RUNNING
root: INFO: 2018-11-15T06:01:13.642Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2018-11-14_22_01_13-4155478926902841098. The number of workers 
will be between 1 and 1000.
root: INFO: 2018-11-15T06:01:13.689Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2018-11-14_22_01_13-4155478926902841098.
root: INFO: 2018-11-15T06:01:18.138Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2018-11-15T06:01:19.733Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-b.
root: INFO: 2018-11-15T06:01:20.384Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2018-11-15T06:01:20.434Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2018-11-15T06:01:20.484Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-11-15T06:01:20.526Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2018-11-15T06:01:20.595Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2018-11-15T06:01:20.644Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ParDo(CounterDoFn) into Read
root: INFO: 2018-11-15T06:01:20.694Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2018-11-15T06:01:20.750Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2018-11-15T06:01:20.791Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2018-11-15T06:01:20.838Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-11-15T06:01:21.090Z: JOB_MESSAGE_DEBUG: Executing wait step 
start3
root: INFO: 2018-11-15T06:01:21.233Z: JOB_MESSAGE_BASIC: Executing operation 
Read+ParDo(CounterDoFn)
root: INFO: 2018-11-15T06:01:21.295Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2018-11-15T06:01:21.345Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
root: INFO: 2018-11-15T06:01:43.064Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 0 based on the rate of progress in the currently 
running step(s).
root: INFO: 2018-11-15T06:02:29.357Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 1 based on the rate of progress in the currently 
running step(s).
root: INFO: 2018-11-15T06:04:17.224Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2018-11-15T06:04:17.279Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
root: INFO: 2018-11-15T07:01:21.156Z: JOB_MESSAGE_ERROR: Workflow failed. 
Causes: The Dataflow job appears to be stuck because no worker activity has 
been seen in the last 1h. You can get help with Cloud Dataflow at 
https://cloud.google.com/dataflow/support.
root: INFO: 2018-11-15T07:01:21.307Z: JOB_MESSAGE_BASIC: Cancel request is 
committed for workflow job: 2018-11-14_22_01_13-4155478926902841098.
root: INFO: 2018-11-15T07:01:21.398Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-11-15T07:01:21.475Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2018-11-15T07:01:21.509Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-11-15T07:03:43.389Z: JOB_MESSAGE_DETAILED: Autoscaling: 
Reduced the number of workers to 0 based on the rate of progress in the 
currently running step(s).
root: INFO: 2018-11-15T07:03:43.640Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-11-15T07:03:43.767Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2018-11-14_22_01_13-4155478926902841098 is in state 
JOB_STATE_FAILED
- >> end captured logging << -
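The failure above ("no worker activity has been seen in the last 1h") is a watchdog: the service fails a job when an activity signal does not change within a deadline. A toy sketch of that kind of check, assuming a hypothetical `last_activity_fn` probe (this is not Dataflow's actual implementation):

```python
import time


def wait_for_activity(last_activity_fn, timeout_s, poll_s=0.001,
                      clock=time.monotonic):
    """Poll an activity probe; return True if it changes before timeout_s.

    A False return corresponds to the "job appears to be stuck" condition:
    the probe value never moved within the deadline.
    """
    start = clock()
    baseline = last_activity_fn()
    while clock() - start < timeout_s:
        if last_activity_fn() != baseline:
            return True  # activity observed, job is making progress
        time.sleep(poll_s)
    return False  # deadline elapsed with no change: declare the job stuck
```

In the real service the deadline is an hour and the probe is worker telemetry; the control flow is the same.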

--
XML: 

--
Ran 16 tests in 3775.961s

FAILED (errors=1)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_22_01_13-4155478926902841098?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_22_01_14-5683236171450396203?project=apache-beam-testing.
Found: 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1027

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-5813] wait for the refresh of ES indices in all test utils 
that do

--
[...truncated 249.92 KB...]
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 14, 2018 4:14:27 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-IaplrIV0PpCBLWCeIfGy1g.jar
Nov 14, 2018 4:14:30 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-IaplrIV0PpCBLWCeIfGy1g.jar
Nov 14, 2018 4:14:34 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 14, 2018 4:14:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 14, 2018 4:14:34 PM 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #748

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-5813] wait for the refresh of ES indices in all test utils 
that do

--
[...truncated 3.92 MB...]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:36127 , 
session=92a47059-832e-4b19-88f5-11e08af159f8
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcherab8030c6-3d71-4fd7-b794-3643b331679b .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@73e47934 @ 
akka://flink/user/dispatcherab8030c6-3d71-4fd7-b794-3643b331679b
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcherab8030c6-3d71-4fd7-b794-3643b331679b was granted 
leadership with fencing token bc5822e8-3075-4602-8934-e2d517981a8a
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcherab8030c6-3d71-4fd7-b794-3643b331679b , 
session=bc5822e8-3075-4602-8934-e2d517981a8a
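The EmbeddedLeaderService lines above trace a two-step handshake: the service proposes leadership to a contender with a session id, and the contender confirms that same session back. A toy sketch of the handshake (hypothetical class and method names, not Flink's API):

```python
import uuid


class EmbeddedLeaderService:
    """Grants leadership and waits for the contender to confirm it back."""

    def __init__(self):
        self.leader = None
        self.session = None

    def propose_leadership(self, contender):
        # "Proposing leadership to contender ..." in the log above.
        session = uuid.uuid4()
        contender.grant_leadership(self, session)
        return session

    def confirm_leadership(self, contender, session):
        # "Received confirmation of leadership ... session=..." in the log.
        self.leader, self.session = contender, session


class Contender:
    def __init__(self):
        self.session = None

    def grant_leadership(self, service, session):
        self.session = session
        service.confirm_leadership(self, session)  # echo the session back
```

Carrying the session id through both directions is what lets the service detect stale confirmations from a contender that was granted leadership in an earlier round.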
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
51f5cc3fcda3966434bec749d130d078 (test_windowing_1542212500.06).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_39 
.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542212500.06 (51f5cc3fcda3966434bec749d130d078).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542212500.06 
(51f5cc3fcda3966434bec749d130d078).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/d2785869-4b59-44a2-bf22-49cf47057193 .
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542212500.06 (51f5cc3fcda3966434bec749d130d078).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@74c9b15f @ 
akka://flink/user/jobmanager_39
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1542212500.06 (51f5cc3fcda3966434bec749d130d078) was granted 
leadership with session id 57a2b8d3-c05a-4e22-a671-11c9a5c5bc81 at 
akka://flink/user/jobmanager_39.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1542212500.06 (51f5cc3fcda3966434bec749d130d078)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1542212500.06 (51f5cc3fcda3966434bec749d130d078) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 14Create/Impulse.None/beam:env:docker:v1:0 -> ToKeyedWorkItem (1/1) 
(3b451cb57656a95b9c5981ab885ee0c0) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-4] INFO 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1032

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[altay] Convert Top combiner to a full PTransform

--
[...truncated 345.28 KB...]
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles.Snoop as step s5
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
JoinToFiles/JoinToFiles/View.AsMap/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)
 as step s6
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
JoinToFiles/JoinToFiles/View.AsMap/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s7
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
JoinToFiles/JoinToFiles/View.AsMap/ParMultiDo(ToIsmRecordForMapLike) as step s8
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/View.AsMap/GBKaSVForSize as step s9
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
JoinToFiles/JoinToFiles/View.AsMap/ParDo(ToIsmMetadataRecordForSize) as step s10
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/View.AsMap/GBKaSVForKeys as step s11
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
JoinToFiles/JoinToFiles/View.AsMap/ParDo(ToIsmMetadataRecordForKey) as step s12
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/View.AsMap/Flatten.PCollections as step s13
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/View.AsMap/CreateDataflowView as step s14
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/justBids/IsBid/ParDo(Anonymous) as step s15
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/justBids/AsBid as step s16
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles/JoinToFiles.JoinToFiles as step s17
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles.Debug as step s18
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles/JoinToFiles.Stamp as step s19
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles.Format as step s20
Nov 14, 2018 8:03:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles.DevNull as step s21
Nov 14, 2018 8:03:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/nexmark/staging/
Nov 14, 2018 8:03:39 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <58911 bytes, hash UMgQDEHpel4-QGk6xajYGQ> to 
gs://temp-storage-for-perf-tests/nexmark/staging/pipeline-UMgQDEHpel4-QGk6xajYGQ.pb
Dataflow SDK version: 2.9.0-SNAPSHOT
Nov 14, 2018 8:03:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
Submitted job: 2018-11-14_12_03_39-15367662654408494005
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_12_03_39-15367662654408494005?project=apache-beam-testing
Nov 14, 2018 8:03:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
> --region=us-central1 2018-11-14_12_03_39-15367662654408494005
Nov 14, 2018 8:03:42 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-mxOjcYQfrzMYL2s8cJL9RA.jar
Nov 14, 2018 8:03:43 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1728

2018-11-14 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1033

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[altay] fix lint error in userstate.py

--
[...truncated 234.39 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-fi_2dXL68r0FmSEBpxo71g.jar
Nov 14, 2018 8:13:21 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-fi_2dXL68r0FmSEBpxo71g.jar
Nov 14, 2018 8:13:21 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-fi_2dXL68r0FmSEBpxo71g.jar
Nov 14, 2018 8:13:21 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-fi_2dXL68r0FmSEBpxo71g.jar
Nov 14, 2018 8:13:21 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-fi_2dXL68r0FmSEBpxo71g.jar
Nov 14, 2018 8:13:26 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 14, 2018 8:13:27 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6556

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 1.64 MB...]
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_project_table_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_simple_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_table_spec_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_date_partitioned_table_name 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_display_data_item_on_validate_true 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_parse_table_reference 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_query_only_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_flattened_records 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_unflattened_records 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_without_table 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1727

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[mxm] ParDoTest: Check expected errors inline instead of after pipeline

[mxm] Add output collection id to timer specs.

[mxm] Move doFn wrapper initialization to open()

[mxm] [BEAM-6009] Add note about default mode for PortableValidatesRunner

[mxm] [BEAM-4681] Add support for portable timers in Flink streaming mode

--
[...truncated 270.83 KB...]
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: 

--
Ran 16 tests in 863.502s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_47-7067831804315221390?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_15_24-13141169015901902169?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_48-13452410540378718354?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_15_33-3129819376384485788?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_47-8716024428775838849?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_15_09-10846657339180189578?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_47-8904791275372151865?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_14_33-5545488340526491160?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_47-17248077356663773418?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_15_23-10434672553127907534?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_46-13293056256080094186?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_14_37-4861484282419668485?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_48-12155845279559484900?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_14_19-15846458283358409762?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_07_47-15081770956286081622?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-14_11_14_48-6609592423592496688?project=apache-beam-testing.
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for 
':',5,main]) completed. Took 14 mins 24.454 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for 
':',5,main]) started.

> Task :beam-sdks-python:validatesRunnerStreamingTests
Caching disabled for task ':beam-sdks-python:validatesRunnerStreamingTests': 
Caching has not been enabled for the task
Task ':beam-sdks-python:validatesRunnerStreamingTests' is not up-to-date 
because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: 

 Command: sh -c . 

 && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 
--process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming" --streaming 
true --worker_jar 

Successfully started process 'command 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1031

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[scott] Roll-forward "Merge pull request #6979: Add helper task to print

[scott] Share docker image name along with shared task

--
[...truncated 225.33 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-nexmark-2.9.0-SNAPSHOT-Oo8yPft4afGHuUtDa2cSag.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-io-google-cloud-platform-2.9.0-SNAPSHOT-08fkZxBCbhrZKnda1CysaQ.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-extensions-join-library-2.9.0-SNAPSHOT-8uant0oTtkHUou0BTAuzQw.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-extensions-sql-2.9.0-SNAPSHOT-1tSB0ETj22H6DiVjz5Noxw.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-extensions-protobuf-2.9.0-SNAPSHOT-IMRidWgQ-Xn6Um3eBvVRqw.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.9.0-SNAPSHOT-yB-16Mv13dDWZClX99zDPQ.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-test-utils-2.9.0-SNAPSHOT-B0ZvL5qq6hVukekwunlczw.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-io-kafka-2.9.0-SNAPSHOT-TexPdsf-3_MSHcyKmhfDFQ.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-core-2.9.0-SNAPSHOT-JVva4Y2DcC98HPkb8V2zSA.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-direct-java-2.9.0-SNAPSHOT-VUvL2X6F1U2zXVzlafbzZQ.jar
Nov 14, 2018 7:49:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-core-construction-java-2.9.0-SNAPSHOT-r1ldP4EQmqY10-jb7eCgSA.jar
Nov 14, 2018 7:49:32 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 

Build failed in Jenkins: beam_PreCommit_Java_Cron #587

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-5813] wait for the refresh of ES indices in all test utils 
that do

--
[...truncated 52.40 MB...]
LogicalAggregate(group=[{0, 1}], num=[COUNT()])
  LogicalProject(auction=[$0], $f1=[HOP($3, 5000, 1)])
BeamIOSourceRel(table=[[beam, Bid]])

Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..4=[{inputs}], proj#0..1=[{exprs}])
  BeamJoinRel(condition=[AND(=($2, $4), >=($1, $3))], joinType=[inner])
BeamCalcRel(expr#0..2=[{inputs}], auction=[$t0], num=[$t2], 
starttime=[$t1])
  BeamAggregationRel(group=[{0, 1}], num=[COUNT()], 
window=[SlidingWindows($1, PT5S, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], $f1=[$t3])
  BeamIOSourceRel(table=[[beam, Bid]])
BeamCalcRel(expr#0..1=[{inputs}], maxnum=[$t1], starttime=[$t0])
  BeamAggregationRel(group=[{1}], maxnum=[MAX($0)])
BeamCalcRel(expr#0..2=[{inputs}], num=[$t2], starttime=[$t1])
  BeamAggregationRel(group=[{0, 1}], num=[COUNT()], 
window=[SlidingWindows($1, PT5S, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], $f1=[$t3])
  BeamIOSourceRel(table=[[beam, Bid]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery3Test > 
testJoinsPeopleWithAuctions STANDARD_ERROR
Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `P`.`name`, `P`.`city`, `P`.`state`, `A`.`id`
FROM `beam`.`Auction` AS `A`
INNER JOIN `beam`.`Person` AS `P` ON `A`.`seller` = `P`.`id`
WHERE `A`.`category` = 10 AND (`P`.`state` = 'OR' OR `P`.`state` = 'ID' OR 
`P`.`state` = 'CA')
Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(name=[$11], city=[$14], state=[$15], id=[$0])
  LogicalFilter(condition=[AND(=($8, 10), OR(=($15, 'OR'), =($15, 'ID'), 
=($15, 'CA')))])
LogicalJoin(condition=[=($7, $10)], joinType=[inner])
  BeamIOSourceRel(table=[[beam, Auction]])
  BeamIOSourceRel(table=[[beam, Person]])

Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..17=[{inputs}], name=[$t11], city=[$t14], state=[$t15], 
id=[$t0])
  BeamJoinRel(condition=[=($7, $10)], joinType=[inner])
BeamCalcRel(expr#0..9=[{inputs}], expr#10=[10], expr#11=[=($t8, $t10)], 
proj#0..9=[{exprs}], $condition=[$t11])
  BeamIOSourceRel(table=[[beam, Auction]])
BeamCalcRel(expr#0..7=[{inputs}], expr#8=['OR'], expr#9=[=($t5, $t8)], 
expr#10=['ID'], expr#11=[=($t5, $t10)], expr#12=['CA'], expr#13=[=($t5, $t12)], 
expr#14=[OR($t9, $t11, $t13)], proj#0..7=[{exprs}], $condition=[$t14])
  BeamIOSourceRel(table=[[beam, Person]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery7Test > testBids STANDARD_ERROR
Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`
FROM (SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE_START(`B`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B`
GROUP BY `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE(`B`.`dateTime`, INTERVAL '10' SECOND)) AS `B`
INNER JOIN (SELECT MAX(`B1`.`price`) AS `maxprice`, 
TUMBLE_START(`B1`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B1`
GROUP BY TUMBLE(`B1`.`dateTime`, INTERVAL '10' SECOND)) AS `B1` ON 
`B`.`starttime` = `B1`.`starttime` AND `B`.`price` = `B1`.`maxprice`
Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], 
extra=[$4])
  LogicalJoin(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], 
extra=[$4], starttime=[$5])
  LogicalAggregate(group=[{0, 1, 2, 3, 4, 5}])
LogicalProject(auction=[$0], price=[$2], bidder=[$1], 
dateTime=[$3], extra=[$4], $f5=[TUMBLE($3, 1)])
  BeamIOSourceRel(table=[[beam, Bid]])
LogicalProject(maxprice=[$1], starttime=[$0])
  LogicalAggregate(group=[{0}], maxprice=[MAX($1)])
LogicalProject($f0=[TUMBLE($3, 1)], price=[$2])
  BeamIOSourceRel(table=[[beam, Bid]])

Nov 14, 2018 6:18:41 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #93

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-5813] wait for the refresh of ES indices in all test utils 
that do

--
[...truncated 18.19 MB...]
Nov 14, 2018 6:42:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.651Z: Unzipping flatten s43-u80 for input 
s45-reify-value58-c78
Nov 14, 2018 6:42:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.703Z: Fusing unzipped copy of 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write, through flatten s43-u80, 
into producer PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 14, 2018 6:42:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.753Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify into 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign
Nov 14, 2018 6:42:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.799Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write into 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 14, 2018 6:42:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.848Z: Fusing consumer 
Create123/Read(CreateSource)/ParDo(ReadFromBoundedSource) into 
Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 14, 2018 6:42:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.900Z: Fusing consumer PAssert$27/GroupGlobally/Window.Into()/Window.Assign into OutputSideInputs
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.940Z: Fusing consumer Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:56.994Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/GroupByWindow into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.041Z: Fusing consumer Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.098Z: Fusing consumer Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into Create123/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.197Z: Fusing consumer PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign into PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.255Z: Fusing consumer PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource) into PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.292Z: Fusing consumer PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map into PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.344Z: Fusing consumer PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 14, 2018 6:42:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:41:57.376Z: Fusing consumer PAssert$27/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1028

2018-11-14 Thread Apache Jenkins Server
See 


--
[...truncated 249.60 KB...]
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:10.836Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey/Reify into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey+Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues/Partial
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:10.872Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey+Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues/Partial into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:10.907Z: Fusing consumer Query7.Format into Query7/Query7.Stamp
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:10.936Z: Fusing consumer Query7/Query7.Stamp into Query7/Query7.Debug
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:10.974Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.007Z: Fusing consumer Query7/Query7/justBids/IsBid/ParDo(Anonymous) into Query7/Query7.Snoop
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.038Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey/Read
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.072Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map into Query7/Query7/BidToPrice
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.122Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues/Extract into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.159Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey/Reify into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey+Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues/Partial
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.206Z: Fusing consumer Query7/Query7/justBids/AsBid into Query7/Query7/justBids/IsBid/ParDo(Anonymous)
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.243Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce into Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
Nov 14, 2018 6:04:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-14T18:04:11.279Z: Fusing consumer Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map into 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1029

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[lcwik] Remove some unnecessary catch blocks (#7034)

--
[...truncated 263.45 KB...]
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 14, 2018 6:39:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading  to gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-75nu4ZR7YTGUBfCO8vrYuA.jar
Nov 14, 2018 6:39:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading  to gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-75nu4ZR7YTGUBfCO8vrYuA.jar
Nov 14, 2018 6:39:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading  to gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-75nu4ZR7YTGUBfCO8vrYuA.jar
Nov 14, 2018 6:39:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map as step s8
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce as step s9
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey as step s10
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues as step s11
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map as step s12
Nov 14, 2018 6:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Jenkins build is back to normal : beam_PostCommit_Java_PortabilityApi_GradleBuild #157

2018-11-14 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1030

2018-11-14 Thread Apache Jenkins Server
See 


Changes:

[mxm] ParDoTest: Check expected errors inline instead of after pipeline

[mxm] Add output collection id to timer specs.

[mxm] Move doFn wrapper initialization to open()

[mxm] [BEAM-6009] Add note about default mode for PortableValidatesRunner

[mxm] [BEAM-4681] Add support for portable timers in Flink streaming mode

--
[...truncated 246.25 KB...]
  "errors" : [ {
"domain" : "usageLimits",
"message" : "The total number of changes to the object temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-Rc_wgLdGSia4wa6op9S8Fw.jar exceeds the rate limit. Please reduce the rate of create, update, and delete requests.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-Rc_wgLdGSia4wa6op9S8Fw.jar exceeds the rate limit. Please reduce the rate of create, update, and delete requests."
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
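The rateLimitExceeded error above is GCS rejecting rapid repeated writes to the same staging object; the PackageUtil log lines that follow show the upload simply being retried. As an illustration only (this is not Beam's actual retry code), a minimal exponential-backoff sketch in Java; the `withBackoff` helper and the simulated upload are hypothetical:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.atomic.AtomicInteger;

public class BackoffRetry {
    // Hypothetical helper: retry an action with exponential backoff.
    // Beam's PackageUtil has its own backoff logic; this only sketches the idea.
    static <T> T withBackoff(Callable<T> action, int attempts, long baseMillis) throws Exception {
        Exception last = null;
        for (int i = 0; i < attempts; i++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e;
                // Wait baseMillis, 2*baseMillis, 4*baseMillis, ... between tries.
                Thread.sleep(baseMillis << i);
            }
        }
        throw last;
    }

    // Simulated upload that is rate-limited twice before succeeding.
    static String demo() throws Exception {
        AtomicInteger calls = new AtomicInteger();
        String result = withBackoff(() -> {
            if (calls.incrementAndGet() < 3) {
                throw new RuntimeException("429: rateLimitExceeded");
            }
            return "uploaded";
        }, 5, 10L);
        return result + " after " + calls.get() + " attempts";
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo()); // uploaded after 3 attempts
    }
}
```

In a real client, the backoff would also add jitter and retry only on retryable status codes (429 and 5xx), which is what GCS's error message is asking callers to do.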

Nov 14, 2018 6:56:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading  to gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-Rc_wgLdGSia4wa6op9S8Fw.jar
Nov 14, 2018 6:56:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map as step s8
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce as step s9
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey as step s10
Nov 14, 2018 6:56:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues as step s11
Nov 14, 2018 

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #94

2018-11-14 Thread Apache Jenkins Server
See 


