Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #230

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 18.90 MB...]
INFO: Uploading 
/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/paranamer-2.7-VweilzYySf_-OOgYnNb5yw.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-context/1.9.0/28b0836f48c9705abf73829bbc536dba29a1329a/grpc-context-1.9.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/grpc-context-1.9.0-atYmq4r_lax8i7-0BRHcCg.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-handler-proxy/4.1.8.Final/netty-handler-proxy-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/netty-handler-proxy-4.1.8.Final-Zey48Fj4mlWtgpwIc7Osgg.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-socks/4.1.8.Final/netty-codec-socks-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/netty-codec-socks-4.1.8.Final-dfbCpCpgkEl1iwscd6OVeQ.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-http/4.1.8.Final/netty-codec-http-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/netty-codec-http-4.1.8.Final-uBsfxiZeL3IeVQNS0QTHUw.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/objenesis/objenesis/2.6/objenesis-2.6.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/objenesis-2.6-X_rD9RQFypspFZcKIks-jw.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-resolver/4.1.8.Final/netty-resolver-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/netty-resolver-4.1.8.Final-ZBELwscrxRZgRR5DKH9siA.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/commons-logging-1.2-BAtLTY6siG9rSio70vMbAA.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/mockito/mockito-core/1.9.5/mockito-core-1.9.5.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/mockito-core-1.9.5-b3PPBKVutgqqmWUG58EPxw.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec/4.1.8.Final/netty-codec-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/netty-codec-4.1.8.Final-Og5DJ4MB-fn1oSzjKqVU_w.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.16.1/commons-compress-1.16.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/commons-compress-1.16.1-NAljjWtr0jBC7qxc2X4lbA.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/code/gson/gson/2.7/gson-2.7.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/gson-2.7-UTSiNQ9YiQ_7nbC0AEcZXQ.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/commons-codec/commons-codec/1.3/commons-codec-1.3.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430064408-82e91690/output/results/staging/commons-codec-1.3-jhScEFN0HANzalLfg5dNzA.jar
Apr 30, 2018 6:44:09 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/json/json/20160810/

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #229

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 19.39 MB...]

org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > 
testE2EV1ReadWithGQLQueryWithNoLimit STANDARD_ERROR
Apr 30, 2018 6:18:24 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 30, 2018 6:18:24 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <8772 bytes, hash VDvh5wb0lzo2oAsQ4v17MQ> to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430061817-67e927eb/output/results/staging/pipeline-VDvh5wb0lzo2oAsQ4v17MQ.pb

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > 
testE2EV1ReadWithGQLQueryWithNoLimit STANDARD_ERROR
Apr 30, 2018 6:18:25 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 30, 2018 6:18:25 AM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 30, 2018 6:18:25 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil 
deleteAllEntities
INFO: Successfully deleted 1000 entities

org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > 
testSplitQueryFnWithLargeDataset STANDARD_ERROR
Apr 30, 2018 6:18:25 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.
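For reference, the TestPipeline style this warning recommends looks roughly like the sketch below; MyDoFn is a placeholder for the DoFn under test, assumed here to double its integer input.

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class MyDoFnTest {
      // TestPipeline runs the DoFn inside a real (local) pipeline,
      // unlike DoFnTester which invokes it directly.
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void testMyDoFn() {
        PCollection<Integer> output =
            p.apply(Create.of(1, 2, 3)).apply(ParDo.of(new MyDoFn())); // MyDoFn is hypothetical
        PAssert.that(output).containsInAnyOrder(2, 4, 6);
        p.run().waitUntilFinish();
      }
    }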

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 30, 2018 6:18:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-29_23_18_25-11109540548031537450?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_OUT
Submitted job: 2018-04-29_23_18_25-11109540548031537450

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 30, 2018 6:18:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2018-04-29_23_18_25-11109540548031537450
Apr 30, 2018 6:18:25 AM org.apache.beam.runners.dataflow.TestDataflowRunner 
run
INFO: Running Dataflow job 2018-04-29_23_18_25-11109540548031537450 with 0 
expected assertions.

org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > 
testSplitQueryFnWithLargeDataset STANDARD_ERROR
Apr 30, 2018 6:18:26 AM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
INFO: Latest stats timestamp for kind sort_1G is 152490139000
Apr 30, 2018 6:18:26 AM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
INFO: Estimated size bytes for the query is: 213000
Apr 30, 2018 6:18:26 AM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn 
processElement
INFO: Splitting the query into 32 splits

org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > 
testSplitQueryFnWithSmallDataset STANDARD_ERROR
Apr 30, 2018 6:18:26 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. 
Please use TestPipeline instead.
Apr 30, 2018 6:18:26 AM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
INFO: Latest stats timestamp for kind shakespeare is 152490139000
Apr 30, 2018 6:18:26 AM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
INFO: Estimated size bytes for the query is: 26383451
Apr 30, 2018 6:18:26 AM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn 
processElement
INFO: Splitting the query into 12 splits

Gradle Test Executor 121 finished executing tests.

> Task 
> :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
java.lang.NoClassDefFoundError: 
com/google/api/gax/retrying/ExceptionRetryAlgorithm
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at
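The NoClassDefFoundError above typically indicates a classpath version conflict: some jar on the test classpath was compiled against a gax version that provided com.google.api.gax.retrying.ExceptionRetryAlgorithm, while the gax version that won dependency resolution does not. A one-off probe (purely illustrative, not part of the build) shows which jar a suspect class actually resolves from; if it instead throws ClassNotFoundException, the class is absent from the classpath entirely, matching this failure.

    // Illustrative diagnostic: print the jar a class is loaded from.
    public class WhichJar {
      public static void main(String[] args) throws Exception {
        // e.g. args[0] = "com.google.api.gax.retrying.ExceptionRetryAlgorithm"
        Class<?> c = Class.forName(args[0]);
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
      }
    }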

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #112

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 887.44 KB...]
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.api:api-common:jar:1.0.0-rc2 from the shaded jar.
[INFO] Excluding com.google.api:gax:jar:1.3.1 from the shaded jar.
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the 
shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding 
com.google.apis:google-api-services-pubsub:jar:v1-rev382-1.23.0 from the shaded 
jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the 
shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.23.0 
from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded 
jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 
from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded 
jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding 
com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from 
the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 
from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from 
the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from 
the shaded jar.
[INFO] Excluding commons-logging:commons-logging:jar:1.2 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from 
the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the 
shaded jar.
[INFO] Excluding io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded 
jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 
from the shaded jar.
[INFO] Excluding 
com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:j

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #111

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 1.06 MB...]
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at 
com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:118)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.
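The root cause in this trace is the nested java.net.ConnectException: Connection refused: the HDFS namenode that the test's fs.defaultFS points at was not accepting connections when FileBasedSink tried to open a temp file. A standalone probe (host and port below are placeholders for the configured namenode address) separates cluster reachability from Beam itself.

    import java.net.InetSocketAddress;
    import java.net.Socket;

    // Standalone reachability probe for the namenode RPC port.
    public class NamenodeProbe {
      public static void main(String[] args) throws Exception {
        try (Socket s = new Socket()) {
          // Placeholder host/port; use the address from fs.defaultFS.
          s.connect(new InetSocketAddress("namenode.example.com", 8020), 5000);
          System.out.println("namenode reachable");
        }
      }
    }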

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #110

2018-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #117

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 1.78 KB...]
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7661041291736264997.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7309902636461270751.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1525064473025
namespace "filebasedioithdfs-1525064473025" created
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7220929133661677761.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1525064473025
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4513463986908057174.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7912845215073243883.sh
+ rm -rf .env
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6604210043133616776.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7480360600346992700.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3829379328669399479.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6440737165207550989.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1

Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #203

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 1.65 KB...]
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins6523027832096484469.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a 
--verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins3470344049052466922.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins6536197247370251554.sh
+ kubectl 
--kubeconfig=
 create namespace hadoopinputformatioit-1525064484996
namespace "hadoopinputformatioit-1525064484996" created
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins1239471699989309746.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=hadoopinputformatioit-1525064484996
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins6114281082745240251.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins2261807486001307506.sh
+ rm -rf .env
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins3675078791806847385.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins2310127395822483221.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins7629945608037433777.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_HadoopInputFormat] $ /bin/bash -xe 
/tmp/jenkins2839604064157817270.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
ht

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #111

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 1.80 KB...]
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2153563429860718499.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6756177850207221272.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1525064473059
namespace "filebasedioithdfs-1525064473059" created
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8300349701189951723.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1525064473059
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5817842736416498134.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3206005458130393711.sh
+ rm -rf .env
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6904816998998557771.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3534532328187586159.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8719765742881969539.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins9219120378298336468.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pyw

Build failed in Jenkins: beam_PerformanceTests_Python #1210

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
[...truncated 4.33 KB...]
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15)) (1.0)
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18.4)
Collecting xmltodict (from pywinrm->-r PerfKitBenchmarker/requirements.txt 
(line 25))
  Using cached 
https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Requirement already satisfied: cryptography>=1.3 in 
/usr/local/lib/python2.7/dist-packages (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.2.2)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/69/bc/230987c0dc22c763529330b2e669dbdba374d6a10c1f61232274184731be/ntlm_auth-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: certifi>=2017.4.17 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2018.4.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.22)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != 
"PyPy" in /usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.11.5)
Requirement already satisfied: enum34; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (0.24.0)
Requirement already satisfied: ipaddress; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.0.22)
Requirement already satisfied: pycparser in 
/usr/local/lib/python2.7/dist-packages (from cffi>=1.7; 
platform_python_implementation != 
"PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
Installing collected packages: absl-py, colorama, colorlog, blinker, futures, 
pint, numpy, contextlib2, ntlm-auth, requests-ntlm, xmltodict, pywinrm
Successfully installed absl-py-0.2.0 blinker-1.4 colorama-0.3.9 colorlog-2.6.0 
contextlib2-0.5.5 futures-3.2.0 ntlm-auth-1.1.0 numpy-1.13.3 pint-0.8.1 
pywinrm-0.3.0 requests-ntlm-1.1.0 xmltodict-0.11.0
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1598592255176044864.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT #437

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e0aac4eabd23446083ae8aaa0caf48b4c26777d7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0aac4eabd23446083ae8aaa0caf48b4c26777d7
Commit message: "[BEAM-3973] Adds a parameter to the Cloud Spanner read 
connector that can disable batch API (#4946)"
 > git rev-list --no-walk 6a5e5d96402b676d18dfcb771d3e533a51063690 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins4122389141609535345.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins2596247856924314275.sh
+ rm -rf .env
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins2124856150463338767.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins5026437607192371757.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins8684645485531582788.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Compressed_TextIOIT] $ /bin/bash -xe 
/tmp/jenkins4038350656789775033.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarke

[jira] [Work logged] (BEAM-3981) Futurize and fix python 2 compatibility for coders package

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3981?focusedWorklogId=96548&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96548
 ]

ASF GitHub Bot logged work on BEAM-3981:


Author: ASF GitHub Bot
Created on: 30/Apr/18 05:36
Start Date: 30/Apr/18 05:36
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #5053: [BEAM-3981] 
Futurize coders subpackage
URL: https://github.com/apache/beam/pull/5053#issuecomment-385316601
 
 
   I have tried to revert several changes that might trigger a regression 
(hash functions, a dropped typing annotation in the Cython code, explicit 
encode() calls, and a recent commit). I also started a suite with wider 
reverts, and I plan to take a look at the generated Cython code, per 
robertwb's suggestion. The suite I am currently running takes around 8 hours 
to complete, which slows the feedback loop significantly. If I still don't see 
the culprit tomorrow, I might try to reproduce the regression in a 
microbenchmark that specifically targets the code we are modifying, to shorten 
the feedback loop. There are not that many things left to revert.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96548)
Time Spent: 15h 10m  (was: 15h)

> Futurize and fix python 2 compatibility for coders package
> --
>
> Key: BEAM-3981
> URL: https://issues.apache.org/jira/browse/BEAM-3981
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Robbe
>Assignee: Robbe
>Priority: Major
>  Time Spent: 15h 10m
>  Remaining Estimate: 0h
>
> Run automatic conversion with the futurize tool on the coders subpackage and 
> fix Python 2 compatibility. This prepares the subpackage for Python 3 support.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3246) BigtableIO should merge splits if they exceed 15K

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3246?focusedWorklogId=96549&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96549
 ]

ASF GitHub Bot logged work on BEAM-3246:


Author: ASF GitHub Bot
Created on: 30/Apr/18 05:36
Start Date: 30/Apr/18 05:36
Worklog Time Spent: 10m 
  Work Description: chamikaramj commented on issue #4517: [BEAM-3246] 
Bigtable: Merge splits if they exceed 15K
URL: https://github.com/apache/beam/pull/4517#issuecomment-385316670
 
 
   Retest this please


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96549)
Time Spent: 5h  (was: 4h 50m)

> BigtableIO should merge splits if they exceed 15K
> -
>
> Key: BEAM-3246
> URL: https://issues.apache.org/jira/browse/BEAM-3246
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Reporter: Solomon Duskis
>Assignee: Solomon Duskis
>Priority: Major
>  Time Spent: 5h
>  Remaining Estimate: 0h
>
> A customer hit a problem with a large number of splits. CloudBigtableIO fixes 
> that here 
> https://github.com/GoogleCloudPlatform/cloud-bigtable-client/blob/master/bigtable-dataflow-parent/bigtable-hbase-beam/src/main/java/com/google/cloud/bigtable/beam/CloudBigtableIO.java#L241
> BigtableIO should have similar logic.
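For illustration, the merging could be as simple as grouping adjacent splits until the count fits under the cap. A sketch only, under the assumption that adjacent splits can be combined freely; this is not the actual BigtableIO code.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative only: bucket an oversized split list into at most maxSplits
    // groups of adjacent ranges; each group can then be scanned sequentially.
    class SplitMerger {
      static <T> List<List<T>> mergeToAtMost(List<T> splits, int maxSplits) {
        int groupSize = (splits.size() + maxSplits - 1) / maxSplits; // ceiling division
        List<List<T>> merged = new ArrayList<>();
        for (int i = 0; i < splits.size(); i += groupSize) {
          merged.add(new ArrayList<>(
              splits.subList(i, Math.min(i + groupSize, splits.size()))));
        }
        return merged;
      }
    }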



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3973) Allow to disable batch API in SpannerIO

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3973?focusedWorklogId=96547&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96547
 ]

ASF GitHub Bot logged work on BEAM-3973:


Author: ASF GitHub Bot
Created on: 30/Apr/18 05:34
Start Date: 30/Apr/18 05:34
Worklog Time Spent: 10m 
  Work Description: chamikaramj closed pull request #4946: [BEAM-3973] Adds 
a parameter to the Cloud Spanner read connector that can disable batch API
URL: https://github.com/apache/beam/pull/4946
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/NaiveSpannerRead.java b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/NaiveSpannerRead.java
new file mode 100644
index 000..3b68d9f91a8
--- /dev/null
+++ b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/NaiveSpannerRead.java
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.gcp.spanner;
+
+import com.google.auto.value.AutoValue;
+import com.google.cloud.spanner.BatchReadOnlyTransaction;
+import com.google.cloud.spanner.ResultSet;
+import com.google.cloud.spanner.Struct;
+import com.google.cloud.spanner.TimestampBound;
+import com.google.common.annotations.VisibleForTesting;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ParDo;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.PCollectionView;
+
+/** A naive version of Spanner read that doesn't use the Batch API. */
+@VisibleForTesting
+@AutoValue
+abstract class NaiveSpannerRead
+    extends PTransform<PCollection<ReadOperation>, PCollection<Struct>> {
+
+  public static NaiveSpannerRead create(SpannerConfig spannerConfig,
+      PCollectionView<Transaction> txView, TimestampBound timestampBound) {
+    return new AutoValue_NaiveSpannerRead(spannerConfig, txView, timestampBound);
+  }
+
+  abstract SpannerConfig getSpannerConfig();
+
+  @Nullable
+  abstract PCollectionView<Transaction> getTxView();
+
+  abstract TimestampBound getTimestampBound();
+
+  @Override
+  public PCollection<Struct> expand(PCollection<ReadOperation> input) {
+    PCollectionView<Transaction> txView = getTxView();
+    if (txView == null) {
+      Pipeline begin = input.getPipeline();
+      SpannerIO.CreateTransaction createTx = SpannerIO.createTransaction()
+          .withSpannerConfig(getSpannerConfig()).withTimestampBound(getTimestampBound());
+      txView = begin.apply(createTx);
+    }
+
+    return input.apply("Naive read from Cloud Spanner",
+        ParDo.of(new NaiveSpannerReadFn(getSpannerConfig(), txView)).withSideInputs(txView));
+  }
+
+  private static class NaiveSpannerReadFn extends DoFn<ReadOperation, Struct> {
+
+    private final SpannerConfig config;
+    @Nullable private final PCollectionView<Transaction> txView;
+    private transient SpannerAccessor spannerAccessor;
+
+    NaiveSpannerReadFn(SpannerConfig config, @Nullable PCollectionView<Transaction> transaction) {
+      this.config = config;
+      this.txView = transaction;
+    }
+
+    @Setup
+    public void setup() throws Exception {
+      spannerAccessor = config.connectToSpanner();
+    }
+
+    @Teardown
+    public void teardown() throws Exception {
+      spannerAccessor.close();
+    }
+
+    @ProcessElement
+    public void processElement(ProcessContext c) throws Exception {
+      Transaction tx = c.sideInput(txView);
+      ReadOperation op = c.element();
+      BatchReadOnlyTransaction context = spannerAccessor.getBatchClient()
+          .batchReadOnlyTransaction(tx.transactionId());
+      try (ResultSet resultSet = execute(op, context)) {
+        while (resultSet.next()) {
+          c.output(resultSet.getCurrentRowAsStruct());
+        }
+      }
+    }
+
+    private ResultSet exe
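
Per the PR title, the new parameter lets a Cloud Spanner read bypass the Batch API and take the NaiveSpannerRead path above. A hedged usage sketch follows: instance and database ids are placeholders, and the toggle is assumed to be named withBatching, following the connector's builder style.

    // Hedged usage sketch; ids and query are placeholders.
    PCollection<Struct> rows = p.apply(
        SpannerIO.read()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withQuery("SELECT id, name FROM users")
            .withBatching(false)); // take the naive (non-Batch-API) read path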

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1477

2018-04-29 Thread Apache Jenkins Server
See 


Changes:

[chamikara] [BEAM-3973] Adds a parameter to the Cloud Spanner read connector 
that

--
Started by GitHub push by chamikaramj
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e0aac4eabd23446083ae8aaa0caf48b4c26777d7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0aac4eabd23446083ae8aaa0caf48b4c26777d7
Commit message: "[BEAM-3973] Adds a parameter to the Cloud Spanner read 
connector that can disable batch API (#4946)"
 > git rev-list --no-walk 6a5e5d96402b676d18dfcb771d3e533a51063690 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe 
/tmp/jenkins6068170904368626212.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in /usr/lib/python2.7/dist-packages 
(15.0.1)

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
sdks/python/run_validatesrunner.sh: line 38: 
/home/jenkins/.local/bin//virtualenv: No such file or directory
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user szewi...@gmail.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aal...@gmail.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user katarzyna.kucharc...@polidea.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user 
re...@relax-macbookpro2.roam.corp.google.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user ro...@frantil.com


[jira] [Updated] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-29 Thread Raghu Angadi (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4038?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Raghu Angadi updated BEAM-4038:
---
Priority: Major  (was: Minor)

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Geet Kumar
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 9h 10m
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-4038) Support Kafka Headers in KafkaIO

2018-04-29 Thread Raghu Angadi (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-4038?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16458325#comment-16458325
 ] 

Raghu Angadi commented on BEAM-4038:


The interfaces with function callbacks are problematic, since we need to decide 
what context to provide to the function. E.g. {{KafkaPublishTimestampFunction}} 
provides {{elementTimestamp}}. Why? Because we think that is probably what the 
user wants to know along with the KV. Recently we deprecated the old timestamp 
functions for the reader in order to support watermarks better. At least in the 
case of the reader, there is no alternative to a function callback, since we 
need to set the watermark/timestamp _before_ the user gets to see the record. 

Long story short, I think it is better to avoid another function just to set 
headers. It will be a similar story when more fields are added to 
{{KafkaRecord}}. In fact, I think we should remove 
{{KafkaPublishTimestampFunction}}. 

How about adding {{KafkaIO.writeRecords()}}, a 
{{PTransform<PCollection<ProducerRecord>, PDone>}}? This way the user builds 
the {{ProducerRecord}} any way they see fit. We can provide an Avro coder for 
Kafka's {{ProducerRecord}}. We can handle older Kafka versions by ignoring 
fields that are not present in old versions. We can add a coder very similar 
to {{KafkaRecordCoder}}. 

This is more work than adding a function, but I think it improves the 
flexibility of the writer now and for the future. 
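
A hedged sketch of what such a writeRecords() could look like, extrapolated from the existing KafkaIO.write() builder style; every method name here is an assumption, since the API is only being proposed in this comment.

    // Hypothetical shape only: writeRecords() does not exist at this point.
    // The user sets topic, partition, timestamp, and headers directly on each
    // ProducerRecord, so no extra callback interface is needed (a coder for
    // ProducerRecord would have to be provided, as noted above).
    PCollection<ProducerRecord<String, String>> records =
        p.apply(Create.of(new ProducerRecord<>("events", "key", "value")));
    records.apply(
        KafkaIO.<String, String>writeRecords()       // assumed entry point
            .withBootstrapServers("broker:9092")     // placeholder broker
            .withKeySerializer(StringSerializer.class)
            .withValueSerializer(StringSerializer.class));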

 

 

> Support Kafka Headers in KafkaIO
> 
>
> Key: BEAM-4038
> URL: https://issues.apache.org/jira/browse/BEAM-4038
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-kafka
>Reporter: Geet Kumar
>Assignee: Geet Kumar
>Priority: Minor
> Fix For: 2.5.0
>
>  Time Spent: 9h 10m
>  Remaining Estimate: 0h
>
> Headers have been added to Kafka Consumer/Producer records (KAFKA-4208). The 
> purpose of this JIRA is to support this feature in KafkaIO.  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (BEAM-4134) Fix : Potential process leak

2018-04-29 Thread Reza ardeshir rokni (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Reza ardeshir rokni updated BEAM-4134:
--
Priority: Minor  (was: Major)

> Fix : Potential process leak 
> -
>
> Key: BEAM-4134
> URL: https://issues.apache.org/jira/browse/BEAM-4134
> Project: Beam
>  Issue Type: Bug
>  Components: examples-java
>Affects Versions: 2.5.0
>Reporter: Reza ardeshir rokni
>Assignee: Reuven Lax
>Priority: Minor
>
> Need to check for resource leak as reported by [~reuvenlax]
> [https://github.com/apache/beam/blob/master/examples/java/src/main/java/org/apache/beam/examples/subprocess/kernel/SubProcessKernel.java#L154]
>  
>  
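
For reference, a generic sketch of the usual pattern for avoiding leaked 
subprocess handles; this is not the actual SubProcessKernel fix, and the 
binary name and timeout are placeholders:

{code:java}
// Generic pattern, not the project's actual fix: release the process and
// its stdin/stdout/stderr pipes on every path, including exceptions.
Process process = new ProcessBuilder("some-binary").start();   // placeholder
try {
  if (!process.waitFor(60, TimeUnit.SECONDS)) {                // placeholder timeout
    throw new IllegalStateException("subprocess timed out");
  }
} finally {
  process.destroy();  // closes the pipes and reaps the process
}
{code}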



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-3851) Support element timestamps while publishing to Kafka.

2018-04-29 Thread Raghu Angadi (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3851?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Raghu Angadi resolved BEAM-3851.

Resolution: Fixed

> Support element timestamps while publishing to Kafka.
> -
>
> Key: BEAM-3851
> URL: https://issues.apache.org/jira/browse/BEAM-3851
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-kafka
>Affects Versions: 2.3.0
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>Priority: Major
> Fix For: 2.5.0
>
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> The KafkaIO sink should support using the input element's timestamp for the 
> message published to Kafka. Otherwise there is no way for the user to 
> influence the timestamp of the messages in the Kafka sink.
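
A minimal usage sketch, assuming the setter is named 
{{withPublishTimestampFunction()}} and takes the 
{{KafkaPublishTimestampFunction}} discussed elsewhere in this JIRA (both names 
should be checked against the merged PRs); broker and topic are placeholders:

{code:java}
// Sketch: publish each message to Kafka with the Beam element's timestamp.
// kvs is a PCollection<KV<Long, String>>; the setter name is an assumption.
kvs.apply(
    KafkaIO.<Long, String>write()
        .withBootstrapServers("broker:9092")       // placeholder
        .withTopic("my-topic")                     // placeholder
        .withKeySerializer(LongSerializer.class)
        .withValueSerializer(StringSerializer.class)
        .withPublishTimestampFunction(
            (element, elementTimestamp) -> elementTimestamp));
{code}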



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (BEAM-591) Better handling of watermark in KafkaIO

2018-04-29 Thread Raghu Angadi (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Raghu Angadi resolved BEAM-591.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Better handling of watermark in KafkaIO
> ---
>
> Key: BEAM-591
> URL: https://issues.apache.org/jira/browse/BEAM-591
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-kafka
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 3h 40m
>  Remaining Estimate: 0h
>
> Right now the default watermark in KafkaIO is the same as the timestamp of 
> the record. The main problem with this is that the watermark does not change 
> if there aren't any new records on the topic. This can hold up many open 
> windows. 
> The record timestamp by default is set to processing time (i.e. when the 
> runner reads a record from the Kafka reader).
> A user can provide functions to calculate the watermark and record 
> timestamps. There are a few concerns with the current design:
> * What should happen when a Kafka topic is idle:
>   ** In the default case, I think the watermark should advance to the 
> current time.
>   ** What should happen when the user has provided a function to calculate 
> the record timestamp? 
>    *** Should the watermark stay the same as the record timestamp?
>    *** The same when the user has provided their own watermark function? 
> * Are the current semantics of the user-provided watermark function correct?
>   ** -It is run once for each record read-.
>   ** -Should it instead be run inside {{getWatermark()}} called by the runner 
> (we could still provide the last user record and its timestamp)-.
>   ** It does run inside {{getWatermark()}}. Should we pass the current record 
> timestamp in addition to the record?
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-591) Better handling of watermark in KafkaIO

2018-04-29 Thread Raghu Angadi (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16458307#comment-16458307
 ] 

Raghu Angadi commented on BEAM-591:
---

See the following methods added to `KafkaIO.Read` in the 3 pull requests 
attached to this JIRA (a usage sketch follows the list):
 * {{withLogAppendTime()}}
 * {{withCreateTime()}}
 * {{withProcessingTime()}}
 * {{withTimestampPolicyFactory()}}
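
A minimal sketch of the new API, with placeholder broker/topic values and 
Kafka's standard deserializers:

{code:java}
// Sketch: use broker-assigned log-append time as the record timestamp,
// letting KafkaIO derive the watermark from it. Broker and topic are
// placeholders.
PCollection<KafkaRecord<Long, String>> records =
    p.apply(
        KafkaIO.<Long, String>read()
            .withBootstrapServers("broker:9092")
            .withTopic("my-topic")
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withLogAppendTime());
// A custom policy plugs in via .withTimestampPolicyFactory(factory).
{code}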


> Better handling of watermark in KafkaIO
> ---
>
> Key: BEAM-591
> URL: https://issues.apache.org/jira/browse/BEAM-591
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-kafka
>Reporter: Raghu Angadi
>Assignee: Raghu Angadi
>Priority: Major
> Fix For: 2.4.0
>
>  Time Spent: 3h 40m
>  Remaining Estimate: 0h
>
> Right now the default watermark in KafkaIO is the same as the timestamp of 
> the record. The main problem with this is that the watermark does not change 
> if there aren't any new records on the topic. This can hold up many open 
> windows. 
> The record timestamp by default is set to processing time (i.e. when the 
> runner reads a record from the Kafka reader).
> A user can provide functions to calculate the watermark and record 
> timestamps. There are a few concerns with the current design:
> * What should happen when a Kafka topic is idle:
>   ** In the default case, I think the watermark should advance to the 
> current time.
>   ** What should happen when the user has provided a function to calculate 
> the record timestamp? 
>    *** Should the watermark stay the same as the record timestamp?
>    *** The same when the user has provided their own watermark function? 
> * Are the current semantics of the user-provided watermark function correct?
>   ** -It is run once for each record read-.
>   ** -Should it instead be run inside {{getWatermark()}} called by the runner 
> (we could still provide the last user record and its timestamp)-.
>   ** It does run inside {{getWatermark()}}. Should we pass the current record 
> timestamp in addition to the record?
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1476

2018-04-29 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam23 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6a5e5d96402b676d18dfcb771d3e533a51063690 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6a5e5d96402b676d18dfcb771d3e533a51063690
Commit message: "Merge pull request #5238 from matzew/JMS_MESSAGE_ID_NULL"
 > git rev-list --no-walk 6a5e5d96402b676d18dfcb771d3e533a51063690 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe 
/tmp/jenkins4755379011458545891.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in /usr/lib/python2.7/dist-packages 
(15.0.1)

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
sdks/python/run_validatesrunner.sh: line 38: 
/home/jenkins/.local/bin//virtualenv: No such file or directory
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user szewi...@gmail.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aal...@gmail.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user katarzyna.kucharc...@polidea.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user re...@relax-macbookpro2.roam.corp.google.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user ro...@frantil.com


[jira] [Work logged] (BEAM-4186) Need to be able to set QuerySplitter in DatastoreIO.v1()

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-4186?focusedWorklogId=96527&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96527
 ]

ASF GitHub Bot logged work on BEAM-4186:


Author: ASF GitHub Bot
Created on: 30/Apr/18 02:25
Start Date: 30/Apr/18 02:25
Worklog Time Spent: 10m 
  Work Description: fyellin opened a new pull request #5246: [BEAM-4186] 
Enable withQuerySplitter
URL: https://github.com/apache/beam/pull/5246
 
 
   The Jira issue explains the need for this change.
   
   This simply adds withQuerySplitter() to DatastoreV1.Read. The change is 
relatively straightforward. I've also added a test. Because one constructor 
changed arguments, I changed all calls to the constructor.
   
   I did not make changes to boot.go or piputil.go. I'm not sure how they got 
pulled into this, or how to make them go away.
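
   A minimal sketch of the proposed API. withQuerySplitter() is not in a 
released Beam version, the project ID and GQL query are placeholders, and 
createSerializableSplitter() is a hypothetical helper standing in for any 
implementation of both QuerySplitter and Serializable:

{code:java}
// Sketch of the method proposed in this PR; the splitter must be
// Serializable (see the JIRA). createSerializableSplitter() is hypothetical.
QuerySplitter mySplitter = createSerializableSplitter();
DatastoreV1.Read read =
    DatastoreIO.v1().read()
        .withProjectId("my-project")                   // placeholder
        .withLiteralGqlQuery("SELECT * FROM Kind")     // placeholder
        .withQuerySplitter(mySplitter);                // proposed method
{code}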
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96527)
Time Spent: 10m
Remaining Estimate: 0h

> Need to be able to set QuerySplitter in DatastoreIO.v1()
> 
>
> Key: BEAM-4186
> URL: https://issues.apache.org/jira/browse/BEAM-4186
> Project: Beam
>  Issue Type: New Feature
>  Components: io-java-gcp
>Affects Versions: 2.4.0
>Reporter: Frank Yellin
>Assignee: Frank Yellin
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> I want to add a method
>       withQuerySplitter(QuerySplitter querySplitter)
> to DatastoreV1.Reader.  The implementation is fairly straightforward, except 
> for enforcing the requirement that the query splitter must be Serializable 
> for this to work.
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-4187) Make Context.errors public in InsertRetryPolicy

2018-04-29 Thread Devon Biggerstaff (JIRA)
Devon Biggerstaff created BEAM-4187:
---

 Summary: Make Context.errors public in InsertRetryPolicy
 Key: BEAM-4187
 URL: https://issues.apache.org/jira/browse/BEAM-4187
 Project: Beam
  Issue Type: Improvement
  Components: io-java-gcp
Affects Versions: 2.4.0
 Environment: macOS High Sierra running java 9.0.4 and gcloud 
components 199.0.0, using beam java sdk 2.4.0 and dataflow runner
Reporter: Devon Biggerstaff
Assignee: Chamikara Jayalath


Context.errors should be public in sdk.io.gcp.bigquery.InsertRetryPolicy so 
that it is possible for users to create their own InsertRetryPolicy that 
reacts differently to different errors.

Also, "notFound" should probably be in PERSISTENT_ERRORS, the list of 
non-transient errors that InsertRetryPolicy.retryTransientErrors does 
*not* retry infinitely many times.

Finally, it would be great if users could specify a maximum number of retry 
attempts.

This issue may be related to BEAM-3271.
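
A sketch of the kind of policy this would enable, assuming the errors become 
readable; the getInsertErrors() accessor below is an assumption (today the 
field is package-private):

{code:java}
// Hypothetical custom policy: compiles only if Context exposes the insert
// errors, e.g. via the assumed getInsertErrors() accessor.
InsertRetryPolicy retryUnlessNotFound =
    new InsertRetryPolicy() {
      @Override
      public boolean shouldRetry(Context context) {
        if (context.getInsertErrors() != null
            && context.getInsertErrors().getErrors() != null) {
          for (ErrorProto error : context.getInsertErrors().getErrors()) {
            if ("notFound".equals(error.getReason())) {
              return false;  // treat "notFound" as persistent
            }
          }
        }
        return true;  // retry all other errors
      }
    };
{code}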



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #228

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 19.53 MB...]
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/jackson-core-asl-1.9.13-MZxJpDBOP6n-PNjc_ACdNw.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/3.2.0/protobuf-java-3.2.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/protobuf-java-3.2.0-fh30Gescj5k_IhxJOGxriw.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/jackson-mapper-asl-1.9.13-F1D5wzk1L8S3KNYbVxcWEw.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.16.1/commons-compress-1.16.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/commons-compress-1.16.1-NAljjWtr0jBC7qxc2X4lbA.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-handler-proxy/4.1.8.Final/netty-handler-proxy-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/netty-handler-proxy-4.1.8.Final-Zey48Fj4mlWtgpwIc7Osgg.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-handler/4.1.8.Final/netty-handler-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/netty-handler-4.1.8.Final-gZi7R-DTlcEw9CmPWOQBOw.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/error_prone_annotations-2.0.15-npnuyJrjs2dF75GdvUBIvQ.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-context/1.9.0/28b0836f48c9705abf73829bbc536dba29a1329a/grpc-context-1.9.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/grpc-context-1.9.0-atYmq4r_lax8i7-0BRHcCg.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/threeten/threetenbp/1.3.3/threetenbp-1.3.3.jar 
to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/threetenbp-1.3.3-bEXFSgaAYiXSdUtR-98IjQ.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-socks/4.1.8.Final/netty-codec-socks-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/netty-codec-socks-4.1.8.Final-dfbCpCpgkEl1iwscd6OVeQ.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-http/4.1.8.Final/netty-codec-http-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/netty-codec-http-4.1.8.Final-uBsfxiZeL3IeVQNS0QTHUw.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/mockito/mockito-core/1.9.5/mockito-core-1.9.5.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/mockito-core-1.9.5-b3PPBKVutgqqmWUG58EPxw.jar
Apr 30, 2018 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0430005239-f7bad662/output/results/staging/commons-logging-1.2-BAtLTY6siG9rSio70vMbAA.jar
Apr 30, 2018 12:52:40 AM

Jenkins build is back to normal : beam_PerformanceTests_JDBC #513

2018-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #111

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 906.13 KB...]
at java.lang.Thread.run(Thread.java:745)
Caused by: com.mongodb.MongoTimeoutException: Timed out after 3 ms while 
waiting for a server that matches 
ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster 
state is {type=UNKNOWN, servers=[{address=35.226.0.156:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:210)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:482)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:79)
at com.mongodb.Mongo.execute(Mongo.java:772)
at com.mongodb.Mongo$2.execute(Mongo.java:759)
at com.mongodb.OperationIterable.iterator(OperationIterable.java:47)
at com.mongodb.FindIterableImpl.iterator(FindIterableImpl.java:143)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbReader.start(MongoDbIO.java:456)
at 
com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.start(WorkerCustomSources.java:592)
... 14 more
java.io.IOException: Failed to start reading from source: 
org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource@387b69c8
at 
com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.start(WorkerCustomSources.java:595)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation$SynchronizedReaderIterator.start(ReadOperation.java:360)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:193)
at 
com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
at 
com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
at 
com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at 
com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.mongodb.MongoTimeoutException: Timed out after 3 ms while 
waiting for a server that matches 
ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster 
state is {type=UNKNOWN, servers=[{address=35.226.0.156:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:210)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:482)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:79)
at com.mongodb.Mongo.execute(Mongo.java:772)
at com.mongodb.Mongo$2.execute(Mongo.java:759)
at com.mongodb.OperationIterable.iterator(OperationIterable.java:47)
at com.mongodb.FindIterableImpl.iterator(FindIterableImpl

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #110

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 211.17 KB...]
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at 
org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at 
com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:118)
at 
com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at 
com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at 
com.google.cloud.dataflow.worker.SimpleParDoFn

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #110

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 194.58 KB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWith

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #109

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 1.73 KB...]
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6839851398817353723.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8137854498322143335.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1525032061289
namespace "filebasedioithdfs-1525032061289" created
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3093050180341171362.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1525032061289
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2062575812017060606.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2160344351277264702.sh
+ rm -rf .env
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4664279552343982732.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2703353269856492537.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6628071474355961937.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7311969646300941434.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #116

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 904.69 KB...]
[INFO] Excluding org.apache.hadoop:hadoop-yarn-api:jar:2.6.0 from the shaded 
jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-common:jar:2.6.0 from the shaded 
jar.
[INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.12 from the shaded jar.
[INFO] Excluding com.google.inject.extensions:guice-servlet:jar:3.0 from the 
shaded jar.
[INFO] Excluding com.google.inject:guice:jar:3.0 from the shaded jar.
[INFO] Excluding javax.inject:javax.inject:jar:1 from the shaded jar.
[INFO] Excluding aopalliance:aopalliance:jar:1.0 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding com.sun.jersey.contribs:jersey-guice:jar:1.9 from the shaded 
jar.
[INFO] Excluding jline:jline:jar:2.11 from the shaded jar.
[INFO] Excluding org.apache.ant:ant:jar:1.9.2 from the shaded jar.
[INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.2 from the shaded jar.
[INFO] Excluding net.engio:mbassador:jar:1.1.9 from the shaded jar.
[INFO] Excluding net.lingala.zip4j:zip4j:jar:1.3.2 from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.10 from the shaded jar.
[INFO] Excluding org.apache.xbean:xbean-asm5-shaded:jar:4.3 from the shaded jar.
[INFO] Excluding org.jctools:jctools-core:jar:1.1 from the shaded jar.
[INFO] Excluding org.apache.beam:beam-model-pipeline:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding org.apache.beam:beam-sdks-java-core:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Including com.google.guava:guava:jar:20.0 in the shaded jar.
[INFO] Excluding com.google.guava:guava-testlib:jar:20.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1 
from the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-core:jar:2.8.9 from the 
shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 from 
the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 from the 
shaded jar.
[INFO] Excluding net.bytebuddy:byte-buddy:jar:1.7.10 from the shaded jar.
[INFO] Excluding org.apache.avro:avro:jar:1.8.2 from the shaded jar.
[INFO] Excluding com.thoughtworks.paranamer:paranamer:jar:2.7 from the shaded 
jar.
[INFO] Excluding org.tukaani:xz:jar:1.5 from the shaded jar.
[INFO] Excluding org.xerial.snappy:snappy-java:jar:1.1.4 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-compress:jar:1.16.1 from the shaded 
jar.
[INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding org.apache.beam:beam-model-job-management:jar:2.5.0-SNAPSHOT 
from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.apache.beam:beam-runners-core-java:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding org.apache.commons:commons-lang3:jar:3.6 from the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
[INFO] Excluding org.objenesis:objenesis:jar:1.0 from the shaded jar.
[INFO] Excluding com.google.auto.service:auto-service:jar:1.0-rc2 from the 
shaded jar.
[INFO] Excluding com.google.auto:auto-common:jar:0.3 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 

[INFO] 
[INFO] --- maven-surefire-plugin:2.21.0:test (validates-runner-tests

[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=96518&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96518
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 30/Apr/18 00:07
Start Date: 30/Apr/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on a change in pull request 
#5220: [BEAM-3983][SQL] Add BigQuery table provider
URL: https://github.com/apache/beam/pull/5220#discussion_r184900165
 
 

 ##
 File path: sdks/java/extensions/sql/build.gradle
 ##
 @@ -65,6 +65,7 @@ dependencies {
   shadow library.java.joda_time
   shadow project(path: ":beam-runners-direct-java", configuration: "shadow")
   provided project(path: ":beam-sdks-java-io-kafka", configuration: "shadow")
+  provided project(path: ":beam-sdks-java-io-google-cloud-platform", 
configuration: "shadow")
 
 Review comment:
   
   Commented on the pom, but since it is deprecated, consider that comment to 
apply to this line.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96518)
Time Spent: 11.5h  (was: 11h 20m)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> Java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=96521&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96521
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 30/Apr/18 00:07
Start Date: 30/Apr/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on a change in pull request 
#5220: [BEAM-3983][SQL] Add BigQuery table provider
URL: https://github.com/apache/beam/pull/5220#discussion_r184900166
 
 

 ##
 File path: sdks/java/extensions/sql/pom.xml
 ##
 @@ -396,6 +396,12 @@
   provided
 
 
+
+  org.apache.beam
+  beam-sdks-java-io-google-cloud-platform
+  provided
 
 Review comment:
   
   Noting that I think we are going to have to find a different way to manage 
the various providers. For now this follows the pattern set by Kafka so that's 
totally fine. But we should track how to reorg the modules so we don't have to 
bake in everything.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96521)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> Java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=96523&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96523
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 30/Apr/18 00:07
Start Date: 30/Apr/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on a change in pull request 
#5220: [BEAM-3983][SQL] Add BigQuery table provider
URL: https://github.com/apache/beam/pull/5220#discussion_r184900167
 
 

 ##
 File path: 
sdks/java/extensions/sql/src/test/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BigQueryTableProviderTest.java
 ##
 @@ -0,0 +1,75 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
+
+import static org.apache.beam.sdk.extensions.sql.RowSqlTypes.INTEGER;
+import static org.apache.beam.sdk.extensions.sql.RowSqlTypes.VARCHAR;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertTrue;
+
+import com.google.common.collect.ImmutableList;
+import org.apache.beam.sdk.extensions.sql.BeamSqlTable;
+import org.apache.beam.sdk.extensions.sql.meta.Column;
+import org.apache.beam.sdk.extensions.sql.meta.Table;
+import org.junit.Test;
+
+/**
+ * UnitTest for {@link BigQueryTableProvider}.
+ */
+public class BigQueryTableProviderTest {
+  private BigQueryTableProvider provider = new BigQueryTableProvider();
+
+  @Test
+  public void testGetTableType() throws Exception {
+assertEquals("bigquery", provider.getTableType());
+  }
+
+  @Test
+  public void testBuildBeamSqlTable() throws Exception {
+Table table = mockTable("hello");
+BeamSqlTable sqlTable = provider.buildBeamSqlTable(table);
+
+assertNotNull(sqlTable);
+assertTrue(sqlTable instanceof BeamBigQueryTable);
+
+BeamBigQueryTable bqTable = (BeamBigQueryTable) sqlTable;
+assertEquals("project:dataset.table", bqTable.getTableSpec());
+  }
+
+  private static Table mockTable(String name) {
 
 Review comment:
   
   Nit: Is `mock` the right term? It has a particular denotation at this point, 
of a weird magic object that doesn't go through real code paths. Since a table 
is basically a data struct, I'd just say it is "fake".


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96523)
Time Spent: 11h 40m  (was: 11.5h)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 11h 40m
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> Java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=96520&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96520
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 30/Apr/18 00:07
Start Date: 30/Apr/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on a change in pull request 
#5220: [BEAM-3983][SQL] Add BigQuery table provider
URL: https://github.com/apache/beam/pull/5220#discussion_r184900164
 
 

 ##
 File path: 
sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BeamBigQueryTable.java
 ##
 @@ -0,0 +1,71 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
+
+import java.io.Serializable;
+import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable;
+import org.apache.beam.sdk.extensions.sql.impl.schema.BeamIOType;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
+import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
+import org.apache.beam.sdk.schemas.Schema;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.POutput;
+import org.apache.beam.sdk.values.Row;
+
+/**
+ * {@code BeamBigQueryTable} represent a BigQuery table as source or target.
+ *
+ */
+public class BeamBigQueryTable extends BaseBeamTable implements Serializable {
 
 Review comment:
   
   Good to put `@Experimental` here, even though the whole SQL module is still 
experimental anyhow.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96520)
Time Spent: 11.5h  (was: 11h 20m)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> Java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=96519&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96519
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 30/Apr/18 00:07
Start Date: 30/Apr/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on a change in pull request 
#5220: [BEAM-3983][SQL] Add BigQuery table provider
URL: https://github.com/apache/beam/pull/5220#discussion_r184900163
 
 

 ##
 File path: 
sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/BeamBigQueryTable.java
 ##
 @@ -0,0 +1,71 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.extensions.sql.meta.provider.bigquery;
+
+import java.io.Serializable;
+import org.apache.beam.sdk.Pipeline;
+import org.apache.beam.sdk.extensions.sql.impl.schema.BaseBeamTable;
+import org.apache.beam.sdk.extensions.sql.impl.schema.BeamIOType;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
+import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
+import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
+import org.apache.beam.sdk.schemas.Schema;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.sdk.values.POutput;
+import org.apache.beam.sdk.values.Row;
+
+/**
+ * {@code BeamBigQueryTable} represent a BigQuery table as source or target.
 
 Review comment:
   
   Mention lack of support for read in the javadoc?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96519)
Time Spent: 11.5h  (was: 11h 20m)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> Java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-3983) BigQuery writes from pure SQL

2018-04-29 Thread ASF GitHub Bot (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3983?focusedWorklogId=96522&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-96522
 ]

ASF GitHub Bot logged work on BEAM-3983:


Author: ASF GitHub Bot
Created on: 30/Apr/18 00:07
Start Date: 30/Apr/18 00:07
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on a change in pull request 
#5220: [BEAM-3983][SQL] Add BigQuery table provider
URL: https://github.com/apache/beam/pull/5220#discussion_r184900168
 
 

 ##
 File path: 
sdks/java/extensions/sql/src/main/java/org/apache/beam/sdk/extensions/sql/meta/provider/bigquery/package-info.java
 ##
 @@ -0,0 +1,22 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/**
+ * table schema for BigQuery.
 
 Review comment:
   
   nit: capitalization


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 96522)
Time Spent: 11h 40m  (was: 11.5h)

> BigQuery writes from pure SQL
> -
>
> Key: BEAM-3983
> URL: https://issues.apache.org/jira/browse/BEAM-3983
> Project: Beam
>  Issue Type: New Feature
>  Components: dsl-sql
>Reporter: Andrew Pilloud
>Assignee: Andrew Pilloud
>Priority: Major
>  Time Spent: 11h 40m
>  Remaining Estimate: 0h
>
> It would be nice if you could write to BigQuery in SQL without writing any 
> Java code. For example:
> {code:java}
> INSERT INTO bigquery SELECT * FROM PCOLLECTION{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #1209

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 4.28 KB...]
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15)) (1.0)
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18.4)
Collecting xmltodict (from pywinrm->-r PerfKitBenchmarker/requirements.txt 
(line 25))
  Using cached 
https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Requirement already satisfied: cryptography>=1.3 in 
/usr/local/lib/python2.7/dist-packages (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.2.2)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/69/bc/230987c0dc22c763529330b2e669dbdba374d6a10c1f61232274184731be/ntlm_auth-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: certifi>=2017.4.17 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2018.4.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.22)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != 
"PyPy" in /usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.11.5)
Requirement already satisfied: enum34; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (0.24.0)
Requirement already satisfied: ipaddress; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.0.22)
Requirement already satisfied: pycparser in 
/usr/local/lib/python2.7/dist-packages (from cffi>=1.7; 
platform_python_implementation != 
"PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
Installing collected packages: absl-py, colorama, colorlog, blinker, futures, 
pint, numpy, contextlib2, ntlm-auth, requests-ntlm, xmltodict, pywinrm
Successfully installed absl-py-0.2.0 blinker-1.4 colorama-0.3.9 colorlog-2.6.0 
contextlib2-0.5.5 futures-3.2.0 ntlm-auth-1.1.0 numpy-1.13.3 pint-0.8.1 
pywinrm-0.3.0 requests-ntlm-1.1.0 xmltodict-0.11.0
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8861034256233226150.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
Requirement already satisfied: crcmod<2.0,>=1.7 in 
/usr/lib/python2.7/dist-packa

[jira] [Created] (BEAM-4186) Need to be able to set QuerySplitter in DatastoreIO.v1()

2018-04-29 Thread Frank Yellin (JIRA)
Frank Yellin created BEAM-4186:
--

 Summary: Need to be able to set QuerySplitter in DatastoreIO.v1()
 Key: BEAM-4186
 URL: https://issues.apache.org/jira/browse/BEAM-4186
 Project: Beam
  Issue Type: New Feature
  Components: io-java-gcp
Affects Versions: 2.4.0
Reporter: Frank Yellin
Assignee: Frank Yellin


I want to add a method

      withQuerySplitter(QuerySplitter querySplitter)

to DatastoreV1.Reader.  The implementation is fairly straightforward, except 
for enforcing the requirement that the query splitter be Serializable, since 
the transform must ship it to workers.
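
A minimal sketch of one way such a setter could enforce that requirement; 
ReadWithSplitter is an illustrative stand-in, not the actual DatastoreV1 code, 
and it assumes the com.google.datastore.v1.client.QuerySplitter interface and 
Guava's checkArgument:

{code:java}
import static com.google.common.base.Preconditions.checkArgument;

import com.google.datastore.v1.client.QuerySplitter;
import java.io.Serializable;

/** Illustrative stand-in for the proposed setter; not Beam's actual code. */
public final class ReadWithSplitter {

  private final QuerySplitter querySplitter;

  private ReadWithSplitter(QuerySplitter querySplitter) {
    this.querySplitter = querySplitter;
  }

  /**
   * Mirrors the proposed withQuerySplitter(QuerySplitter): reject splitters
   * that cannot be serialized, since they must be shipped to workers.
   */
  public static ReadWithSplitter withQuerySplitter(QuerySplitter querySplitter) {
    checkArgument(querySplitter instanceof Serializable,
        "querySplitter must implement Serializable");
    return new ReadWithSplitter(querySplitter);
  }

  public QuerySplitter getQuerySplitter() {
    return querySplitter;
  }
}
{code}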

 

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1475

2018-04-29 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam23 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6a5e5d96402b676d18dfcb771d3e533a51063690 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6a5e5d96402b676d18dfcb771d3e533a51063690
Commit message: "Merge pull request #5238 from matzew/JMS_MESSAGE_ID_NULL"
 > git rev-list --no-walk 6a5e5d96402b676d18dfcb771d3e533a51063690 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe 
/tmp/jenkins5817054869909241407.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in /usr/lib/python2.7/dist-packages 
(15.0.1)

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
sdks/python/run_validatesrunner.sh: line 38: 
/home/jenkins/.local/bin//virtualenv: No such file or directory
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user szewi...@gmail.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aal...@gmail.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user katarzyna.kucharc...@polidea.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user 
re...@relax-macbookpro2.roam.corp.google.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user ro...@frantil.com


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #227

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 17.68 MB...]
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-handler-proxy/4.1.8.Final/netty-handler-proxy-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/netty-handler-proxy-4.1.8.Final-Zey48Fj4mlWtgpwIc7Osgg.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-socks/4.1.8.Final/netty-codec-socks-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/netty-codec-socks-4.1.8.Final-dfbCpCpgkEl1iwscd6OVeQ.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/3.2.0/protobuf-java-3.2.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/protobuf-java-3.2.0-fh30Gescj5k_IhxJOGxriw.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/beam-sdks-java-core-2.5.0-SNAPSHOT-shaded-tests-7Cb8MsYjqLHr5IR3aIcYvw.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/errorprone/error_prone_annotations/2.0.15/error_prone_annotations-2.0.15.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/error_prone_annotations-2.0.15-npnuyJrjs2dF75GdvUBIvQ.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/jackson-mapper-asl-1.9.13-F1D5wzk1L8S3KNYbVxcWEw.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.0.1/httpclient-4.0.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/httpclient-4.0.1-nKmHdIYBAcBsqQEO_WIkoQ.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/guava/guava/20.0/guava-20.0.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/guava-20.0-8yqKJSRiDb7Mn2v2ogwpPw.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-buffer/4.1.8.Final/netty-buffer-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/netty-buffer-4.1.8.Final-iwny8IFMu0G1ln10Yx5VrA.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/threeten/threetenbp/1.3.3/threetenbp-1.3.3.jar 
to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/threetenbp-1.3.3-bEXFSgaAYiXSdUtR-98IjQ.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec/4.1.8.Final/netty-codec-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/netty-codec-4.1.8.Final-Og5DJ4MB-fn1oSzjKqVU_w.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/snappy-java-1.1.4-SFNwbMuGq13aaoKVzeS1Tw.jar
Apr 29, 2018 6:43:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429184330-b22b0add/output/results/staging/beam-sdks-java-core-2.5.0-SNAPSHOT-shaded-xRq9zuECFaRZa7hBrPujCQ.j

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #109

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 267.87 KB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWith

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #115

2018-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #110

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 879.98 KB...]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.225.151.117:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.225.151.117:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
at 
com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
at 
org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 3 ms while waiting for a 
server that matches WritableServerSelector. Client view of cluster state is 
{type=UNKNOWN, servers=[{address=35.225.151.117:27017, type=UNKNOWN, 
state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception 
opening socket}, caused by {java.net.SocketTimeoutException: connect timed 
out}}]
at 
com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
at 
com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
at 
com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
at 
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
at 
com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
at com.mongodb.Mongo.execute(Mongo.java:781)
at com.mongodb.Mongo$2.execute(Mongo.java:764)
at 
com.m

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #108

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 1.73 KB...]
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2121491349176826250.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins898108221814044805.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1525014063279
namespace "filebasedioithdfs-1525014063279" created
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8453750871660266923.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1525014063279
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins4485659180114100523.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3984296995988792579.sh
+ rm -rf .env
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5855212881490517145.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins78580811818217492.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins1622943831971599378.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6505721521948095221.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-pac

Build failed in Jenkins: beam_PerformanceTests_JDBC #512

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 890.13 KB...]
[INFO] Excluding log4j:log4j:jar:1.2.17 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-annotations:jar:2.6.0 from the shaded 
jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-api:jar:2.6.0 from the shaded 
jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-common:jar:2.6.0 from the shaded 
jar.
[INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.12 from the shaded jar.
[INFO] Excluding com.google.inject.extensions:guice-servlet:jar:3.0 from the 
shaded jar.
[INFO] Excluding com.google.inject:guice:jar:3.0 from the shaded jar.
[INFO] Excluding javax.inject:javax.inject:jar:1 from the shaded jar.
[INFO] Excluding aopalliance:aopalliance:jar:1.0 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding com.sun.jersey.contribs:jersey-guice:jar:1.9 from the shaded 
jar.
[INFO] Excluding jline:jline:jar:2.11 from the shaded jar.
[INFO] Excluding org.apache.ant:ant:jar:1.9.2 from the shaded jar.
[INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.2 from the shaded jar.
[INFO] Excluding net.engio:mbassador:jar:1.1.9 from the shaded jar.
[INFO] Excluding net.lingala.zip4j:zip4j:jar:1.3.2 from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.10 from the shaded jar.
[INFO] Excluding org.apache.xbean:xbean-asm5-shaded:jar:4.3 from the shaded jar.
[INFO] Excluding org.jctools:jctools-core:jar:1.1 from the shaded jar.
[INFO] Excluding org.apache.beam:beam-model-pipeline:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded 
jar.
[INFO] Excluding org.apache.beam:beam-sdks-java-core:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Including com.google.guava:guava:jar:20.0 in the shaded jar.
[INFO] Excluding com.google.guava:guava-testlib:jar:20.0 from the shaded jar.
[INFO] Excluding com.google.errorprone:error_prone_annotations:jar:2.0.15 from 
the shaded jar.
[INFO] Excluding com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1 
from the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-core:jar:2.8.9 from the 
shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-annotations:jar:2.8.9 from 
the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-databind:jar:2.8.9 from the 
shaded jar.
[INFO] Excluding net.bytebuddy:byte-buddy:jar:1.7.10 from the shaded jar.
[INFO] Excluding org.apache.avro:avro:jar:1.8.2 from the shaded jar.
[INFO] Excluding com.thoughtworks.paranamer:paranamer:jar:2.7 from the shaded 
jar.
[INFO] Excluding org.tukaani:xz:jar:1.5 from the shaded jar.
[INFO] Excluding org.xerial.snappy:snappy-java:jar:1.1.4 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-compress:jar:1.16.1 from the shaded 
jar.
[INFO] Excluding 
org.apache.beam:beam-runners-core-construction-java:jar:2.5.0-SNAPSHOT from the 
shaded jar.
[INFO] Excluding org.apache.beam:beam-model-job-management:jar:2.5.0-SNAPSHOT 
from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the 
shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.7 from the shaded jar.
[INFO] Excluding io.grpc:grpc-core:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-context:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.instrumentation:instrumentation-api:jar:0.3.0 from 
the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding org.apache.beam:beam-runners-core-java:jar:2.5.0-SNAPSHOT from 
the shaded jar.
[INFO] Excluding org.apache.commons:commons-lang3:jar:3.6 from the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:3.0.1 from the shaded jar.
[INFO] Excluding org.objenesis:objenesis:jar:1.0 from the shaded jar.
[INFO] Excluding com.google.auto.service:auto-service:jar:1.0-rc2 from the 
shaded jar.
[INFO] Excluding com.google.auto:auto-common:jar:0.3 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing 

 with 

[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing 

 with 


Build failed in Jenkins: beam_PerformanceTests_Python #1208

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 4.28 KB...]
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-packages 
(from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe>=0.23 in 
/usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r 
PerfKitBenchmarker/requirements.txt (line 15)) (1.0)
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.9.1 in 
/usr/local/lib/python2.7/dist-packages (from pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18.4)
Collecting xmltodict (from pywinrm->-r PerfKitBenchmarker/requirements.txt 
(line 25))
  Using cached 
https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Requirement already satisfied: cryptography>=1.3 in 
/usr/local/lib/python2.7/dist-packages (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.2.2)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/packages/69/bc/230987c0dc22c763529330b2e669dbdba374d6a10c1f61232274184731be/ntlm_auth-1.1.0-py2.py3-none-any.whl
Requirement already satisfied: certifi>=2017.4.17 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2018.4.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: idna<2.7,>=2.5 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.6)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in 
/usr/local/lib/python2.7/dist-packages (from requests>=2.9.1->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.22)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != 
"PyPy" in /usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.11.5)
Requirement already satisfied: enum34; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (0.24.0)
Requirement already satisfied: ipaddress; python_version < "3" in 
/usr/local/lib/python2.7/dist-packages (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (1.0.22)
Requirement already satisfied: pycparser in 
/usr/local/lib/python2.7/dist-packages (from cffi>=1.7; 
platform_python_implementation != 
"PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
Installing collected packages: absl-py, colorama, colorlog, blinker, futures, 
pint, numpy, contextlib2, ntlm-auth, requests-ntlm, xmltodict, pywinrm
Successfully installed absl-py-0.2.0 blinker-1.4 colorama-0.3.9 colorlog-2.6.0 
contextlib2-0.5.5 futures-3.2.0 ntlm-auth-1.1.0 numpy-1.13.3 pint-0.8.1 
pywinrm-0.3.0 requests-ntlm-1.1.0 xmltodict-0.11.0
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7138293126766063020.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining 
file://
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.5.0.dev0)
Requirement already satisfied: crcmod<2.0,>=1.7 in 
/usr/lib/python2.7/dist-packa

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1474

2018-04-29 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam23 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6a5e5d96402b676d18dfcb771d3e533a51063690 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6a5e5d96402b676d18dfcb771d3e533a51063690
Commit message: "Merge pull request #5238 from matzew/JMS_MESSAGE_ID_NULL"
 > git rev-list --no-walk 6a5e5d96402b676d18dfcb771d3e533a51063690 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe 
/tmp/jenkins5511387465887998691.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in /usr/lib/python2.7/dist-packages 
(15.0.1)

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
sdks/python/run_validatesrunner.sh: line 38: 
/home/jenkins/.local/bin//virtualenv: No such file or directory
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user szewi...@gmail.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aal...@gmail.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user katarzyna.kucharc...@polidea.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user 
re...@relax-macbookpro2.roam.corp.google.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user ro...@frantil.com


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #226

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 18.19 MB...]
INFO: Uploading 
/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/jackson-core-asl-1.9.13-MZxJpDBOP6n-PNjc_ACdNw.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-context/1.9.0/28b0836f48c9705abf73829bbc536dba29a1329a/grpc-context-1.9.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/grpc-context-1.9.0-atYmq4r_lax8i7-0BRHcCg.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/api/grpc/proto-google-common-protos/0.1.9/proto-google-common-protos-0.1.9.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/proto-google-common-protos-0.1.9-_83eXiiP6E2LZYIU5feerg.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/beam-runners-direct-java-2.5.0-SNAPSHOT-shaded-YXvVrWtdhVaSzyY7l5sisQ.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/3.2.0/protobuf-java-3.2.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/protobuf-java-3.2.0-fh30Gescj5k_IhxJOGxriw.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.0.1/httpclient-4.0.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/httpclient-4.0.1-nKmHdIYBAcBsqQEO_WIkoQ.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-http/4.1.8.Final/netty-codec-http-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/netty-codec-http-4.1.8.Final-uBsfxiZeL3IeVQNS0QTHUw.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/json/json/20160810/json-20160810.jar to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/json-20160810-L3-Jnwdm5lAXdEpMT8FNRg.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/threeten/threetenbp/1.3.3/threetenbp-1.3.3.jar 
to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/threetenbp-1.3.3-bEXFSgaAYiXSdUtR-98IjQ.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/commons-logging-1.2-BAtLTY6siG9rSio70vMbAA.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/org/mockito/mockito-core/1.9.5/mockito-core-1.9.5.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/mockito-core-1.9.5-b3PPBKVutgqqmWUG58EPxw.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-codec-socks/4.1.8.Final/netty-codec-socks-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/netty-codec-socks-4.1.8.Final-dfbCpCpgkEl1iwscd6OVeQ.jar
Apr 29, 2018 12:44:11 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.m2/repository/io/netty/netty-resolver/4.1.8.Final/netty-resolver-4.1.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0429124410-db24cdba/output/results/staging/netty-resolver-4.1.8.Final-ZBELwscrxRZgRR5DKH9siA.jar
Apr 29, 2018 12:4

Jenkins build is back to normal : beam_PerformanceTests_MongoDBIO_IT #109

2018-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #114

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 1.09 MB...]
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at 
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at 
org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at 
org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at 
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy62.create(Unknown Source)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy63.create(Unknown Source)
at 
org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at 
org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at 
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
at 
org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
at 
org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
at 
org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$Do

Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT_HDFS #108

2018-04-29 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #108

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 1.75 KB...]
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8555839134771590129.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins1768398758350209653.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1524996062154
namespace "filebasedioithdfs-1524996062154" created
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2584868531233016658.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1524996062154
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2224315864887819191.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins269136794490653775.sh
+ rm -rf .env
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins2971675580310920657.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins5373292989735824750.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8430275230603496780.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Compressed_TextIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7165406337866073352.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 
19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt 
(line 24))
  Using cached 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached 
https://files.pythonhosted.org/

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #107

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 1.73 KB...]
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: 
[--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3762720969526659046.sh
+ cp /home/jenkins/.kube/config 

[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7306559291242286327.sh
+ kubectl 
--kubeconfig=
 create namespace filebasedioithdfs-1524996062174
namespace "filebasedioithdfs-1524996062174" created
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3264190293428071878.sh
++ kubectl config current-context
+ kubectl 
--kubeconfig=
 config set-context gke_apache-beam-testing_us-central1-a_io-datastores 
--namespace=filebasedioithdfs-1524996062174
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins6167082821413831830.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3777159768630853690.sh
+ rm -rf .env
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins7520245020037712096.sh
+ virtualenv .env --system-site-packages
New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins436474039433398813.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./.env/lib/python2.7/site-packages (39.1.0)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages 
(10.0.1)
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins8907645129020143164.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe 
/tmp/jenkins3936095808990327229.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Collecting absl-py (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 15)) (2.10)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.1.0)
Collecting colorlog[windows]==2.6.0 (from -r PerfKitBenchmarker/requirements.txt (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r PerfKitBenchmarker/requirements.txt (line 18))
Collecting futures>=3.0.3 (from -r PerfKitBenchmarker/requirements.txt (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement already satisfied: PyYAML==3.12 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Collecting pint>=0.7 (from -r PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Collecting contextlib2>=0.5.1 (from -r PerfKitBenchmarker/requirements.txt (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r PerfKitBenchmarker/requirements.txt (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Requirement already satisfied: six in /usr/local/lib/python2.7/dist-p

Build failed in Jenkins: beam_PerformanceTests_JDBC #511

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 920.42 KB...]
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.23.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.23.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.23.0 from the shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.3 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.23.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.23.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-clouddebugger:jar:v2-rev233-1.23.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-storage:jar:v1-rev124-1.23.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.23.0 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.23.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.23.0 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing  with 
[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing  with 
[INFO] Dependency-reduced POM written at: 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:integration-test (default) @ beam-sdks-java-io-jdbc ---
[INFO] Failsafe report directory: 
[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.beam.sdk.io.jdbc.JdbcIOIT
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0.001 s <<< FAILURE! - in org.apache.beam.sdk.io.

Build failed in Jenkins: beam_PerformanceTests_Python #1207

2018-04-29 Thread Apache Jenkins Server
See 


--
[...truncated 72.94 KB...]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) @ beam-sdks-go-container ---
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:unpack (copy-dependency) @ beam-sdks-go-container ---
[INFO] Configured Artifact: org.apache.beam:beam-sdks-go:pkg-sources:2.5.0-SNAPSHOT:zip
[INFO] Unpacking  to  with includes "" and excludes ""
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:copy-resources (copy-go-pkg-source) @ beam-sdks-go-container ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
[INFO] 
[INFO] --- maven-resources-plugin:3.0.2:copy-resources (copy-go-cmd-source) @ beam-sdks-go-container ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ beam-sdks-go-container ---
[INFO] 
[INFO] --- mvn-golang-wrapper:2.1.7:get (go-get-imports) @ beam-sdks-go-container ---
[INFO] Prepared command line : bin/go get -u google.golang.org/grpc golang.org/x/oauth2/google google.golang.org/api/storage/v1
[ERROR] 
[ERROR] -Exec.Err-
[ERROR] # cd /home/jenkins/.mvnGoLang/.go_path/src/github.com/golang/protobuf; git pull --ff-only
[ERROR] Your configuration specifies to merge with the ref 'refs/heads/master'
[ERROR] from the remote, but no such ref was fetched.
[ERROR] package github.com/golang/protobuf/proto: exit status 1
[ERROR] 
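The go-get-imports failure above comes from mvn-golang's shared GOPATH cache: the cached checkout of github.com/golang/protobuf under /home/jenkins/.mvnGoLang is configured to merge 'refs/heads/master', but no such ref was fetched from the remote, so "git pull --ff-only" aborts and "go get -u" fails. A recovery sketch (the cache path is taken from the error message above; wiping it forces a fresh clone on the next run):

#!/bin/bash -xe
# Sketch: clear the stale cached checkout so `go get -u` re-clones it.
CACHED_PKG=/home/jenkins/.mvnGoLang/.go_path/src/github.com/golang/protobuf
rm -rf "$CACHED_PKG"
# If other cached packages are in the same state, the whole cached
# GOPATH can be removed instead (heavier, but self-healing):
# rm -rf /home/jenkins/.mvnGoLang/.go_path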
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Beam :: Parent .. SUCCESS [  2.472 s]
[INFO] Apache Beam :: SDKs :: Java :: Build Tools . SUCCESS [  1.962 s]
[INFO] Apache Beam :: Model ... SUCCESS [  0.029 s]
[INFO] Apache Beam :: Model :: Pipeline ... SUCCESS [  7.329 s]
[INFO] Apache Beam :: Model :: Job Management . SUCCESS [  3.223 s]
[INFO] Apache Beam :: Model :: Fn Execution ... SUCCESS [  3.623 s]
[INFO] Apache Beam :: SDKs  SUCCESS [  0.105 s]
[INFO] Apache Beam :: SDKs :: Go .. SUCCESS [ 23.420 s]
[INFO] Apache Beam :: SDKs :: Go :: Container . FAILURE [ 14.522 s]
[INFO] Apache Beam :: SDKs :: Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Core  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Fn Execution  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core SKIPPED
[INFO] Apache Beam :: Runners . SKIPPED
[INFO] Apache Beam :: Runners :: Core Construction Java ... SKIPPED
[INFO] Apache Beam :: Runners :: Core Java  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Harness . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Container ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Amazon Web Services SKIPPED
[INFO] Apache Beam :: Runners :: Local Java Core .. SKIPPED
[INFO] Apache Beam :: Runners :: Direct Java .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: AMQP .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Common  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Cassandra . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests :: Common SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests :: 2.x SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch-Tests :: 5.x SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: XML ... SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: Extensions :: Protobuf SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Google Cloud Platform SKIPPED
[INFO] Apache Beam :: Runners :: Google Cloud Dataflow  SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: File-based-io-tests SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop Common . SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop File System SKIPPED
[INFO] Apache Beam :: SDKs :: Java :: IO :: JDBC .. SKIPPED
[INFO] Apache Beam :: SDKs :: Java

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1473

2018-04-29 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam23 (beam) in workspace 
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6a5e5d96402b676d18dfcb771d3e533a51063690 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6a5e5d96402b676d18dfcb771d3e533a51063690
Commit message: "Merge pull request #5238 from matzew/JMS_MESSAGE_ID_NULL"
 > git rev-list --no-walk 6a5e5d96402b676d18dfcb771d3e533a51063690 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_ValidatesRunner_Dataflow] $ /bin/bash -xe /tmp/jenkins1865422694736072878.sh
+ cd src
+ bash sdks/python/run_validatesrunner.sh

# pip install --user installation location.
LOCAL_PATH=$HOME/.local/bin/

# INFRA does not install virtualenv
pip install virtualenv --user
Requirement already satisfied: virtualenv in /usr/lib/python2.7/dist-packages (15.0.1)

# Virtualenv for the rest of the script to run setup & e2e tests
${LOCAL_PATH}/virtualenv sdks/python
sdks/python/run_validatesrunner.sh: line 38: /home/jenkins/.local/bin//virtualenv: No such file or directory
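The root cause is visible in the script excerpt above: virtualenv is already satisfied system-wide in /usr/lib/python2.7/dist-packages, so "pip install --user" installs nothing into $HOME/.local/bin, and the hardcoded ${LOCAL_PATH}/virtualenv lookup then points at a binary that does not exist. A defensive sketch of the lookup (an assumed fix, not the patch actually applied to run_validatesrunner.sh):

# Prefer whatever virtualenv is already on PATH; fall back to a
# --user install only when none is found.
if command -v virtualenv >/dev/null 2>&1; then
  VIRTUALENV="$(command -v virtualenv)"
else
  pip install --user virtualenv
  VIRTUALENV="$HOME/.local/bin/virtualenv"
fi
"$VIRTUALENV" sdks/python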
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user sweg...@google.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user szewi...@gmail.com
Not sending mail to unregistered user git...@alasdairhodge.co.uk
Not sending mail to unregistered user ke...@google.com
Not sending mail to unregistered user aal...@gmail.com
Not sending mail to unregistered user ekirpic...@gmail.com
Not sending mail to unregistered user sid...@google.com
Not sending mail to unregistered user katarzyna.kucharc...@polidea.com
Not sending mail to unregistered user apill...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user re...@relax-macbookpro2.roam.corp.google.com
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user ro...@frantil.com