See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/189/display/redirect?page=changes>
Changes:
[sidhom] Add in-process ManagedChannelFactory and update tests to use it
[sidhom] Add common pipeline options for portable runners
[sidhom] Set artifact names explicitly while staging to service
[sidhom] Add CloseableResource to wrap non-closeable resources that must be
[sidhom] [BEAM-4071] Add Portable Runner Job API shim
------------------------------------------
[...truncated 20.11 MB...]
Apr 25, 2018 10:55:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/io/netty/netty-common/4.1.8.Final/netty-common-4.1.8.Final.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/netty-common-4.1.8.Final-lafOW8vGsI6SRwZSmXNClg.jar
Apr 25, 2018 10:55:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/com/google/auto/value/auto-value/1.4/auto-value-1.4.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/auto-value-1.4-_7tYdqNvCowycFIDpBenFw.jar
Apr 25, 2018 10:55:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.5.0-SNAPSHOT-shaded-tests.jar> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/beam-sdks-java-io-google-cloud-platform-2.5.0-SNAPSHOT-shaded-tests-0H_bINqh474db9AuqdxujA.jar
Apr 25, 2018 10:55:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/io/netty/netty-handler-proxy/4.1.8.Final/netty-handler-proxy-4.1.8.Final.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/netty-handler-proxy-4.1.8.Final-Zey48Fj4mlWtgpwIc7Osgg.jar
Apr 25, 2018 10:55:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.m2/repository/com/squareup/okhttp/okhttp/2.5.0/okhttp-2.5.0.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/okhttp-2.5.0-64v0X4G_nxfR_PsuymOqpg.jar
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 0 files cached, 127 files newly uploaded
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding GenerateSequence/Read(BoundedCountingSource) as step s1
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(CreateEntity) as step s2
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DatastoreV1.Write/Convert to Mutation/Map as step s3
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DatastoreV1.Write/Write Mutation to Datastore as step s4
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/
Apr 25, 2018 10:55:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <8774 bytes, hash c-CZlcnyOzIczDEKFLuWYQ> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-0425225513-56bd4e2e/output/results/staging/pipeline-c-CZlcnyOzIczDEKFLuWYQ.pb
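An aside on the staged file names above: each uploaded jar gets a 22-character suffix (e.g. `-lafOW8vGsI6SRwZSmXNClg`). That length is exactly what an MD5 digest yields when encoded as unpadded URL-safe base64, which suggests Beam's `PackageUtil` derives the staged name from a content hash so re-uploads of identical files can be cached. A minimal sketch of such a naming scheme, assuming (not confirmed from `PackageUtil`'s source) that the hash is MD5 over the file contents; `staged_name` is a hypothetical helper:

```python
import base64
import hashlib
import os


def staged_name(path: str) -> str:
    """Hypothetical reconstruction of a content-addressed staged name.

    Hash the file contents with MD5 and append the unpadded URL-safe
    base64 digest (16 bytes -> 22 characters) before the extension.
    """
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).digest()
    token = base64.urlsafe_b64encode(digest).decode("ascii").rstrip("=")
    stem, ext = os.path.splitext(os.path.basename(path))
    return f"{stem}-{token}{ext}"
```

A 16-byte digest always produces a 22-character token under this encoding, matching the suffix lengths seen in the log.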
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_OUT
Dataflow SDK version: 2.5.0-SNAPSHOT
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 25, 2018 10:55:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-25_15_55_21-10465330941772867377?project=apache-beam-testing
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_OUT
Submitted job: 2018-04-25_15_55_21-10465330941772867377
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 25, 2018 10:55:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-04-25_15_55_21-10465330941772867377
Apr 25, 2018 10:55:22 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2018-04-25_15_55_21-10465330941772867377 with 0 expected assertions.
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > testE2EV1ReadWithGQLQueryWithNoLimit STANDARD_ERROR
Apr 25, 2018 10:55:22 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-25_15_51_45-16629068779300950722 finished with status DONE.
Apr 25, 2018 10:55:22 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
INFO: Success result for Dataflow job 2018-04-25_15_51_45-16629068779300950722. Found 1 success, 0 failures out of 1 expected assertions.
Apr 25, 2018 10:55:23 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 25, 2018 10:55:23 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 25, 2018 10:55:23 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 25, 2018 10:55:24 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 25, 2018 10:55:24 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil deleteAllEntities
INFO: Successfully deleted 1000 entities
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > testSplitQueryFnWithLargeDataset STANDARD_ERROR
Apr 25, 2018 10:55:24 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
Apr 25, 2018 10:55:24 PM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
INFO: Latest stats timestamp for kind sort_1G is 1524555829000000
Apr 25, 2018 10:55:24 PM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
INFO: Estimated size bytes for the query is: 2130000000
Apr 25, 2018 10:55:24 PM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn processElement
INFO: Splitting the query into 32 splits
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > testSplitQueryFnWithSmallDataset STANDARD_ERROR
Apr 25, 2018 10:55:25 PM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
Apr 25, 2018 10:55:25 PM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
INFO: Latest stats timestamp for kind shakespeare is 1524555829000000
Apr 25, 2018 10:55:25 PM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
INFO: Estimated size bytes for the query is: 26383451
Apr 25, 2018 10:55:25 PM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn processElement
INFO: Splitting the query into 12 splits
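The two split counts above (2130000000 bytes -> 32 splits, 26383451 bytes -> 12 splits) are consistent with dividing the estimated query size by a 64 MB bundle and clamping to a minimum of 12 and maximum of 50000 splits. A sketch of that calculation, with the constants assumed from the Beam `DatastoreV1` source rather than stated in this log:

```python
# Assumed constants (Beam's DatastoreV1.Read defaults, not shown in the log):
DEFAULT_BUNDLE_SIZE_BYTES = 64 * 1024 * 1024  # 64 MB per split
NUM_QUERY_SPLITS_MIN = 12
NUM_QUERY_SPLITS_MAX = 50000


def estimated_num_splits(estimated_size_bytes: int) -> int:
    """Derive a split count from an estimated query size in bytes."""
    splits = round(estimated_size_bytes / DEFAULT_BUNDLE_SIZE_BYTES)
    return max(NUM_QUERY_SPLITS_MIN, min(NUM_QUERY_SPLITS_MAX, splits))


print(estimated_num_splits(2130000000))  # 32, the sort_1G count above
print(estimated_num_splits(26383451))    # 12, the shakespeare count above
```

The small dataset rounds to zero 64 MB bundles, so the floor of 12 splits kicks in, which is why both tiny and mid-sized kinds never split below 12.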
Gradle Test Executor 122 finished executing tests.
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
java.lang.NoClassDefFoundError: com/google/api/gax/retrying/ExceptionRetryAlgorithm
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
at org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.setUp(SpannerWriteIT.java:91)
Caused by: java.lang.ClassNotFoundException: com.google.api.gax.retrying.ExceptionRetryAlgorithm
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
java.lang.NullPointerException
at org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.tearDown(SpannerWriteIT.java:148)
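The `NoClassDefFoundError` for `com.google.api.gax.retrying.ExceptionRetryAlgorithm` in `setUp` (and the follow-on `NullPointerException` in `tearDown`, since setup never completed) is the typical signature of a gax-java version conflict on the test runtime classpath: `google-cloud-java` was compiled against a gax release containing that class, but a different gax version won dependency resolution. A sketch of how one might confirm this from a Beam source checkout; the configuration name `testRuntimeClasspath` is an assumption about this build's Gradle layout, not taken from the log:

```shell
# Hypothetical diagnosis: list which gax versions resolve onto the
# integration-test classpath of the failing module.
./gradlew :beam-runners-google-cloud-dataflow-java:dependencies \
    --configuration testRuntimeClasspath | grep -i gax

# Then ask Gradle which declarations pulled in the winning version.
./gradlew :beam-runners-google-cloud-dataflow-java:dependencyInsight \
    --configuration testRuntimeClasspath \
    --dependency com.google.api:gax
```

`dependencyInsight` reports each requested version and the resolution that replaced it, which usually points straight at the dependency that needs pinning or upgrading.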
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:21.282Z: Autoscaling is enabled for job 2018-04-25_15_55_21-10465330941772867377. The number of workers will be between 1 and 1000.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:21.306Z: Autoscaling was automatically enabled for job 2018-04-25_15_55_21-10465330941772867377.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:23.825Z: Checking required Cloud APIs are enabled.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:23.989Z: Checking permissions granted to controller Service Account.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.381Z: Worker configuration: n1-standard-1 in us-central1-c.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.723Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.752Z: Expanding GroupByKey operations into optimizable parts.
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.789Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.847Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.874Z: Fusing consumer ParDo(CreateEntity) into GenerateSequence/Read(BoundedCountingSource)
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.899Z: Fusing consumer DatastoreV1.Write/Write Mutation to Datastore into DatastoreV1.Write/Convert to Mutation/Map
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:28.923Z: Fusing consumer DatastoreV1.Write/Convert to Mutation/Map into ParDo(CreateEntity)
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:29.249Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(CreateEntity)+DatastoreV1.Write/Convert to Mutation/Map+DatastoreV1.Write/Write Mutation to Datastore
Apr 25, 2018 10:55:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:29.323Z: Starting 1 workers in us-central1-c...
Apr 25, 2018 10:55:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:36.630Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
Apr 25, 2018 10:55:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:55:47.078Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
Apr 25, 2018 10:56:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:56:21.441Z: Workers have started successfully.
Apr 25, 2018 10:56:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:56:57.677Z: Cleaning up.
Apr 25, 2018 10:56:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:56:57.741Z: Stopping worker pool...
Apr 25, 2018 10:58:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:58:32.031Z: Autoscaling: Resized worker pool from 1 to 0.
Apr 25, 2018 10:58:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-04-25T22:58:32.055Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Apr 25, 2018 10:58:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-25_15_55_21-10465330941772867377 finished with status DONE.
Apr 25, 2018 10:58:40 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
INFO: Success result for Dataflow job 2018-04-25_15_55_21-10465330941772867377. Found 0 success, 0 failures out of 0 expected assertions.
Apr 25, 2018 10:58:41 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 25, 2018 10:58:41 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 25, 2018 10:58:42 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Apr 25, 2018 10:58:42 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Apr 25, 2018 10:58:42 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil deleteAllEntities
INFO: Successfully deleted 1000 entities
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
java.lang.NoClassDefFoundError: com/google/api/gax/retrying/ExceptionRetryAlgorithm
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:90)
Caused by: java.lang.ClassNotFoundException: com.google.api.gax.retrying.ExceptionRetryAlgorithm
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
java.lang.NullPointerException
at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:198)
Gradle Test Executor 131 finished executing tests.
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
java.lang.NoClassDefFoundError: Could not initialize class com.google.cloud.spanner.SpannerImpl
at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:90)
java.lang.NullPointerException
at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:198)
12 tests completed, 3 failed
Finished generating test XML results (0.008 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.007 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest FAILED
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 20 mins 27.714 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.6/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 48m 21s
545 actionable tasks: 542 executed, 3 from cache
Publishing build scan...
https://gradle.com/s/j2hiyu7dsra3g
Build cache (/home/jenkins/.gradle/caches/build-cache-1) has not been cleaned up in 0 days
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]