See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/242/display/redirect>
------------------------------------------
[...truncated 19.54 MB...]
May 01, 2018 12:49:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:41.985Z: Executing operation PAssert$3/CreateActual/GatherPanes/GroupByKey/Create
May 01, 2018 12:49:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:42.021Z: Executing operation PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
May 01, 2018 12:49:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:42.054Z: Starting 1 workers in us-central1-f...
May 01, 2018 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:42.093Z: Executing operation PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
May 01, 2018 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:42.141Z: Executing operation DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Create
May 01, 2018 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:42.688Z: Executing operation DatastoreV1.Read/Create.OfValueProvider/Create.Values/Read(CreateSource)+DatastoreV1.Read/ParDo(GqlQueryTranslate)+DatastoreV1.Read/Split+DatastoreV1.Read/Reshuffle/Pair with random key+DatastoreV1.Read/Reshuffle/Reshuffle/Window.Into()/Window.Assign+DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Reify+DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Write
May 01, 2018 12:49:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:49:49.072Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
May 01, 2018 12:50:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:50:04.657Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
May 01, 2018 12:50:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:50:04.706Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
May 01, 2018 12:50:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:50:11.473Z: Autoscaling: Resized worker pool from 1 to 0.
May 01, 2018 12:50:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:50:11.513Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
May 01, 2018 12:50:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-30_17_47_11-3862323665452178396 finished with status DONE.
May 01, 2018 12:50:21 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
INFO: Success result for Dataflow job 2018-04-30_17_47_11-3862323665452178396. Found 0 success, 0 failures out of 0 expected assertions.
May 01, 2018 12:50:22 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
May 01, 2018 12:50:22 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
May 01, 2018 12:50:23 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
May 01, 2018 12:50:23 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
May 01, 2018 12:50:23 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil deleteAllEntities
INFO: Successfully deleted 1000 entities
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
java.lang.NoClassDefFoundError: com/google/api/gax/retrying/ExceptionRetryAlgorithm
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
    at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
    at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:93)
Caused by:
java.lang.ClassNotFoundException: com.google.api.gax.retrying.ExceptionRetryAlgorithm
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more
java.lang.NullPointerException
    at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:217)
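For context on this failure (and the similar SpannerReadIT/SpannerWriteIT failures below): setUp dies while building the Spanner client, because initializing com.google.cloud.BaseService tries to load com.google.api.gax.retrying.ExceptionRetryAlgorithm, which is missing from the gax jar on the test runtime classpath; the tearDown NullPointerException is just fallout from setUp never completing. This pattern usually indicates mismatched google-cloud-java/gax dependency versions rather than a test bug. A minimal sketch of the client-creation path that is failing, assuming the standard google-cloud-spanner API (project, instance, and database IDs are placeholders):

    import com.google.cloud.spanner.DatabaseClient;
    import com.google.cloud.spanner.DatabaseId;
    import com.google.cloud.spanner.Spanner;
    import com.google.cloud.spanner.SpannerOptions;

    public class SpannerClientSketch {
      public static void main(String[] args) {
        // getService() is the call visible in the stack trace above; it initializes
        // com.google.cloud.BaseService, which is where the missing gax retry class
        // gets loaded and the NoClassDefFoundError surfaces.
        SpannerOptions options = SpannerOptions.newBuilder().setProjectId("my-project").build();
        Spanner spanner = options.getService();
        DatabaseClient client =
            spanner.getDatabaseClient(DatabaseId.of("my-project", "my-instance", "my-database"));
        // ... use the client, then release resources.
        spanner.close();
      }
    }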
Gradle Test Executor 129 finished executing tests.
> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
java.lang.NoClassDefFoundError: Could not initialize class com.google.cloud.spanner.SpannerImpl
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
    at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
    at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:93)
java.lang.NullPointerException
    at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:217)
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb FAILED
java.lang.NoClassDefFoundError: Could not initialize class com.google.cloud.spanner.SpannerImpl
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
    at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
    at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:93)
java.lang.NullPointerException
    at org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:217)
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > testE2EV1ReadWithGQLQueryWithNoLimit STANDARD_ERROR
May 01, 2018 12:50:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:50:37.438Z: Workers have started successfully.
May 01, 2018 12:51:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:04.010Z: Executing operation DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Close
May 01, 2018 12:51:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:04.156Z: Executing operation
DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Read+DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+DatastoreV1.Read/Reshuffle/Reshuffle/ExpandIterable+DatastoreV1.Read/Reshuffle/Values/Values/Map+DatastoreV1.Read/Read+Combine.globally(Count)/WithKeys/AddKeys/Map+Combine.globally(Count)/Combine.perKey(Count)/GroupByKey+Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues/Partial+Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Reify+Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Write
May 01, 2018 12:51:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:13.108Z: Executing operation Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Close
May 01, 2018 12:51:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:13.287Z: Executing operation
Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Read+Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues+Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues/Extract+Combine.globally(Count)/Values/Values/Map+PAssert$3/CreateActual/FilterActuals/Window.Assign+PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous)+PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map+PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign+PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify+PAssert$3/CreateActual/GatherPanes/GroupByKey/Write+Combine.globally(Count)/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
May 01, 2018 12:51:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:21.405Z: Executing operation Combine.globally(Count)/View.AsIterable/CreateDataflowView
May 01, 2018 12:51:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:21.724Z: Executing operation
Combine.globally(Count)/CreateVoid/Read(CreateSource)+Combine.globally(Count)/ProduceDefault+PAssert$3/CreateActual/FilterActuals/Window.Assign+PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous)+PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map+PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign+PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify+PAssert$3/CreateActual/GatherPanes/GroupByKey/Write
May 01, 2018 12:51:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:24.987Z: Executing operation PAssert$3/CreateActual/GatherPanes/GroupByKey/Close
May 01, 2018 12:51:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:25.142Z: Executing operation
PAssert$3/CreateActual/GatherPanes/GroupByKey/Read+PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow+PAssert$3/CreateActual/GatherPanes/Values/Values/Map+PAssert$3/CreateActual/ExtractPane/Map+PAssert$3/CreateActual/Flatten.Iterables/FlattenIterables/FlatMap+PAssert$3/CreateActual/RewindowActuals/Window.Assign+PAssert$3/CreateActual/ParDo(Anonymous)+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
May 01, 2018 12:51:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:30.862Z: Executing operation PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
May 01, 2018 12:51:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:30.985Z: Executing operation
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
May 01, 2018 12:51:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:33.858Z: Executing operation PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
May 01, 2018 12:51:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:34.029Z: Executing operation
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
May 01, 2018 12:51:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:39.903Z: Executing operation PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
May 01, 2018 12:51:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:40.196Z: Executing operation PAssert$3/Create.Values/Read(CreateSource)+PAssert$3/WindowToken/Window.Assign+PAssert$3/RunChecks+PAssert$3/VerifyAssertions/ParDo(DefaultConclude)
May 01, 2018 12:51:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:42.199Z: Cleaning up.
May 01, 2018 12:51:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:51:42.459Z: Stopping worker pool...
May 01, 2018 12:53:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:53:07.513Z: Autoscaling: Resized worker pool from 1 to 0.
May 01, 2018 12:53:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-05-01T00:53:07.556Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
May 01, 2018 12:53:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-04-30_17_49_30-5810670578240902988 finished with status DONE.
May 01, 2018 12:53:17 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
INFO: Success result for Dataflow job 2018-04-30_17_49_30-5810670578240902988. Found 1 success, 0 failures out of 1 expected assertions.
May 01, 2018 12:53:18 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
May 01, 2018 12:53:18 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
May 01, 2018 12:53:19 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
May 01, 2018 12:53:19 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
May 01, 2018 12:53:19 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil deleteAllEntities
INFO: Successfully deleted 1000 entities
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > testSplitQueryFnWithLargeDataset STANDARD_ERROR
May 01, 2018 12:53:19 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
May 01, 2018 12:53:19 AM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
INFO: Latest stats timestamp for kind sort_1G is 1524901390000000
May 01, 2018 12:53:19 AM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
INFO: Estimated size bytes for the query is: 2130000000
May 01, 2018 12:53:19 AM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn processElement
INFO: Splitting the query into 32 splits
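The DoFnTester warning above recommends TestPipeline. For reference, a minimal sketch of exercising a DoFn through TestPipeline and PAssert instead of DoFnTester; DoubleFn is a hypothetical stand-in for the DoFn under test, not a transform from this build:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class DoubleFnTest {
      // Hypothetical DoFn used only to illustrate the TestPipeline pattern.
      static class DoubleFn extends DoFn<Integer, Integer> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element() * 2);
        }
      }

      @Rule public final transient TestPipeline pipeline = TestPipeline.create();

      @Test
      public void doublesEachElement() {
        PCollection<Integer> output =
            pipeline.apply(Create.of(1, 2, 3)).apply(ParDo.of(new DoubleFn()));
        // PAssert checks run as part of pipeline execution, unlike DoFnTester's direct calls.
        PAssert.that(output).containsInAnyOrder(2, 4, 6);
        pipeline.run().waitUntilFinish();
      }
    }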
org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > testSplitQueryFnWithSmallDataset STANDARD_ERROR
May 01, 2018 12:53:20 AM org.apache.beam.sdk.transforms.DoFnTester of
WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
May 01, 2018 12:53:20 AM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
INFO: Latest stats timestamp for kind shakespeare is 1524901390000000
May 01, 2018 12:53:20 AM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
INFO: Estimated size bytes for the query is: 26383451
May 01, 2018 12:53:20 AM org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn processElement
INFO: Splitting the query into 12 splits
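The two split counts above (32 splits for an estimated 2130000000 bytes, 12 splits for 26383451 bytes) are consistent with a splitter that targets roughly 64 MB per split and clamps to a minimum of 12 splits. A hedged sketch of that arithmetic; the constants below are assumptions chosen because they reproduce the logged numbers, and the authoritative values live in DatastoreV1.java:

    public class SplitEstimateSketch {
      // Assumed target split size and bounds; see the note above.
      static final long BUNDLE_SIZE_BYTES = 64L * 1024 * 1024;
      static final int MIN_SPLITS = 12;
      static final int MAX_SPLITS = 50000;

      static int estimateSplits(long estimatedSizeBytes) {
        int bySize = (int) Math.round((double) estimatedSizeBytes / BUNDLE_SIZE_BYTES);
        return Math.max(MIN_SPLITS, Math.min(MAX_SPLITS, bySize));
      }

      public static void main(String[] args) {
        System.out.println(estimateSplits(2130000000L)); // 32, matching kind sort_1G
        System.out.println(estimateSplits(26383451L));   // 12, matching kind shakespeare
      }
    }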
Gradle Test Executor 128 finished executing tests.
> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest FAILED
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
java.lang.NoClassDefFoundError: com/google/api/gax/retrying/ExceptionRetryAlgorithm
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
    at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
    at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
    at org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.setUp(SpannerWriteIT.java:91)
Caused by:
java.lang.ClassNotFoundException: com.google.api.gax.retrying.ExceptionRetryAlgorithm
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more
java.lang.NullPointerException
    at org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.tearDown(SpannerWriteIT.java:148)
13 tests completed, 4 failed
Finished generating test XML results (0.006 secs) into:
<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.008 secs) into:
<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[main,5,main]) completed. Took 15 mins 36.181 secs.
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
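For reference, re-running just the failing task with the diagnostics Gradle suggests might look like the following; this is a hedged example only (task name taken from the failure above, --stacktrace and --info are standard Gradle flags, and whatever integration-test pipeline options the Jenkins job normally passes would still need to be supplied):

    ./gradlew :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest --stacktrace --info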
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 52m 25s
545 actionable tasks: 540 executed, 5 from cache
Publishing build scan...
https://gradle.com/s/huzoqnvxd5qj4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]