See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/217/display/redirect?page=changes>

Changes:

[kirpichov] Fixes mapping outputs in Dataflow streaming WriteFiles override

------------------------------------------
[...truncated 19.17 MB...]
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.260Z: Fusing consumer 
Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues/Extract 
into Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.295Z: Fusing consumer PAssert$3/RunChecks into 
PAssert$3/WindowToken/Window.Assign
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.318Z: Fusing consumer 
PAssert$3/WindowToken/Window.Assign into 
PAssert$3/Create.Values/Read(CreateSource)
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.355Z: Fusing consumer 
PAssert$3/VerifyAssertions/ParDo(DefaultConclude) into PAssert$3/RunChecks
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.718Z: Executing operation 
Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Create
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.750Z: Executing operation 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Create
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.788Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.798Z: Starting 1 workers in us-central1-f...
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.818Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Apr 27, 2018 11:54:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:53:59.849Z: Executing operation 
DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Create
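
The fused steps and operations above spell out the shape of the test pipeline: a DatastoreV1.Read feeding Combine.globally(Count), whose result is checked by a PAssert. A minimal sketch of a pipeline with that shape, using the public Beam APIs behind those step names (the project ID, GQL query, and expected count below are placeholders, not values from the actual test):

    import com.google.datastore.v1.Entity;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.values.PCollection;

    public class CountDatastoreEntitiesSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // "DatastoreV1.Read/..." steps; the project ID and GQL query are placeholders.
        PCollection<Entity> entities =
            p.apply(DatastoreIO.v1().read()
                .withProjectId("my-project")
                .withLiteralGqlQuery("SELECT * FROM SomeKind"));

        // Expands into the "Combine.globally(Count)/Combine.perKey(Count)" steps above.
        PCollection<Long> count = entities.apply(Count.<Entity>globally());

        // Expands into the "PAssert$N/..." steps; 1000L is an illustrative expectation.
        PAssert.thatSingleton(count).isEqualTo(1000L);

        p.run().waitUntilFinish();
      }
    }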

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
    Apr 27, 2018 11:54:08 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-04-27_16_51_09-12315947957027948551 finished with status DONE.
    Apr 27, 2018 11:54:08 PM 
org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-04-27_16_51_09-12315947957027948551. Found 0 success, 0 failures out of 0 expected assertions.

org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > 
testE2EV1ReadWithGQLQueryWithNoLimit STANDARD_ERROR
    Apr 27, 2018 11:54:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:54:08.889Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Apr 27, 2018 11:54:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:54:09.031Z: Executing operation DatastoreV1.Read/Create.OfValueProvider/Create.Values/Read(CreateSource)+DatastoreV1.Read/ParDo(GqlQueryTranslate)+DatastoreV1.Read/Split+DatastoreV1.Read/Reshuffle/Pair with random key+DatastoreV1.Read/Reshuffle/Reshuffle/Window.Into()/Window.Assign+DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Reify+DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Write
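
The "Reshuffle/Pair with random key", "Window.Assign", "GroupByKey/Reify", and "GroupByKey/Write" sub-steps in the operation above are the expansion of Beam's Reshuffle transform, which DatastoreV1.Read applies after splitting the query so the generated sub-queries get redistributed across workers. A minimal sketch of the same pattern on an arbitrary PCollection (the input values are illustrative):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Reshuffle;

    public class ReshuffleSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(Create.of(Arrays.asList("q1", "q2", "q3"))) // stand-ins for the split queries
            // Expands into "Pair with random key", "Window.Into()/Window.Assign",
            // "GroupByKey/Reify" and "GroupByKey/Write" sub-steps like those above,
            // breaking fusion so downstream work can be rebalanced across workers.
            .apply(Reshuffle.<String>viaRandomKey());

        p.run().waitUntilFinish();
      }
    }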

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
    Apr 27, 2018 11:54:09 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Writing batch of 500 entities
    Apr 27, 2018 11:54:09 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Successfully wrote 500 entities
    Apr 27, 2018 11:54:10 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Writing batch of 500 entities
    Apr 27, 2018 11:54:10 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Successfully wrote 500 entities
    Apr 27, 2018 11:54:10 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil 
deleteAllEntities
    INFO: Successfully deleted 1000 entities
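
The test utility above flushes its 1000 generated entities in batches of 500, which lines up with Cloud Datastore's per-commit mutation limit (500 mutations per commit, as far as I can tell). When writing from a pipeline, the DatastoreIO sink does the equivalent batching itself; a minimal sketch, with project ID, kind, and key as placeholders and an explicit ProtoCoder only so the example is self-contained:

    import com.google.datastore.v1.Entity;
    import com.google.datastore.v1.Key;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.protobuf.ProtoCoder;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class DatastoreWriteSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder entity; the integration test generates 1000 of these under its own kind.
        Entity entity = Entity.newBuilder()
            .setKey(Key.newBuilder().addPath(
                Key.PathElement.newBuilder().setKind("SomeKind").setName("name-1")))
            .build();

        p.apply(Create.of(entity).withCoder(ProtoCoder.of(Entity.class)))
            // The sink groups mutations into commit RPCs itself, so pipeline code
            // does not deal with the per-commit limit directly.
            .apply(DatastoreIO.v1().write().withProjectId("my-project"));

        p.run().waitUntilFinish();
      }
    }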

Gradle Test Executor 131 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
    java.lang.NoClassDefFoundError: 
com/google/api/gax/retrying/ExceptionRetryAlgorithm
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
        at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
        at 
com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
        at 
com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
        at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
        at 
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:90)

        Caused by:
        java.lang.ClassNotFoundException: 
com.google.api.gax.retrying.ExceptionRetryAlgorithm
            at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
            ... 17 more

    java.lang.NullPointerException
        at 
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:198)

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
    java.lang.NoClassDefFoundError: Could not initialize class 
com.google.cloud.spanner.SpannerImpl
        at 
com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
        at 
com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
        at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
        at 
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.setUp(SpannerReadIT.java:90)

    java.lang.NullPointerException
        at 
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT.tearDown(SpannerReadIT.java:198)
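
Both Spanner failures above share the same signature: com.google.api.gax.retrying.ExceptionRetryAlgorithm cannot be loaded while the Spanner client initializes, which typically indicates a version skew between the gax jar on the integration-test classpath and the one google-cloud-spanner expects; the NullPointerException in tearDown is most likely just fallout from setUp never completing. A small diagnostic sketch for confirming which gax classes (and which jar) are actually on the classpath; the first class name comes from the stack trace, everything else is illustrative:

    // Quick classpath probe for the class the Spanner client could not load.
    // Run it with the same classpath as the failing integration test.
    public class GaxClasspathProbe {
      public static void main(String[] args) {
        String[] probes = {
          "com.google.api.gax.retrying.ExceptionRetryAlgorithm", // missing per the stack trace
          "com.google.api.gax.retrying.RetryAlgorithm"           // a long-standing gax class
        };
        for (String name : probes) {
          try {
            Class<?> clazz = Class.forName(name);
            // The code source shows which jar (and therefore which gax version) is winning.
            System.out.println(name + " -> "
                + clazz.getProtectionDomain().getCodeSource().getLocation());
          } catch (ClassNotFoundException e) {
            System.out.println(name + " -> NOT FOUND");
          }
        }
      }
    }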

org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > 
testE2EV1ReadWithGQLQueryWithNoLimit STANDARD_ERROR
    Apr 27, 2018 11:54:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:54:14.377Z: Autoscaling: Resizing worker pool from 1 to 5.
    Apr 27, 2018 11:54:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:54:47.055Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Apr 27, 2018 11:54:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:54:47.081Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Apr 27, 2018 11:54:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:54:57.582Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
    Apr 27, 2018 11:55:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:12.308Z: Workers have started successfully.
    Apr 27, 2018 11:55:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:35.521Z: Executing operation 
DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Close
    Apr 27, 2018 11:55:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:35.566Z: Executing operation 
DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/Read+DatastoreV1.Read/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+DatastoreV1.Read/Reshuffle/Reshuffle/ExpandIterable+DatastoreV1.Read/Reshuffle/Values/Values/Map+DatastoreV1.Read/Read+Combine.globally(Count)/WithKeys/AddKeys/Map+Combine.globally(Count)/Combine.perKey(Count)/GroupByKey+Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues/Partial+Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Reify+Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Write
    Apr 27, 2018 11:55:45 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:44.877Z: Executing operation 
Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Close
    Apr 27, 2018 11:55:45 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:44.943Z: Executing operation 
Combine.globally(Count)/Combine.perKey(Count)/GroupByKey/Read+Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues+Combine.globally(Count)/Combine.perKey(Count)/Combine.GroupedValues/Extract+Combine.globally(Count)/Values/Values/Map+PAssert$3/CreateActual/FilterActuals/Window.Assign+PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous)+PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map+PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign+PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify+PAssert$3/CreateActual/GatherPanes/GroupByKey/Write+Combine.globally(Count)/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
    Apr 27, 2018 11:55:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:54.217Z: Executing operation 
Combine.globally(Count)/View.AsIterable/CreateDataflowView
    Apr 27, 2018 11:55:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:54.379Z: Executing operation 
Combine.globally(Count)/CreateVoid/Read(CreateSource)+Combine.globally(Count)/ProduceDefault+PAssert$3/CreateActual/FilterActuals/Window.Assign+PAssert$3/CreateActual/GatherPanes/Reify.Window/ParDo(Anonymous)+PAssert$3/CreateActual/GatherPanes/WithKeys/AddKeys/Map+PAssert$3/CreateActual/GatherPanes/Window.Into()/Window.Assign+PAssert$3/CreateActual/GatherPanes/GroupByKey/Reify+PAssert$3/CreateActual/GatherPanes/GroupByKey/Write
    Apr 27, 2018 11:55:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:56.408Z: Executing operation 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Close
    Apr 27, 2018 11:55:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:56.469Z: Executing operation 
PAssert$3/CreateActual/GatherPanes/GroupByKey/Read+PAssert$3/CreateActual/GatherPanes/GroupByKey/GroupByWindow+PAssert$3/CreateActual/GatherPanes/Values/Values/Map+PAssert$3/CreateActual/ExtractPane/Map+PAssert$3/CreateActual/Flatten.Iterables/FlattenIterables/FlatMap+PAssert$3/CreateActual/RewindowActuals/Window.Assign+PAssert$3/CreateActual/ParDo(Anonymous)+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Apr 27, 2018 11:55:59 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:58.232Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Apr 27, 2018 11:55:59 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:58.327Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Apr 27, 2018 11:56:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:59.827Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Apr 27, 2018 11:56:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:55:59.899Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Apr 27, 2018 11:56:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:56:03.792Z: Executing operation 
PAssert$3/CreateActual/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    Apr 27, 2018 11:56:06 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:56:03.953Z: Executing operation 
PAssert$3/Create.Values/Read(CreateSource)+PAssert$3/WindowToken/Window.Assign+PAssert$3/RunChecks+PAssert$3/VerifyAssertions/ParDo(DefaultConclude)
    Apr 27, 2018 11:56:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:56:07.145Z: Cleaning up.
    Apr 27, 2018 11:56:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:56:07.441Z: Stopping worker pool...
    Apr 27, 2018 11:57:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:57:24.090Z: Autoscaling: Resized worker pool from 5 to 0.
    Apr 27, 2018 11:57:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-27T23:57:24.119Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Apr 27, 2018 11:57:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-04-27_16_53_50-12730139443119953841 finished with status DONE.
    Apr 27, 2018 11:57:31 PM 
org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-04-27_16_53_50-12730139443119953841. Found 1 success, 0 failures out of 1 expected assertions.
    Apr 27, 2018 11:57:32 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Writing batch of 500 entities
    Apr 27, 2018 11:57:32 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Successfully wrote 500 entities
    Apr 27, 2018 11:57:33 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Writing batch of 500 entities
    Apr 27, 2018 11:57:33 PM 
org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Successfully wrote 500 entities
    Apr 27, 2018 11:57:33 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil 
deleteAllEntities
    INFO: Successfully deleted 1000 entities

org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > 
testSplitQueryFnWithLargeDataset STANDARD_ERROR
    Apr 27, 2018 11:57:33 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
    Apr 27, 2018 11:57:34 PM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
    INFO: Latest stats timestamp for kind sort_1G is 1524728602000000
    Apr 27, 2018 11:57:34 PM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
    INFO: Estimated size bytes for the query is: 2130000000
    Apr 27, 2018 11:57:34 PM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn 
processElement
    INFO: Splitting the query into 32 splits
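
The DoFnTester warning above is the SDK steering test authors toward running DoFns through a real test pipeline instead of the DoFnTester harness. A minimal sketch of the suggested TestPipeline + PAssert pattern, using an illustrative DoFn rather than the SplitQueryFn these tests exercise:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class UppercaseFnTest {
      /** Illustrative DoFn; the tests in this log exercise DatastoreV1's SplitQueryFn instead. */
      static class UppercaseFn extends DoFn<String, String> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element().toUpperCase());
        }
      }

      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void uppercasesInput() {
        PCollection<String> out =
            p.apply(Create.of("a", "b")).apply(ParDo.of(new UppercaseFn()));

        // PAssert replaces DoFnTester's processBundle(...) plus manual assertions.
        PAssert.that(out).containsInAnyOrder("A", "B");

        p.run().waitUntilFinish();
      }
    }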

org.apache.beam.sdk.io.gcp.datastore.SplitQueryFnIT > 
testSplitQueryFnWithSmallDataset STANDARD_ERROR
    Apr 27, 2018 11:57:35 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
    Apr 27, 2018 11:57:35 PM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedSizeBytes
    INFO: Latest stats timestamp for kind shakespeare is 1524728602000000
    Apr 27, 2018 11:57:35 PM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read getEstimatedNumSplits
    INFO: Estimated size bytes for the query is: 26383451
    Apr 27, 2018 11:57:35 PM 
org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$SplitQueryFn 
processElement
    INFO: Splitting the query into 12 splits
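
The two split counts above (32 for the ~2.13 GB sort_1G kind, 12 for the ~26 MB shakespeare kind) are consistent with the splitter targeting bundles of roughly 64 MiB with a floor of 12 splits. Those two constants are assumptions about DatastoreV1's defaults, not something stated in this log; with them, the arithmetic works out as follows:

    public class NumSplitsSketch {
      public static void main(String[] args) {
        // Assumed defaults (not stated in the log): ~64 MiB per split, minimum of 12 splits.
        final long bundleSizeBytes = 64L * 1024 * 1024;
        final int minSplits = 12;

        long largeKind = 2_130_000_000L; // "Estimated size bytes" for sort_1G
        long smallKind = 26_383_451L;    // "Estimated size bytes" for shakespeare

        System.out.println(numSplits(largeKind, bundleSizeBytes, minSplits)); // prints 32
        System.out.println(numSplits(smallKind, bundleSizeBytes, minSplits)); // prints 12 (floor applies)
      }

      static int numSplits(long estimatedSizeBytes, long bundleSizeBytes, int minSplits) {
        int bySize = (int) Math.round((double) estimatedSizeBytes / bundleSizeBytes);
        return Math.max(minSplits, bySize);
      }
    }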

Gradle Test Executor 130 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest FAILED

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
    java.lang.NoClassDefFoundError: 
com/google/api/gax/retrying/ExceptionRetryAlgorithm
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
        at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
        at 
com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:60)
        at 
com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:55)
        at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
        at 
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.setUp(SpannerWriteIT.java:91)

        Caused by:
        java.lang.ClassNotFoundException: 
com.google.api.gax.retrying.ExceptionRetryAlgorithm
            at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
            ... 17 more

    java.lang.NullPointerException
        at 
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT.tearDown(SpannerWriteIT.java:148)

12 tests completed, 3 failed
Finished generating test XML results (0.007 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.009 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[main,5,main]) completed. Took 14 mins 50.794 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51m 58s
546 actionable tasks: 543 executed, 3 from cache

Publishing build scan...
https://gradle.com/s/je65fsiuknwla

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user [email protected]
