See <https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/151/display/redirect?page=changes>
Changes:
[katarzyna.kucharczyk] [BEAM-6335] Added streaming GroupByKey Test that reads SyntheticSource
[katarzyna.kucharczyk] [BEAM-6335] Changed SyntheticDataPublisher to publish String UTF values
[katarzyna.kucharczyk] [BEAM-6335] Added custom PubSub Matcher that stops pipeline after
[valentyn] Restore original behavior of evaluating worker host on Windows until a
[echauchot] [BEAM-8470] Add an empty spark-structured-streaming runner project
[echauchot] [BEAM-8470] Fix missing dep
[echauchot] [BEAM-8470] Add SparkPipelineOptions
[echauchot] [BEAM-8470] Start pipeline translation
[echauchot] [BEAM-8470] Add global pipeline translation structure
[echauchot] [BEAM-8470] Add nodes translators structure
[echauchot] [BEAM-8470] Wire node translators with pipeline translator
[echauchot] [BEAM-8470] Renames: better differenciate pipeline translator for
[echauchot] [BEAM-8470] Organise methods in PipelineTranslator
[echauchot] [BEAM-8470] Initialise BatchTranslationContext
[echauchot] [BEAM-8470] Refactoring: -move batch/streaming common translation
[echauchot] [BEAM-8470] Make transform translation clearer: renaming, comments
[echauchot] [BEAM-8470] Improve javadocs
[echauchot] [BEAM-8470] Move SparkTransformOverrides to correct package
[echauchot] [BEAM-8470] Move common translation context components to superclass
[echauchot] [BEAM-8470] apply spotless
[echauchot] [BEAM-8470] Make codestyle and firebug happy
[echauchot] [BEAM-8470] Add TODOs
[echauchot] [BEAM-8470] Post-pone batch qualifier in all classes names for
[echauchot] [BEAM-8470] Add precise TODO for multiple TransformTranslator per
[echauchot] [BEAM-8470] Added SparkRunnerRegistrar
[echauchot] [BEAM-8470] Add basic pipeline execution. Refactor translatePipeline()
[echauchot] [BEAM-8470] Create PCollections manipulation methods
[echauchot] [BEAM-8470] Create Datasets manipulation methods
[echauchot] [BEAM-8470] Add Flatten transformation translator
[echauchot] [BEAM-8470] Add primitive GroupByKeyTranslatorBatch implementation
[echauchot] [BEAM-8470] Use Iterators.transform() to return Iterable
[echauchot] [BEAM-8470] Implement read transform
[echauchot] [BEAM-8470] update TODO
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] start source instanciation
[echauchot] [BEAM-8470] Improve exception flow
[echauchot] [BEAM-8470] Improve type enforcement in ReadSourceTranslator
[echauchot] [BEAM-8470] Experiment over using spark Catalog to pass in Beam Source
[echauchot] [BEAM-8470] Add source mocks
[echauchot] [BEAM-8470] fix mock, wire mock in translators and create a main test.
[echauchot] [BEAM-8470] Use raw WindowedValue so that spark Encoders could work
[echauchot] [BEAM-8470] clean deps
[echauchot] [BEAM-8470] Move DatasetSourceMock to proper batch mode
[echauchot] [BEAM-8470] Run pipeline in batch mode or in streaming mode
[echauchot] [BEAM-8470] Split batch and streaming sources and translators
[echauchot] [BEAM-8470] Use raw Encoder<WindowedValue> also in regular
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add ReadSourceTranslatorStreaming
[echauchot] [BEAM-8470] Move Source and translator mocks to a mock package.
[echauchot] [BEAM-8470] Pass Beam Source and PipelineOptions to the spark DataSource
[echauchot] [BEAM-8470] Refactor DatasetSource fields
[echauchot] [BEAM-8470] Wire real SourceTransform and not mock and update the test
[echauchot] [BEAM-8470] Add missing 0-arg public constructor
[echauchot] [BEAM-8470] Use new PipelineOptionsSerializationUtils
[echauchot] [BEAM-8470] Apply spotless and fix checkstyle
[echauchot] [BEAM-8470] Add a dummy schema for reader
[echauchot] [BEAM-8470] Add empty 0-arg constructor for mock source
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Checkstyle and Findbugs
[echauchot] [BEAM-8470] Refactor SourceTest to a UTest instaed of a main
[echauchot] [BEAM-8470] Fix pipeline triggering: use a spark action instead of
[echauchot] [BEAM-8470] improve readability of options passing to the source
[echauchot] [BEAM-8470] Clean unneeded fields in DatasetReader
[echauchot] [BEAM-8470] Fix serialization issues
[echauchot] [BEAM-8470] Add SerializationDebugger
[echauchot] [BEAM-8470] Add serialization test
[echauchot] [BEAM-8470] Move SourceTest to same package as tested class
[echauchot] [BEAM-8470] Fix SourceTest
[echauchot] [BEAM-8470] Simplify beam reader creation as it created once the source
[echauchot] [BEAM-8470] Put all transform translators Serializable
[echauchot] [BEAM-8470] Enable test mode
[echauchot] [BEAM-8470] Enable gradle build scan
[echauchot] [BEAM-8470] Add flatten test
[echauchot] [BEAM-8470] First attempt for ParDo primitive implementation
[echauchot] [BEAM-8470] Serialize windowedValue to byte[] in source to be able to
[echauchot] [BEAM-8470] Comment schema choices
[echauchot] [BEAM-8470] Fix errorprone
[echauchot] [BEAM-8470] Fix testMode output to comply with new binary schema
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Remove bundleSize parameter and always use spark default
[echauchot] [BEAM-8470] Fix split bug
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add ParDoTest
[echauchot] [BEAM-8470] Address minor review notes
[echauchot] [BEAM-8470] Clean
[echauchot] [BEAM-8470] Add GroupByKeyTest
[echauchot] [BEAM-8470] Add comments and TODO to GroupByKeyTranslatorBatch
[echauchot] [BEAM-8470] Fix type checking with Encoder of WindowedValue<T>
[echauchot] [BEAM-8470] Port latest changes of ReadSourceTranslatorBatch to
[echauchot] [BEAM-8470] Remove no more needed putDatasetRaw
[echauchot] [BEAM-8470] Add ComplexSourceTest
[echauchot] [BEAM-8470] Fail in case of having SideInouts or State/Timers
[echauchot] [BEAM-8470] Fix Encoders: create an Encoder for every manipulated type
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Fixed Javadoc error
[echauchot] [BEAM-8470] Rename SparkSideInputReader class and rename pruneOutput()
[echauchot] [BEAM-8470] Don't use deprecated
[echauchot] [BEAM-8470] Simplify logic of ParDo translator
[echauchot] [BEAM-8470] Fix kryo issue in GBK translator with a workaround
[echauchot] [BEAM-8470] Rename SparkOutputManager for consistency
[echauchot] [BEAM-8470] Fix for test elements container in GroupByKeyTest
[echauchot] [BEAM-8470] Added "testTwoPardoInRow"
[echauchot] [BEAM-8470] Add a test for the most simple possible Combine
[echauchot] [BEAM-8470] Rename SparkDoFnFilterFunction to DoFnFilterFunction for
[echauchot] [BEAM-8470] Generalize the use of SerializablePipelineOptions in place
[echauchot] [BEAM-8470] Fix getSideInputs
[echauchot] [BEAM-8470] Extract binary schema creation in a helper class
[echauchot] [BEAM-8470] First version of combinePerKey
[echauchot] [BEAM-8470] Improve type checking of Tuple2 encoder
[echauchot] [BEAM-8470] Introduce WindowingHelpers (and helpers package) and use it
[echauchot] [BEAM-8470] Fix combiner using KV as input, use binary encoders in place
[echauchot] [BEAM-8470] Add combinePerKey and CombineGlobally tests
[echauchot] [BEAM-8470] Introduce RowHelpers
[echauchot] [BEAM-8470] Add CombineGlobally translation to avoid translating
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Get back to classes in translators resolution because URNs
[echauchot] [BEAM-8470] Fix various type checking issues in Combine.Globally
[echauchot] [BEAM-8470] Update test with Long
[echauchot] [BEAM-8470] Fix combine. For unknown reason GenericRowWithSchema is used
[echauchot] [BEAM-8470] Use more generic Row instead of GenericRowWithSchema
[echauchot] [BEAM-8470] Add explanation about receiving a Row as input in the
[echauchot] [BEAM-8470] Fix encoder bug in combinePerkey
[echauchot] [BEAM-8470] Cleaning
[echauchot] [BEAM-8470] Implement WindowAssignTranslatorBatch
[echauchot] [BEAM-8470] Implement WindowAssignTest
[echauchot] [BEAM-8470] Fix javadoc
[echauchot] [BEAM-8470] Added SideInput support
[echauchot] [BEAM-8470] Fix CheckStyle violations
[echauchot] [BEAM-8470] Don't use Reshuffle translation
[echauchot] [BEAM-8470] Added using CachedSideInputReader
[echauchot] [BEAM-8470] Added TODO comment for ReshuffleTranslatorBatch
[echauchot] [BEAM-8470] And unchecked warning suppression
[echauchot] [BEAM-8470] Add streaming source initialisation
[echauchot] [BEAM-8470] Implement first streaming source
[echauchot] [BEAM-8470] Add a TODO on spark output modes
[echauchot] [BEAM-8470] Add transformators registry in PipelineTranslatorStreaming
[echauchot] [BEAM-8470] Add source streaming test
[echauchot] [BEAM-8470] Specify checkpointLocation at the pipeline start
[echauchot] [BEAM-8470] Clean unneeded 0 arg constructor in batch source
[echauchot] [BEAM-8470] Clean streaming source
[echauchot] [BEAM-8470] Continue impl of offsets for streaming source
[echauchot] [BEAM-8470] Deal with checkpoint and offset based read
[echauchot] [BEAM-8470] Apply spotless and fix spotbugs warnings
[echauchot] [BEAM-8470] Disable never ending test
[echauchot] [BEAM-8470] Fix access level issues, typos and modernize code to Java 8
[echauchot] [BEAM-8470] Merge Spark Structured Streaming runner into main Spark
[echauchot] [BEAM-8470] Fix non-vendored imports from Spark Streaming Runner classes
[echauchot] [BEAM-8470] Pass doFnSchemaInformation to ParDo batch translation
[echauchot] [BEAM-8470] Fix spotless issues after rebase
[echauchot] [BEAM-8470] Fix logging levels in Spark Structured Streaming translation
[echauchot] [BEAM-8470] Add SparkStructuredStreamingPipelineOptions and
[echauchot] [BEAM-8470] Rename SparkPipelineResult to
[echauchot] [BEAM-8470] Use PAssert in Spark Structured Streaming transform tests
[echauchot] [BEAM-8470] Ignore spark offsets (cf javadoc)
[echauchot] [BEAM-8470] implement source.stop
[echauchot] [BEAM-8470] Update javadoc
[echauchot] [BEAM-8470] Apply Spotless
[echauchot] [BEAM-8470] Enable batch Validates Runner tests for Structured Streaming
[echauchot] [BEAM-8470] Limit the number of partitions to make tests go 300% faster
[echauchot] [BEAM-8470] Fixes ParDo not calling setup and not tearing down if
[echauchot] [BEAM-8470] Pass transform based doFnSchemaInformation in ParDo
[echauchot] [BEAM-8470] Consider null object case on RowHelpers, fixes empty side
[echauchot] [BEAM-8470] Put back batch/simpleSourceTest.testBoundedSource
[echauchot] [BEAM-8470] Update windowAssignTest
[echauchot] [BEAM-8470] Add comment about checkpoint mark
[echauchot] [BEAM-8470] Re-code GroupByKeyTranslatorBatch to conserve windowing
[echauchot] [BEAM-8470] re-enable reduceFnRunner timers for output
[echauchot] [BEAM-8470] Improve visibility of debug messages
[echauchot] [BEAM-8470] Add a test that GBK preserves windowing
[echauchot] [BEAM-8470] Add TODO in Combine translations
[echauchot] [BEAM-8470] Update KVHelpers.extractKey() to deal with WindowedValue and
[echauchot] [BEAM-8470] Fix comment about schemas
[echauchot] [BEAM-8470] Implement reduce part of CombineGlobally translation with
[echauchot] [BEAM-8470] Output data after combine
[echauchot] [BEAM-8470] Implement merge accumulators part of CombineGlobally
[echauchot] [BEAM-8470] Fix encoder in combine call
[echauchot] [BEAM-8470] Revert extractKey while combinePerKey is not done (so that
[echauchot] [BEAM-8470] Apply a groupByKey avoids for some reason that the spark
[echauchot] [BEAM-8470] Fix case when a window does not merge into any other window
[echauchot] [BEAM-8470] Fix wrong encoder in combineGlobally GBK
[echauchot] [BEAM-8470] Fix bug in the window merging logic
[echauchot] [BEAM-8470] Remove the mapPartition that adds a key per partition
[echauchot] [BEAM-8470] Remove CombineGlobally translation because it is less
[echauchot] [BEAM-8470] Now that there is only Combine.PerKey translation, make only
[echauchot] [BEAM-8470] Clean no more needed KVHelpers
[echauchot] [BEAM-8470] Clean not more needed RowHelpers
[echauchot] [BEAM-8470] Clean not more needed WindowingHelpers
[echauchot] [BEAM-8470] Fix javadoc of AggregatorCombiner
[echauchot] [BEAM-8470] Fixed immutable list bug
[echauchot] [BEAM-8470] add comment in combine globally test
[echauchot] [BEAM-8470] Clean groupByKeyTest
[echauchot] [BEAM-8470] Add a test that combine per key preserves windowing
[echauchot] [BEAM-8470] Ignore for now not working test testCombineGlobally
[echauchot] [BEAM-8470] Add metrics support in DoFn
[echauchot] [BEAM-8470] Add missing dependencies to run Spark Structured Streaming
[echauchot] [BEAM-8470] Add setEnableSparkMetricSinks() method
[echauchot] [BEAM-8470] Fix javadoc
[echauchot] [BEAM-8470] Fix accumulators initialization in Combine that prevented
[echauchot] [BEAM-8470] Add a test to check that CombineGlobally preserves windowing
[echauchot] [BEAM-8470] Persist all output Dataset if there are multiple outputs in
[echauchot] [BEAM-8470] Added metrics sinks and tests
[echauchot] [BEAM-8470] Make spotless happy
[echauchot] [BEAM-8470] Add PipelineResults to Spark structured streaming.
[echauchot] [BEAM-8470] Update log4j configuration
[echauchot] [BEAM-8470] Add spark execution plans extended debug messages.
[echauchot] [BEAM-8470] Print number of leaf datasets
[echauchot] [BEAM-8470] fixup! Add PipelineResults to Spark structured streaming.
[echauchot] [BEAM-8470] Remove no more needed AggregatorCombinerPerKey (there is
[echauchot] [BEAM-8470] After testing performance and correctness, launch pipeline
[echauchot] [BEAM-8470] Improve Pardo translation performance: avoid calling a
[echauchot] [BEAM-8470] Use "sparkMaster" in local mode to obtain number of shuffle
[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using
[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using
[echauchot] [BEAM-8470] type erasure: spark encoders require a Class<T>, pass Object
[echauchot] [BEAM-8470] Fix scala Product in Encoders to avoid StackEverflow
[echauchot] [BEAM-8470] Conform to spark ExpressionEncoders: pass classTags,
[echauchot] [BEAM-8470] Add a simple spark native test to test Beam coders wrapping
[echauchot] [BEAM-8470] Fix code generation in Beam coder wrapper
[echauchot] [BEAM-8470] Lazy init coder because coder instance cannot be
[echauchot] [BEAM-8470] Fix warning in coder construction by reflexion
[echauchot] [BEAM-8470] Fix ExpressionEncoder generated code: typos, try catch, fqcn
[echauchot] [BEAM-8470] Fix getting the output value in code generation
[echauchot] [BEAM-8470] Fix beam coder lazy init using reflexion: use .clas + try
[echauchot] [BEAM-8470] Remove lazy init of beam coder because there is no generic
[echauchot] [BEAM-8470] Remove example code
[echauchot] [BEAM-8470] Fix equal and hashcode
[echauchot] [BEAM-8470] Fix generated code: uniform exceptions catching, fix
[echauchot] [BEAM-8470] Add an assert of equality in the encoders test
[echauchot] [BEAM-8470] Apply spotless and checkstyle and add javadocs
[echauchot] [BEAM-8470] Wrap exceptions in UserCoderExceptions
[echauchot] [BEAM-8470] Put Encoders expressions serializable
[echauchot] [BEAM-8470] Catch Exception instead of IOException because some coders
[echauchot] [BEAM-8470] Apply new Encoders to CombinePerKey
[echauchot] [BEAM-8470] Apply new Encoders to Read source
[echauchot] [BEAM-8470] Improve performance of source: the mapper already calls
[echauchot] [BEAM-8470] Ignore long time failing test: SparkMetricsSinkTest
[echauchot] [BEAM-8470] Apply new Encoders to Window assign translation
[echauchot] [BEAM-8470] Apply new Encoders to AggregatorCombiner
[echauchot] [BEAM-8470] Create a Tuple2Coder to encode scala tuple2
[echauchot] [BEAM-8470] Apply new Encoders to GroupByKey
[echauchot] [BEAM-8470] Apply new Encoders to Pardo. Replace Tuple2Coder with
[echauchot] [BEAM-8470] Apply spotless, fix typo and javadoc
[echauchot] [BEAM-8470] Use beam encoders also in the output of the source
[echauchot] [BEAM-8470] Remove unneeded cast
[echauchot] [BEAM-8470] Fix: Remove generic hack of using object. Use actual Coder
[echauchot] [BEAM-8470] Remove Encoders based on kryo now that we call Beam coders
[echauchot] [BEAM-8470] Add a jenkins job for validates runner tests in the new
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Rebase on master: pass sideInputMapping in SimpleDoFnRunner
[echauchot] Fix SpotBugs
[echauchot] [BEAM-8470] simplify coders in combinePerKey translation
[echauchot] [BEAM-8470] Fix combiner. Do not reuse instance of accumulator
[echauchot] [BEAM-8470] input windows can arrive exploded (for sliding windows). As
[echauchot] [BEAM-8470] Add a combine test with sliding windows
[echauchot] [BEAM-8470] Add a test to test combine translation on binaryCombineFn
[echauchot] [BEAM-8470] Fix tests: use correct
[echauchot] [BEAM-8470] Fix wrong expected results in
[echauchot] [BEAM-8470] Add disclaimers about this runner being experimental
[echauchot] [BEAM-8470] Fix: create an empty accumulator in
[echauchot] [BEAM-8470] Apply spotless
[echauchot] [BEAM-8470] Add a countPerElement test with sliding windows
[echauchot] [BEAM-8470] Fix the output timestamps of combine: timestamps must be
[echauchot] [BEAM-8470] set log level to info to avoid resource consumption in
[echauchot] [BEAM-8470] Fix CombineTest.testCountPerElementWithSlidingWindows
[aromanenko.dev] [BEAM-8470] Remove "validatesStructuredStreamingRunnerBatch" from
[echauchot] [BEAM-8470] Fix timestamps in combine output: assign the timestamp to
[iemejia] [website] Add Spark Structured Runner VR badge to the github template
[tvalentyn] [BEAM-8575] Add a Python test to test windowing in DoFn finish_bundle()
------------------------------------------
[...truncated 403.08 KB...]
      "message" : "Invalid table ID \"bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
      "reason" : "invalid"
    } ],
    "message" : "Invalid table ID \"bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
    "status" : "INVALID_ARGUMENT"
  }
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
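The 400 error traced above is the root cause of the job failure: the generated table ID `bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9` contains hyphens, while the quoted error message states that BigQuery table IDs may contain only letters, digits, and underscores, up to 1024 characters. A minimal sketch of that validation rule, with hypothetical helper names (not part of Beam or the BigQuery client):

```python
import re

# Rule quoted in the error message: alphanumeric plus underscores,
# at most 1024 characters, no decorators. Hyphens are therefore rejected.
_TABLE_ID_RE = re.compile(r"^[A-Za-z0-9_]{1,1024}$")

def is_valid_table_id(table_id: str) -> bool:
    """Return True if table_id satisfies the table-ID rule above."""
    return _TABLE_ID_RE.fullmatch(table_id) is not None

def sanitize_table_id(table_id: str) -> str:
    """Replace disallowed characters (e.g. the '-' in a UUID) with '_'."""
    return re.sub(r"[^A-Za-z0-9_]", "_", table_id)[:1024]

failing_id = "bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9"
assert not is_valid_table_id(failing_id)                 # hyphens make it invalid
assert is_valid_table_id(sanitize_table_id(failing_id))  # underscored form passes
```

This suggests the fix for the test would be to underscore-sanitize the UUID suffix before using it as a table ID.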
Nov 20, 2019 7:20:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2019-11-20T19:20:30.840Z: java.lang.RuntimeException: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Invalid table ID \"bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
    "reason" : "invalid"
  } ],
  "message" : "Invalid table ID \"bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
  "status" : "INVALID_ARGUMENT"
}
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:208)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Invalid table ID \"bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
    "reason" : "invalid"
  } ],
  "message" : "Invalid table ID \"bqio_write_10GB_java_db516d21-50b1-45c5-a158-c249d60531c9\". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.",
  "status" : "INVALID_ARGUMENT"
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:417)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1132)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:515)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:448)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.tryCreateTable(BigQueryServicesImpl.java:520)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.createTable(BigQueryServicesImpl.java:505)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.tryCreateTable(CreateTables.java:205)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.getTableDestination(CreateTables.java:160)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.lambda$processElement$0(CreateTables.java:113)
at java.util.HashMap.computeIfAbsent(HashMap.java:1126)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn.processElement(CreateTables.java:112)
at org.apache.beam.sdk.io.gcp.bigquery.CreateTables$CreateTablesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:82)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:180)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV.process(BigQueryIOIT.java:243)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$MapKVToV$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:252)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:74)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:576)
at
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:564)
at
org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at
org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown
Source)
at
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:218)
at
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:183)
at
org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:335)
at
org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at
org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
at
org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
at
org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
at
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
at
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
at
org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
at
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at
org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Nov 20, 2019 7:20:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-20T19:20:30.882Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Nov 20, 2019 7:20:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2019-11-20T19:20:30.977Z: Workflow failed. Causes: S02:Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
testpipeline-jenkins-1120-11201118-uucq-harness-dm0j
Root cause: Work item failed.,
testpipeline-jenkins-1120-11201118-uucq-harness-dm0j
Root cause: Work item failed.,
testpipeline-jenkins-1120-11201118-uucq-harness-2gf0
Root cause: Work item failed.,
testpipeline-jenkins-1120-11201118-uucq-harness-mlts
Root cause: Work item failed.
Nov 20, 2019 7:20:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-20T19:20:31.085Z: Cleaning up.
Nov 20, 2019 7:20:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-20T19:20:31.179Z: Stopping worker pool...
Nov 20, 2019 7:22:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-20T19:22:22.669Z: Autoscaling: Resized worker pool from 5 to 0.
Nov 20, 2019 7:22:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-11-20T19:22:22.809Z: Worker pool stopped.
Nov 20, 2019 7:22:28 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2019-11-20_11_18_52-12211365401200612338 failed with status FAILED.
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead
STANDARD_OUT
Load test results for test (ID): db516d21-50b1-45c5-a158-c249d60531c9 and timestamp: 2019-11-20T19:18:45.476000000Z:
Metric:      Value:
write_time   0.0
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
java.lang.IllegalArgumentException: Writing avro formatted data is only supported for FILE_LOADS, however the method was STREAMING_INSERTS
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:2362)
	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:1662)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:539)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:490)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:159)
	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testAvroWrite(BigQueryIOIT.java:148)
	at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:121)
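[Note: the IllegalArgumentException above comes from a precondition check in BigQueryIO: Avro-formatted writes are only supported with the FILE_LOADS write method, while the test was configured for STREAMING_INSERTS. A minimal configuration sketch of the constraint, not the test's actual code; the helper name and table spec below are hypothetical placeholders:]

```java
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method;

class AvroWriteConfig {
  // Sketch: a BigQuery write transform that would pass the precondition.
  // Avro output requires Method.FILE_LOADS; passing Method.STREAMING_INSERTS
  // here instead would trigger the checkArgument failure in the trace above.
  static BigQueryIO.Write<byte[]> avroWrite(String tableSpec) {
    return BigQueryIO.<byte[]>write()
        .to(tableSpec)                   // placeholder table spec
        .withMethod(Method.FILE_LOADS);  // required for Avro-formatted writes
  }
}
```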
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
1 test completed, 1 failed
Finished generating test XML results (0.026 secs) into:
<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into:
<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 45.163 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 4m 40s
80 actionable tasks: 55 executed, 25 from cache
Publishing build scan...
https://gradle.com/s/jeramh3yrx3sa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]