[
https://issues.apache.org/jira/browse/BEAM-10248?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Graham Polley updated BEAM-10248:
---------------------------------
Description:
I am using `FILE_LOADS` (micro-batching) from Pub/Sub and writing to BigQuery.
My BigQuery dataset is in region `australia-southeast1`.
If the load job into BigQuery fails for some reason (e.g. the table does not
exist, or the schema has changed), BigQuery returns an error for the load.
Beam then enters a loop of retrying the load job. However, instead of using a
new job id (with the suffix incremented by 1), it wrongly tries to re-insert the
job using the same job id. This is because, when it looks up whether the job id
already exists, it does not take into consideration the region where the dataset
is. Instead, it defaults to the US, but the job was created in
`australia-southeast1`, so the lookup returns `null`.
The bug is on line 308 of `BigQueryHelpers.java`. It should not return `null`;
it returns `null` because it is looking in the wrong region.
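To make the failure mode concrete, here is a minimal, self-contained model of the lookup (all names illustrative; this is not the actual Beam code): if jobs are effectively keyed by (job id, location) and the poller never passes the dataset's location, it falls back to US, finds nothing, and so re-inserts the same job id.

{code:java}
import java.util.HashMap;
import java.util.Map;

public class JobLookupModel {
    // Jobs keyed by (jobId, location): a job created in
    // australia-southeast1 is invisible to a lookup in US.
    static final Map<String, String> JOBS = new HashMap<>();

    static String key(String jobId, String location) {
        return jobId + "@" + location;
    }

    static void insertJob(String jobId, String location) {
        JOBS.put(key(jobId, location), "job-" + jobId);
    }

    // Mirrors the buggy behavior: the location is never passed,
    // so the lookup effectively defaults to "US".
    static String pollJobBuggy(String jobId) {
        return JOBS.get(key(jobId, "US"));
    }

    // The fix: thread the dataset's region through to the lookup.
    static String pollJobFixed(String jobId, String location) {
        return JOBS.get(key(jobId, location));
    }

    public static void main(String[] args) {
        insertJob("load_job_1", "australia-southeast1");
        // Buggy path: null, so the caller thinks the id is unused
        // and re-inserts the same job id.
        System.out.println(pollJobBuggy("load_job_1"));
        // Fixed path: the existing job is found, so a new,
        // suffixed job id would be generated for the retry.
        System.out.println(pollJobFixed("load_job_1", "australia-southeast1"));
    }
}
{code}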
If I *test* using a dataset in BigQuery that is in the US region, then it
correctly finds the job id and begins to retry the job with a new job id
(suffixed with the retry count). We have seen this bug in other areas of Beam
before, and in other tools/services on GCP, e.g. Cloud Composer.
However, that leads me to my next problem/bug. Even if that is fixed, the
number of retries is set to `Integer.MAX_VALUE`, so Beam will keep retrying the
job up to 2,147,483,647 times. This is not good.
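For contrast, a small sketch of what bounded retrying with an incrementing job-id suffix could look like (all names and the cap value are illustrative, not Beam's actual implementation): a fresh id per attempt, a finite cap, and an exception that propagates when the cap is exhausted instead of being swallowed.

{code:java}
import java.util.function.Predicate;

public class BoundedRetrySketch {
    static final int MAX_LOAD_RETRIES = 3; // illustrative cap, not Beam's actual value

    // Each attempt gets a distinct id: prefix-0, prefix-1, ...
    static String jobIdForAttempt(String prefix, int attempt) {
        return prefix + "-" + attempt;
    }

    // Returns the id of the attempt that succeeded, or throws once the
    // cap is hit, so the failure reaches the caller.
    static String runWithRetries(String prefix, Predicate<String> loadSucceeds) {
        for (int attempt = 0; attempt <= MAX_LOAD_RETRIES; attempt++) {
            String jobId = jobIdForAttempt(prefix, attempt);
            if (loadSucceeds.test(jobId)) {
                return jobId;
            }
        }
        throw new RuntimeException(
                "Load job " + prefix + " failed after " + (MAX_LOAD_RETRIES + 1) + " attempts");
    }

    public static void main(String[] args) {
        // Succeeds on the third attempt: note the incremented suffix.
        String id = runWithRetries("load_abc", jobId -> jobId.endsWith("-2"));
        System.out.println(id); // load_abc-2
    }
}
{code}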
The exception is swallowed by the Beam SDK and never propagated back up the
stack for users to catch and handle. So, if a load job fails, users have no way
to handle it and react.
I attempted to set `withFailedInsertRetryPolicy()` to retry transient errors
only, but it is not supported with `FILE_LOADS`. I also tried to get a handle on
the error using the `WriteResult` object returned from the BigQuery sink/write,
but it does not work. Users need a way to catch and handle failed load jobs when
using `FILE_LOADS`.
{code:java}
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition.CREATE_NEVER;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition.WRITE_APPEND;

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class TemplatePipeline {
    private static final String TOPIC =
            "projects/etl-demo-269105/topics/test-micro-batching";
    private static final String BIGQUERY_DESTINATION_FILTERED =
            "etl-demo-269105:etl_demo_fun.micro_batch_test_xxx";

    public static void main(String[] args) throws Exception {
        try {
            PipelineOptionsFactory.register(DataflowPipelineOptions.class);
            DataflowPipelineOptions options = PipelineOptionsFactory
                    .fromArgs(args)
                    .withoutStrictParsing()
                    .as(DataflowPipelineOptions.class);
            Pipeline pipeline = Pipeline.create(options);
            PCollection<PubsubMessage> messages = pipeline
                    .apply(PubsubIO.readMessagesWithAttributes()
                            .fromTopic(String.format(TOPIC, options.getProject())))
                    .apply(Window.into(FixedWindows.of(Duration.standardSeconds(1))));
            WriteResult result = messages.apply(ParDo.of(new RowToBQRow()))
                    .apply(BigQueryIO.writeTableRows()
                            .to(String.format(BIGQUERY_DESTINATION_FILTERED, options.getProject()))
                            .withCreateDisposition(CREATE_NEVER)
                            .withWriteDisposition(WRITE_APPEND)
                            .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
                            .withTriggeringFrequency(Duration.standardSeconds(5))
                            .withNumFileShards(1)
                            .withExtendedErrorInfo()
                            .withSchema(getTableSchema()));
//            result.getFailedInsertsWithErr().apply(ParDo.of(new DoFn<BigQueryInsertError, String>() {
//                @ProcessElement
//                public void processElement(ProcessContext c) {
//                    for (ErrorProto err : c.element().getError().getErrors()) {
//                        throw new RuntimeException(err.getMessage());
//                    }
//                }
//            }));
            result.getFailedInserts().apply(ParDo.of(new DoFn<TableRow, String>() {
                @ProcessElement
                public void processElement(ProcessContext c) {
                    System.out.println(c.element());
                    c.output("foo");
                    throw new RuntimeException("Failed to load");
                }

                @FinishBundle
                public void finishUp(FinishBundleContext finishBundleContext) {
                    System.out.println("Got here");
                }
            }));
            pipeline.run();
        } catch (Exception e) {
            e.printStackTrace();
            throw new Exception(e);
        }
    }

    private static TableSchema getTableSchema() {
        List<TableFieldSchema> fields = new ArrayList<>();
        fields.add(new TableFieldSchema().setName("timestamp").setType("INTEGER"));
        fields.add(new TableFieldSchema().setName("payload").setType("STRING"));
        return new TableSchema().setFields(fields);
    }

    public static class RowToBQRow extends DoFn<PubsubMessage, TableRow> {
        @ProcessElement
        public void processElement(ProcessContext c) {
            String payload = new String(c.element().getPayload(), StandardCharsets.UTF_8);
            c.output(new TableRow()
                    .set("timestamp", System.currentTimeMillis())
                    .set("payload", payload));
        }
    }
}{code}
Stack trace is attached showing the same job id being used/re-inserted to
BigQuery on each retry.
eam/beam-sdks-java-fn-execution/2.22.0/5da4c54468112faf1f8b7f8bbf5524fe49d6c8ef/beam-sdks-java-fn-execution-2.22.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-sdks-java-extensions-protobuf/2.22.0/ec7722d88ddceb57806a76f7285bd8c0a1cc7fcd/beam-vendor-sdks-java-extensions-protobuf-2.22.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-sdks-java-core/2.22.0/73d5b6a07c6ae1579506c6224895efbd53d69f3f/beam-sdks-java-core-2.22.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-model-pipeline/2.22.0/3b5b6715b2f9b6e7e35b51acd904e30bde77d10e/beam-model-pipeline-2.22.0.jar:/Users/grahampolley/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud.bigtable/bigtable-client-core/1.13.0/a86858be799ad73f1139017f91c4777147552fd4/bigtable-client-core-1.13.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-model-job-management/2.22.0/c155afa58e1cc382e7de62b9a67e0b918f2f5c1c/beam-model-job-management-2.22.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-model-fn-execution/2.22.0/8f180cc8d7ed9b0bfbbede2901ce9c62f10b50fd/beam-model-fn-execution-2.22.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.dropwizard.metrics/metrics-core/3.2.6/62fe170cffeded1cef60e9e3402a93b45ce14327/metrics-core-3.2.6.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.30/b5a4b6d16ab13e34a88fae84c35cd5d68cac922c/slf4j-api-1.7.30.jar:/Users/grahampolley/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-guava-26_0-jre/0.1/f309c3047ca99428e567afa42d233fb3e839bde1/beam-vendor-guava-26_0-jre-0.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud.bigdataoss/gcsio/2.1.3/6dfefdc343ae4e5d1ff6df480f0457c073bfc19c/gcsio-2.1.3.jar:/Us
ers/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud.bigdataoss/util/2.1.3/3a0d8e8883268fd1d8ec4241084fdb290bf7782b/util-2.1.3.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-clouddebugger/v2-rev20200313-1.30.9/f08af56f1e6cf8673d9408b99874eda40e6960fe/google-api-services-clouddebugger-v2-rev20200313-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-dataflow/v1b3-rev20200305-1.30.9/66c26eb008746cc8003c9dad799b2420cfdd75ed/google-api-services-dataflow-v1b3-rev20200305-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigquerystorage/0.125.0-beta/4ade9a93e57bdf032d8325e3fc0bd3cad75f153b/google-cloud-bigquerystorage-0.125.0-beta.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigquery/1.106.0/bc7232727654ba78edd63f8c95dd0ea47c4950f8/google-cloud-bigquery-1.106.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-bigquery/v2-rev20191211-1.30.9/4c8b0534651f0000d3320307cea24e145c237a5e/google-api-services-bigquery-v2-rev20191211-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-healthcare/v1beta1-rev20200307-1.30.9/89a426386623daf7528b0869a710673c3bf7c156/google-api-services-healthcare-v1beta1-rev20200307-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-pubsub/v1-rev20200312-1.30.9/aad0ede541289a8ca1a280facd0fe159fa9cebd4/google-api-services-pubsub-v1-rev20200312-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud.datastore/datastore-v1-proto-client/1.6.3/b0ffaeb68af9cb99772aa9933aa9baa0521568dc/datastore-v1-proto-client-1.6.3.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-cloudresourcemanager/v1-rev20200311-1.30.9/6c675cf826d657a08e2fece69317de4319d3
192f/google-api-services-cloudresourcemanager-v1-rev20200311-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-storage/v1-rev20200326-1.30.9/5c0856380fff4fe8d11f5df03c653c16f69388e1/google-api-services-storage-v1-rev20200326-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client-java6/1.30.9/cef0e59174a3d6066598046209e44c9c34bb3132/google-api-client-java6-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client-jackson2/1.30.9/b6af6877119b2932ca293b14b33aa5f7393a358c/google-api-client-jackson2-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-core-http/1.92.4/d2776edaa94a4eee16332966adb575d8c5e63210/google-cloud-core-http-1.92.4.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client/1.30.9/8fc8641740ee0c26715738fb4d902779b96217d6/google-api-client-1.30.9.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-spanner/1.49.1/19dfd15e889a41c306b5dd62a622db6c471cb6fc/google-cloud-spanner-1.49.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigtable/1.9.1/8df06e49c314f524e6fdc11878ad5647f2a992b6/google-cloud-bigtable-1.9.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-core-grpc/1.92.2/af95d438cc2f05cdbc0c31101ca2fd17da753ee9/google-cloud-core-grpc-1.92.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api/gax-grpc/1.54.0/93d75eab29a7923f407efdae59a87e032b86d9f7/gax-grpc-1.54.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-core/1.92.5/97ca5c35f34ee67dde8f2d4e9b940f0a96393c66/google-cloud-core-1.92.5.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api/gax-httpjson/0.70.1/15b2c7844b9561ea036e009c153cf1ef70701afb/gax-httpjson-0.70.1.j
ar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api/gax/1.54.0/1f1668868b8b3fd5fc248d80c16dd9f09afc9180/gax-1.54.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-alts/1.29.0/aa559eb5550bb772e42665c271cfe4dfcf22116f/grpc-alts-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.auth/google-auth-library-oauth2-http/0.20.0/f33d4d6c91a68826816606a2208990eea93fcb2a/google-auth-library-oauth2-http-0.20.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-all/1.27.2/fd3c66752c77b7f2c1e345f852862af92dd36a83/grpc-all-1.27.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-auth/1.29.0/aa89399565038ca930510b8428588cc919086eba/grpc-auth-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.auth/google-auth-library-credentials/0.20.0/87a91a373e64ba5c3cdf8cc5cf54b189dd1492f8/google-auth-library-credentials-0.20.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.oauth-client/google-oauth-client-java6/1.30.6/6c744e05b99325030625d9c7df7f3306f3e80a4e/google-oauth-client-java6-1.30.6.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.oauth-client/google-oauth-client/1.30.6/5c9a0e62e0cbda2bad495fa72147a331617f2e99/google-oauth-client-1.30.6.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.http-client/google-http-client-jackson2/1.34.2/fc5cd5e7b6252bae6a65ef2a1091476afb25b7c7/google-http-client-jackson2-1.34.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.http-client/google-http-client-protobuf/1.33.0/aa38d441440dc8ed72509b94580403c58d910caa/google-http-client-protobuf-1.33.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.http-client/google-http-client-appengine/1.34.1/d5d538385a2fa2b23d1831c35af66ac2e5445a0f/google-http-client-appengine-1.34.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.http-client/google-http-client/1.
34.2/58b4c500b5ce58271f1a6c5f6767aa8721158ea4/google-http-client-1.34.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.10.2/528de95f198afafbcfb0c09d2e43b6e0ea663ec/jackson-databind-2.10.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-annotations/2.10.2/3a13b6105946541b8d4181a0506355b5fae63260/jackson-annotations-2.10.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.10.2/73d4322a6bda684f676a2b5fe918361c4e5c7cca/jackson-core-2.10.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/joda-time/joda-time/2.10.5/7f1d89817cd20a32444d5ab4160f035ab9b864e7/joda-time-2.10.5.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-grpc-1_26_0/0.3/6871e7d0b92cd4983064166ee44d633e3800ef0f/beam-vendor-grpc-1_26_0-0.3.jar:/Users/grahampolley/.m2/repository/args4j/args4j/2.33/args4j-2.33.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-netty/1.27.2/dd4760eafb989dd1997c8f918eb376717b36e4aa/grpc-netty-1.27.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.opencensus/opencensus-contrib-grpc-util/0.24.0/1ac2f0c1a02c7ec0cd244e59228f687a90dfddab/opencensus-contrib-grpc-util-0.24.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-okhttp/1.27.2/5081585398fe003550717016928d18ec849f44f4/grpc-okhttp-1.27.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-testing/1.27.2/f0d7e35e5d0cb23a99afc26f2372e30b5e97cc01/grpc-testing-1.27.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-netty-shaded/1.29.0/80078c0aec3f2146ce94b5c0bddd722fa1f8e502/grpc-netty-shaded-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-grpclb/1.29.0/98635bf61b76cfdc3b38442290d4939120f9f51c/grpc-grpclb-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-core/1.29.0/b051a14a
67c97bb9bbe0b9a03b5d7e7080e7b960/grpc-core-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.opencensus/opencensus-contrib-http-util/0.24.0/6d96406c272d884038eb63b262458df75b5445/opencensus-contrib-http-util-0.24.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.opencensus/opencensus-api/0.24.0/f974451b19007ce820f433311ce8adb88e2b7d2c/opencensus-api-0.24.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/grpc-google-cloud-pubsub-v1/1.85.1/5c59ba127376fc68b7c7f38f9b837e280b32b6dd/grpc-google-cloud-pubsub-v1-1.85.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/grpc-google-cloud-bigtable-v2/1.9.1/9edbe1087db616c9023b56f5581551002c95caad/grpc-google-cloud-bigtable-v2-1.9.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/grpc-google-cloud-bigtable-admin-v2/1.9.1/974022119ccf41a7443a4a092c77559966a18b4d/grpc-google-cloud-bigtable-admin-v2-1.9.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/grpc-google-common-protos/1.17.0/9c5b4135ff50dbc0d299ba8288c5f8e829470994/grpc-google-common-protos-1.17.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-protobuf/1.29.0/57a50ff72c0da333c31ad45616ea17f7d3b91b1c/grpc-protobuf-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-stub/1.29.0/2ded7df15689f2e606189456a91f7a9e1da9feee/grpc-stub-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-protobuf-lite/1.29.0/22e5ff00b942ee573c9dbadfc5e690b151a6d5e4/grpc-protobuf-lite-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-api/1.29.0/4f067a7b1657ad95c00fe958e8a66c8f8446c9f/grpc-api-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-context/1.29.0/1d8a441110f86f8927543dc3007639080441ea3c/grpc-context-1.29.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud
-bigtable-v2/1.9.1/a02ccc9174a00d97d9d1bad53a1bb9e456de8714/proto-google-cloud-bigtable-v2-1.9.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-pubsub-v1/1.85.1/cddb71da7461d091effaaf51ed773fbc766263d6/proto-google-cloud-pubsub-v1-1.85.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-spanner-admin-database-v1/1.49.1/b1c18dc081a1858060577767af00dec47f0ddeeb/proto-google-cloud-spanner-admin-database-v1-1.49.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-datastore-v1/0.85.0/30d57cf3b3b0ae569975fcbf2dff36b958afb2c0/proto-google-cloud-datastore-v1-0.85.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-bigtable-admin-v2/1.9.1/fe8584d617c60d1f8cf203c33bd0358f4423e257/proto-google-cloud-bigtable-admin-v2-1.9.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-spanner-admin-instance-v1/1.49.1/d0e502423bfed8985952dcb2f909cb8417b6923e/proto-google-cloud-spanner-admin-instance-v1-1.49.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-iam-v1/0.13.0/ed3d62b64aa23a3decf324c8988eb4aae9f36e94/proto-google-iam-v1-0.13.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-spanner-v1/1.49.1/d9d02615204a606caf87c91f67c0ff9c31e1ba52/proto-google-cloud-spanner-v1-1.49.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api/api-common/1.8.1/e89befb19b08ad84b262b2f226ab79aefcaa9d7f/api-common-1.8.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java-util/3.11.4/99a6a669e55f3d587ac8eb61857f7b81d0bbd7f7/protobuf-java-util-3.11.4.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.guava/guava/29.0-jre/801142b4c3d0f0770dd29abea50906cacfddd447/guava-29.0-jre.jar:/Users/grahampolley
/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http2/4.1.42.Final/819e7b5f2005770cf7558c04276fff080331c6df/netty-codec-http2-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-handler-proxy/4.1.42.Final/7b816d9f37ddcb68f6c1b9b0d7b5a98bfac40911/netty-handler-proxy-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.42.Final/5f71267aa784d0e6c5ec09fb988339d244b205a0/netty-codec-http-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-handler/4.1.42.Final/fc6546be5df552d9729f008d8d41a6dee28127aa/netty-handler-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-tcnative-boringssl-static/2.0.17.Final/b1e5acbde8c444c656131238ac6ab9e73f694300/netty-tcnative-boringssl-static-2.0.17.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-bigquerystorage-v1beta1/0.90.0/29e65631a68262e36cc1868768029767bfd8bd72/proto-google-cloud-bigquerystorage-v1beta1-0.90.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-bigquerystorage-v1alpha2/0.90.0/2a1296e1b7d2dfe668813dbda443a3bcd477b35e/proto-google-cloud-bigquerystorage-v1alpha2-0.90.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-bigquerystorage-v1beta2/0.90.0/c59fa6741ab5d84a2de4c11f07f41510751e52c4/proto-google-cloud-bigquerystorage-v1beta2-0.90.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-cloud-bigquerystorage-v1/0.90.0/6c6431f01d5ed66f1c7cf4c5f95deda4f73876c4/proto-google-cloud-bigquerystorage-v1-0.90.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.api.grpc/proto-google-common-protos/1.18.0/b69bc5942255dc9dd2d2de432c4b99a5b483679/proto-google-common-protos-1.18.0.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf
-java/3.11.4/7ec0925cc3aef0335bbc7d57edfd42b0f86f8267/protobuf-java-3.11.4.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-bytebuddy-1_10_8/0.1/16661d7186f2d10440f1bb1c1e27e51f5306abe8/beam-vendor-bytebuddy-1_10_8-0.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.perfmark/perfmark-api/0.19.0/2bfc352777fa6e27ad1e11d11ea55651ba93236b/perfmark-api-0.19.0.jar:/Users/grahampolley/.m2/repository/com/google/code/findbugs/jsr305/3.0.2/jsr305-3.0.2.jar:/Users/grahampolley/.m2/repository/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.tukaani/xz/1.8/c4f7d054303948eb6a4066194253886c8af07128/xz-1.8.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.errorprone/error_prone_annotations/2.3.4/dac170e4594de319655ffb62f41cbd6dbb5e601e/error_prone_annotations-2.3.4.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpclient/4.5.11/f6d42fee5110c227bac18a550a297e028f2fb21a/httpclient-4.5.11.jar:/Users/grahampolley/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.6.2/bd1b74a5d170686362091c7cf596bbc3adf5c09b/log4j-api-2.6.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.conscrypt/conscrypt-openjdk-uber/2.2.1/59a346d64c0ddca750c5c877e274d5e6278e53ce/conscrypt-openjdk-uber-2.2.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.github.classgraph/classgraph/4.8.65/be28c46df75712f1fece48639c05633c8217f71/classgraph-4.8.65.jar:/Users/grahampolley/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/grahampolley/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/grahampolley/.m2/repository/com/thoughtworks/paranamer/paranamer/2.7/paranamer-2.7.jar:/Users/grahampolley/.gradle/caches/modul
es-2/files-2.1/org.apache.commons/commons-compress/1.20/b8df472b31e1f17c232d2ad78ceb1c84e00c641b/commons-compress-1.20.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.auto.value/auto-value-annotations/1.7.1/58d76a9ec581f7c6d33f3343de9b2ba04a0ae799/auto-value-annotations-1.7.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.flogger/google-extensions/0.5.1/4f1d862216754651fc1e1f2b614746810e68a4ff/google-extensions-0.5.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpcore/4.4.13/853b96d3afbb7bf8cc303fe27ee96836a10c1834/httpcore-4.4.13.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.j2objc/j2objc-annotations/1.3/ba035118bc8bac37d7eff77700720999acd9986d/j2objc-annotations-1.3.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.flogger/flogger-system-backend/0.5.1/b66d3bedb14da604828a8693bb24fd78e36b0e9e/flogger-system-backend-0.5.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.threeten/threetenbp/1.4.1/8e0de17c5077523503812837fa069459103fb714/threetenbp-1.4.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.13/3f18e1aa31031d89db6f01ba05d501258ce69d2c/commons-codec-1.13.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-lang3/3.5/6c6c702c89bfff3cd9e80b04d668c5e190d588c6/commons-lang3-3.5.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.code.gson/gson/2.8.6/9180733b7df8542621dc12e21e87557e8c99b8cb/gson-2.8.6.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.android/annotations/4.1.1.4/a1678ba907bf92691d879fef34e1a187038f9259/annotations-4.1.1.4.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-socks/4.1.42.Final/dd355f01dc00f93aaebe805b05026d7cf57c60ec/netty-codec-socks-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec/4.1.42.Final/b1d5ed85a558fbb
adc2783f869fbd0adcd32b07b/netty-codec-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-transport/4.1.42.Final/857502e863c02c829fdafea61c3fda6bda01d0af/netty-transport-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-buffer/4.1.42.Final/6e6fc9178d1f1401aa0d6b843341efb91720f2cd/netty-buffer-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/javax.annotation/javax.annotation-api/1.3.2/934c04d3cfef185a8008e7bf34331b79730a9d43/javax.annotation-api-1.3.2.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.guava/failureaccess/1.0.1/1dcf1de382a0bf95a3d8b0849546c88bac1292c9/failureaccess-1.0.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.guava/listenablefuture/9999.0-empty-to-avoid-conflict-with-guava/b421526c5f297295adef1c886e5246c39d4ac629/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.checkerframework/checker-qual/2.11.1/8c43bf8f99b841d23aadda6044329dad9b63c185/checker-qual-2.11.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.google.flogger/flogger/0.5.1/71d1e2cef9cc604800825583df56b8ef5c053f14/flogger-0.5.1.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.checkerframework/checker-compat-qual/2.5.5/435dc33e3019c9f019e15f01aa111de9d6b2b79c/checker-compat-qual-2.5.5.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/org.codehaus.mojo/animal-sniffer-annotations/1.18/f7aa683ea79dc6681ee9fb95756c999acbb62f5d/animal-sniffer-annotations-1.18.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/1.13.0/a9283170b7305c8d92d25aff02a6ab7e45d06cbe/okio-1.13.0.jar:/Users/grahampolley/.m2/repository/com/squareup/okhttp/okhttp/2.5.0/okhttp-2.5.0.jar:/Users/grahampolley/.m2/repository/junit/junit/4.12/junit-4.12.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-resolver/4.1.42.Final/ccaac
f418a9e486b65e82c47bed66439119c5fdb/netty-resolver-4.1.42.Final.jar:/Users/grahampolley/.gradle/caches/modules-2/files-2.1/io.netty/netty-common/4.1.42.Final/e02700b574d3a0e2100308f971f0753ac8700e7c/netty-common-4.1.42.Final.jar:/Users/grahampolley/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/Applications/IntelliJ
IDEA CE.app/Contents/lib/idea_rt.jar" org.polleyg.TemplatePipeline
--project=etl-demo-269105 --runner=DirectRunner --streaming=true
--tempLocation=gs://micro-batch-test/tmp
--stagingLocation=gs://micro-batch-test/tmp --workerZone=australia-southeast1-a
--region=asia-northeast1
Connected to the target VM, address: '127.0.0.1:50370', transport: 'socket'
15:25:23,603 0 [main] WARN
org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource - Created subscription
projects/etl-demo-269105/subscriptions/test-micro-batching_beam_-2917199999199772085
to topic projects/etl-demo-269105/topics/test-micro-batching. Note this
subscription WILL NOT be deleted when the pipeline terminates
15:25:35,827 12224
[direct-runner-worker] INFO org.apache.beam.sdk.io.gcp.bigquery.BatchLoads -
Writing BigQuery temporary files to
gs://micro-batch-test/tmp/BigQueryWriteTemp/beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8/
before loading them.
15:34:52,585 568982 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryRowWriter - Opening TableRowWriter
to
gs://micro-batch-test/tmp/BigQueryWriteTemp/beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8/ae94e5dc-68eb-4bb1-a1e2-1e747110be8e.
15:35:00,216
576613 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Waiting for jobs to
complete.
15:35:00,216 576613 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
failed. retrying.
15:35:01,532 577929 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.WriteTables - Loading 1 files into
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=etl_demo_fun,
projectId=etl-demo-269105, tableId=micro_batch_test_xxx}} using job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}}, job id iteration
0
15:35:02,408 578805 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - Started BigQuery
job: GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}}.
bq show -j --format=prettyjson --project_id=etl-demo-269105
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
15:35:02,408
578805 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
started
15:35:02,909 579306 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - BigQuery job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}} completed in state
DONE
15:35:03,659 580056 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - No BigQuery job with
job id
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
found.
15:35:03,659 580056 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - job id
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
not found, so retrying with that id
15:35:03,676 580073 [direct-runner-worker]
INFO org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Load job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
failed, will retry: {"errorResult":{"message":"Not found: Table
etl-demo-269105:etl_demo_fun.micro_batch_test_xxx was not found in location
australia-southeast1","reason":"notFound"},"errors":[{"message":"Not found:
Table etl-demo-269105:etl_demo_fun.micro_batch_test_xxx was not found in
location australia-southeast1","reason":"notFound"}],"state":"DONE"}. Next job
id
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
15:35:03,676
580073 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
failed. retrying.
15:35:05,818 582215 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.WriteTables - Loading 1 files into
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=etl_demo_fun,
projectId=etl-demo-269105, tableId=micro_batch_test_xxx}} using job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}}, job id iteration
0
15:35:06,475 582872 [direct-runner-worker] WARN
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request
failed with code 409, performed 0 retries due to IOExceptions, performed 0
retries due to unsuccessful status codes, HTTP framework says request can be
retried, (caller responsible for retrying):
https://bigquery.googleapis.com/bigquery/v2/projects/etl-demo-269105/jobs.
15:35:06,505 582902 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - BigQuery job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}} already exists,
will not retry inserting it:
com.google.api.client.googleapis.json.GoogleJsonResponseException: 409 Conflict
{ "code" : 409, "errors" : [ { "domain" : "global", "message" :
"Already Exists: Job
etl-demo-269105:australia-southeast1.beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0",
"reason" : "duplicate" } ], "message" : "Already Exists: Job
etl-demo-269105:australia-southeast1.beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0",
"status" : "ALREADY_EXISTS"} at
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
at
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:444)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1108) at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:542)
at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:475)
at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:592)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:244)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:229)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startLoadJob(BigQueryServicesImpl.java:170)
at
org.apache.beam.sdk.io.gcp.bigquery.WriteTables.lambda$startLoad$79287d2b$1(WriteTables.java:408)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJob.runJob(BigQueryHelpers.java:209)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJobManager.waitForDone(BigQueryHelpers.java:152)
at
org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.finishBundle(WriteTables.java:274)
at
org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn$DoFnInvoker.invokeFinishBundle(Unknown
Source) at
org.apache.beam.repackaged.direct_java.runners.core.SimpleDoFnRunner.finishBundle(SimpleDoFnRunner.java:237)
at
org.apache.beam.repackaged.direct_java.runners.core.SimplePushbackSideInputDoFnRunner.finishBundle(SimplePushbackSideInputDoFnRunner.java:124)
at
org.apache.beam.runners.direct.ParDoEvaluator.finishBundle(ParDoEvaluator.java:265)
at
org.apache.beam.runners.direct.DoFnLifecycleManagerRemovingTransformEvaluator.finishBundle(DoFnLifecycleManagerRemovingTransformEvaluator.java:73)
at
org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle(DirectTransformExecutor.java:188)
at
org.apache.beam.runners.direct.DirectTransformExecutor.run(DirectTransformExecutor.java:126)
at
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at
java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:264) at
java.base/java.util.concurrent.FutureTask.run(FutureTask.java) at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)15:35:06,523 582920
[direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
started15:35:07,041 583438 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - BigQuery job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}} completed in state
DONE15:35:07,730 584127 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - No BigQuery job with
job id
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
found.15:35:07,731 584128 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - job id
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
not found, so retrying with that id15:35:07,731 584128 [direct-runner-worker]
INFO org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Load job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
failed, will retry: {"errorResult":{"message":"Not found: Table
etl-demo-269105:etl_demo_fun.micro_batch_test_xxx was not found in location
australia-southeast1","reason":"notFound"},"errors":[{"message":"Not found:
Table etl-demo-269105:etl_demo_fun.micro_batch_test_xxx was not found in
location australia-southeast1","reason":"notFound"}],"state":"DONE"}. Next job
id
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-015:35:07,731
584128 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
failed. retrying.15:35:10,461 586858 [direct-runner-worker] INFO
[The identical cycle then repeats, unchanged, at 15:35:10,461, 15:35:14,707, 15:35:22,555 and 15:35:30,901 (and again at 15:35:32,588, where the log is truncated): WriteTables re-issues the load with the same job id beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0, always at "job id iteration 0"; the insert fails with 409 ALREADY_EXISTS and the same stack trace; the lookup reports "No BigQuery job with job id ... found"; and the load is retried with that same id. The "Next job id" logged on each retry is also identical to the failed one, i.e. the suffix is never incremented.]
608985 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - Job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
failed. retrying.15:35:44,165 620562 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.WriteTables - Loading 1 files into
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=etl_demo_fun,
projectId=etl-demo-269105, tableId=micro_batch_test_xxx}} using job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}}, job id iteration
015:35:44,688 621085 [direct-runner-worker] WARN
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request
failed with code 409, performed 0 retries due to IOExceptions, performed 0
retries due to unsuccessful status codes, HTTP framework says request can be
retried, (caller responsible for retrying):
https://bigquery.googleapis.com/bigquery/v2/projects/etl-demo-269105/jobs.
15:35:44,689 621086 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - BigQuery job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}} already exists,
will not retry inserting
it:com.google.api.client.googleapis.json.GoogleJsonResponseException: 409
Conflict{ "code" : 409, "errors" : [ { "domain" : "global", "message" :
"Already Exists: Job
etl-demo-269105:australia-southeast1.beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0",
"reason" : "duplicate" } ], "message" : "Already Exists: Job
etl-demo-269105:australia-southeast1.beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0",
"status" : "ALREADY_EXISTS"} at
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
at
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:444)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1108) at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:542)
at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:475)
at
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:592)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:244)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startJob(BigQueryServicesImpl.java:229)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl.startLoadJob(BigQueryServicesImpl.java:170)
at
org.apache.beam.sdk.io.gcp.bigquery.WriteTables.lambda$startLoad$79287d2b$1(WriteTables.java:408)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJob.runJob(BigQueryHelpers.java:209)
at
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJobManager.waitForDone(BigQueryHelpers.java:152)
at
org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.finishBundle(WriteTables.java:274)
at
org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn$DoFnInvoker.invokeFinishBundle(Unknown
Source) at
org.apache.beam.repackaged.direct_java.runners.core.SimpleDoFnRunner.finishBundle(SimpleDoFnRunner.java:237)
at
org.apache.beam.repackaged.direct_java.runners.core.SimplePushbackSideInputDoFnRunner.finishBundle(SimplePushbackSideInputDoFnRunner.java:124)
at
org.apache.beam.runners.direct.ParDoEvaluator.finishBundle(ParDoEvaluator.java:265)
at
org.apache.beam.runners.direct.DoFnLifecycleManagerRemovingTransformEvaluator.finishBundle(DoFnLifecycleManagerRemovingTransformEvaluator.java:73)
at
org.apache.beam.runners.direct.DirectTransformExecutor.finishBundle(DirectTransformExecutor.java:188)
at
org.apache.beam.runners.direct.DirectTransformExecutor.run(DirectTransformExecutor.java:126)
at
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at
java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:264) at
java.base/java.util.concurrent.FutureTask.run(FutureTask.java) at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)15:35:44,689 621086
[direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers - job
beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0
started15:35:45,758 622155 [direct-runner-worker] INFO
org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl - BigQuery job
GenericData{classInfo=[jobId, location, projectId],
{jobId=beam_load_templatepipelinegrahampolley0612052535e06af6fe_7e9fba10c9a24d50b81cdd6364d913c8_18825aa47e9218cc06def366031f7280_00001_00000-0,
location=australia-southeast1, projectId=etl-demo-269105}} completed in state
DONE{noformat}
> Beam does not set correct region for BigQuery when requesting load job status
> -----------------------------------------------------------------------------
>
> Key: BEAM-10248
> URL: https://issues.apache.org/jira/browse/BEAM-10248
> Project: Beam
> Issue Type: Bug
> Components: io-java-gcp
> Affects Versions: 2.22.0
> Environment: -Beam 2.22.0
> -DirectRunner & DataflowRunner
> -Java 11
> Reporter: Graham Polley
> Priority: P1
> Attachments: Untitled document.pdf
>
>
> I am using `FILE_LOADS` (micro-batching) from Pub/Sub and writing to
> BigQuery. My BigQuery dataset is in region `australia-southeast1`.
> If the load job into BigQuery fails for some reason (e.g. the table does not
> exist, or the schema has changed), BigQuery returns an error for the load.
> Beam then enters a retry loop for the load job. However, instead of using a
> new job id (with the suffix incremented by 1), it wrongly tries to reinsert
> the job under the same job id. This happens because, when checking whether the
> job id already exists, Beam does not take the dataset's region into account.
> It defaults to the US, but the job was created in `australia-southeast1`, so
> the lookup returns `null`.
> The bug is on line 308 of `BigQueryHelpers.java`. It should not return
> `null`; it does so only because it is looking in the wrong region.
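A minimal, hypothetical sketch (plain Java, not Beam's actual helper code) of why the lookup comes back `null`: BigQuery jobs are scoped to a project and location, so a lookup that silently defaults the location to the US cannot see a job created in `australia-southeast1`.

```java
import java.util.HashMap;
import java.util.Map;

public class JobLookupSketch {
    // Jobs keyed by "location:jobId", mirroring BigQuery's per-region job scoping.
    static final Map<String, String> JOBS = new HashMap<>();

    static {
        // A load job created in australia-southeast1.
        JOBS.put("australia-southeast1:beam_load_1", "DONE");
    }

    // Buggy lookup: defaults the location to "US".
    static String getJobStatus(String jobId) {
        return JOBS.get("US:" + jobId);
    }

    // Region-aware lookup: what the helper needs to do instead.
    static String getJobStatus(String location, String jobId) {
        return JOBS.get(location + ":" + jobId);
    }

    public static void main(String[] args) {
        System.out.println(getJobStatus("beam_load_1"));                          // null
        System.out.println(getJobStatus("australia-southeast1", "beam_load_1")); // DONE
    }
}
```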
> If I *test* using a BigQuery dataset in the US region, the job id is found
> correctly and the job is retried with a new job id (suffixed with the retry
> count). We have seen this bug in other areas of Beam before, and in other GCP
> tools/services, e.g. Cloud Composer.
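For context on the expected behavior: when the lookup succeeds, the next retry id is derived by incrementing the trailing retry suffix. A hypothetical sketch of that derivation (illustrative only, not Beam's actual code):

```java
public class RetryIdSketch {
    // Bump the trailing "-N" retry counter on a load job id.
    static String nextRetryJobId(String jobId) {
        int dash = jobId.lastIndexOf('-');
        int attempt = Integer.parseInt(jobId.substring(dash + 1));
        return jobId.substring(0, dash + 1) + (attempt + 1);
    }

    public static void main(String[] args) {
        // beam_load_example_00001_00000-0 -> beam_load_example_00001_00000-1
        System.out.println(nextRetryJobId("beam_load_example_00001_00000-0"));
    }
}
```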
> However, that leads to the next problem. Even with that fixed, the number of
> retries is set to `Integer.MAX_VALUE`, so Beam will keep retrying the job up
> to 2,147,483,647 times. This is not good.
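Rather than `Integer.MAX_VALUE` attempts, a bounded retry that rethrows the final failure would look roughly like this (a hypothetical sketch; `runWithRetries` and the exception wrapping are assumptions, not Beam's API):

```java
import java.util.concurrent.Callable;

public class BoundedRetrySketch {
    // Run the job at most maxRetries times; rethrow the last failure instead
    // of swallowing it, so callers can react to a permanently failing load.
    static <T> T runWithRetries(Callable<T> job, int maxRetries) throws Exception {
        Exception last = null;
        for (int attempt = 0; attempt < maxRetries; attempt++) {
            try {
                return job.call();
            } catch (Exception e) {
                last = e;
            }
        }
        throw new Exception("Load job failed after " + maxRetries + " attempts", last);
    }

    public static void main(String[] args) {
        // A job that always fails surfaces after 3 attempts instead of 2^31 - 1.
        try {
            runWithRetries(() -> { throw new IllegalStateException("notFound"); }, 3);
        } catch (Exception e) {
            System.out.println(e.getMessage()); // Load job failed after 3 attempts
        }
    }
}
```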
> The exception is swallowed by the Beam SDK and never propagated back up the
> stack, so if a load job fails, users have no way to catch and handle it.
> I attempted to set `withFailedInsertRetryPolicy()` to retry transient errors
> only, but it is not supported with `FILE_LOADS`. I also tried to get a handle
> on the error via the `WriteResult` object returned from the BigQuery
> sink/write, but that does not work either. Users need a way to catch and
> handle failed load jobs when using `FILE_LOADS`.
>
> {code:java}
> public class TemplatePipeline {
>     private static final String TOPIC =
>         "projects/etl-demo-269105/topics/test-micro-batching";
>     private static final String BIGQUERY_DESTINATION_FILTERED =
>         "etl-demo-269105:etl_demo_fun.micro_batch_test_xxx";
>
>     public static void main(String[] args) throws Exception {
>         try {
>             PipelineOptionsFactory.register(DataflowPipelineOptions.class);
>             DataflowPipelineOptions options = PipelineOptionsFactory
>                 .fromArgs(args)
>                 .withoutStrictParsing()
>                 .as(DataflowPipelineOptions.class);
>             Pipeline pipeline = Pipeline.create(options);
>             PCollection<PubsubMessage> messages = pipeline
>                 .apply(PubsubIO.readMessagesWithAttributes()
>                     .fromTopic(String.format(TOPIC, options.getProject())))
>                 .apply(Window.into(FixedWindows.of(Duration.standardSeconds(1))));
>             WriteResult result = messages.apply(ParDo.of(new RowToBQRow()))
>                 .apply(BigQueryIO.writeTableRows()
>                     .to(String.format(BIGQUERY_DESTINATION_FILTERED, options.getProject()))
>                     .withCreateDisposition(CREATE_NEVER)
>                     .withWriteDisposition(WRITE_APPEND)
>                     .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
>                     .withTriggeringFrequency(Duration.standardSeconds(5))
>                     .withNumFileShards(1)
>                     .withExtendedErrorInfo()
>                     .withSchema(getTableSchema()));
>             // result.getFailedInsertsWithErr().apply(ParDo.of(new DoFn<BigQueryInsertError, String>() {
>             //     @ProcessElement
>             //     public void processElement(ProcessContext c) {
>             //         for (ErrorProto err : c.element().getError().getErrors()) {
>             //             throw new RuntimeException(err.getMessage());
>             //         }
>             //     }
>             // }));
>             result.getFailedInserts().apply(ParDo.of(new DoFn<TableRow, String>() {
>                 @ProcessElement
>                 public void processElement(ProcessContext c) {
>                     System.out.println(c.element());
>                     c.output("foo");
>                     throw new RuntimeException("Failed to load");
>                 }
>
>                 @FinishBundle
>                 public void finishUp(FinishBundleContext finishBundleContext) {
>                     System.out.println("Got here");
>                 }
>             }));
>             pipeline.run();
>         } catch (Exception e) {
>             e.printStackTrace();
>             throw new Exception(e);
>         }
>     }
>
>     private static TableSchema getTableSchema() {
>         List<TableFieldSchema> fields = new ArrayList<>();
>         fields.add(new TableFieldSchema().setName("timestamp").setType("INTEGER"));
>         fields.add(new TableFieldSchema().setName("payload").setType("STRING"));
>         return new TableSchema().setFields(fields);
>     }
>
>     public static class RowToBQRow extends DoFn<PubsubMessage, TableRow> {
>         @ProcessElement
>         public void processElement(ProcessContext c) {
>             String payload = new String(c.element().getPayload(), StandardCharsets.UTF_8);
>             c.output(new TableRow()
>                 .set("timestamp", System.currentTimeMillis())
>                 .set("payload", payload));
>         }
>     }
> }{code}
> The stack trace is attached, showing the same job id being reinserted into
> BigQuery on each retry.
>
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)