[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2020-06-10 Thread Beam JIRA Bot (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17131260#comment-17131260
 ] 

Beam JIRA Bot commented on BEAM-8089:
-

This issue was marked "stale-assigned" and has not received a public comment in 
7 days. It is now automatically unassigned. If you are still working on it, you 
can assign it to yourself again. Please also give an update about the status of 
the work.

> Error while using customGcsTempLocation() with Dataflow
> ---
>
> Key: BEAM-8089
> URL: https://issues.apache.org/jira/browse/BEAM-8089
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.13.0
>Reporter: Harshit Dwivedi
>Priority: P2
>
> I have the following code snippet which writes content to BigQuery via File 
> Loads.
> Currently the files are being written to a GCS Bucket, but I want to write 
> them to the local file storage of Dataflow instead and want BigQuery to load 
> data from there.
>  
> {code:java}
> BigQueryIO
>     .writeTableRows()
>     .withNumFileShards(100)
>     .withTriggeringFrequency(Duration.standardSeconds(90))
>     .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
>     .withSchema(getSchema())
>     .withoutValidation()
>     .withCustomGcsTempLocation(new ValueProvider<String>() {
>         @Override
>         public String get() {
>             return "/home/harshit/testFiles";
>         }
> 
>         @Override
>         public boolean isAccessible() {
>             return true;
>         }
>     })
>     .withTimePartitioning(new TimePartitioning().setType("DAY"))
>     .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
>     .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
>     .to(tableName);
> {code}
>  
> On running this, I don't see any files being written to the provided path and 
> the BQ load jobs fail with an IOException.
>  
> I looked at the docs, but I was unable to find any working example for this.
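A side note on the snippet in the report: it instantiates a raw `ValueProvider` anonymously and returns a worker-local path, whereas `withCustomGcsTempLocation()` expects a `gs://` URI that all workers (and BigQuery) can reach. Below is a minimal, self-contained sketch of the static-value pattern. The `ValueProvider` interface is re-declared locally only so the example compiles without Beam on the classpath (in a real pipeline you would use Beam's `ValueProvider.StaticValueProvider.of(...)`), and the bucket name is hypothetical:

```java
// Minimal stand-in for org.apache.beam.sdk.options.ValueProvider,
// declared locally so this sketch compiles without Beam jars.
public class Main {
    interface ValueProvider<T> {
        T get();
        boolean isAccessible();
    }

    // Equivalent in spirit to ValueProvider.StaticValueProvider.of(path).
    static ValueProvider<String> staticOf(final String path) {
        return new ValueProvider<String>() {
            @Override public String get() { return path; }
            @Override public boolean isAccessible() { return true; }
        };
    }

    public static void main(String[] args) {
        // The temp location must be a gs:// URI, not a worker-local path;
        // the bucket name below is hypothetical.
        ValueProvider<String> tempLocation =
                staticOf("gs://my-temp-bucket/bq-load-tmp");
        System.out.println(tempLocation.get());
    }
}
```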



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2020-06-01 Thread Kenneth Knowles (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17121920#comment-17121920
 ] 

Kenneth Knowles commented on BEAM-8089:
---

This issue is assigned but has not received an update in 30 days so it has been 
labeled "stale-assigned". If you are still working on the issue, please give an 
update and remove the label. If you are no longer working on the issue, please 
unassign so someone else may work on it. In 7 days the issue will be 
automatically unassigned.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Chamikara Jayalath (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916117#comment-16916117
 ] 

Chamikara Jayalath commented on BEAM-8089:
--

Have you tried running Dataflow in the same region as where your bucket is 
located, using option [1]? Network charges should not apply in this case, 
according to [2].

 

[1] 
[https://github.com/apache/beam/blob/master/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/options/DataflowPipelineOptions.java#L133]

[2] [https://cloud.google.com/storage/pricing]
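To make the colocation suggestion concrete: the option referenced in [1] corresponds to the Dataflow `--region` pipeline flag. A hedged sketch of the flags involved is below; the region and bucket values are hypothetical, and the commented-out line shows where Beam's `PipelineOptionsFactory` would normally consume them:

```java
// Sketch of pipeline flags that keep Dataflow workers and the GCS
// bucket in the same region, so GCS traffic stays intra-region.
// Flag names follow DataflowPipelineOptions; values are hypothetical.
public class Main {
    public static void main(String[] args) {
        String[] pipelineArgs = {
            "--runner=DataflowRunner",
            "--region=us-central1",                  // match the bucket's region
            "--tempLocation=gs://my-temp-bucket/tmp" // bucket in us-central1
        };
        // In a real pipeline:
        // PipelineOptions opts = PipelineOptionsFactory.fromArgs(pipelineArgs).create();
        for (String flag : pipelineArgs) {
            System.out.println(flag);
        }
    }
}
```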



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Harshit Dwivedi (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916053#comment-16916053
 ] 

Harshit Dwivedi commented on BEAM-8089:
---

The data ingested into GCS is around 250 GB per day for us, so we are incurring 
a lot of network charges.

I wanted to avoid this charge by storing everything on the Dataflow persistent 
disk instead of GCS.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Chamikara Jayalath (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916052#comment-16916052
 ] 

Chamikara Jayalath commented on BEAM-8089:
--

BTW, may I ask why you cannot use GCS in this case? Dataflow already needs GCS 
to run, and storage costs should be minimal.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Chamikara Jayalath (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916051#comment-16916051
 ] 

Chamikara Jayalath commented on BEAM-8089:
--

I don't think we can fork the Beam code for a very specific scenario of the 
Dataflow runner (a single worker with autoscaling disabled). In general, 
Dataflow does not fuse the step that writes the files and the step that 
executes the BQ load job, so these two steps may not execute on the same worker.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Harshit Dwivedi (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916037#comment-16916037
 ] 

Harshit Dwivedi commented on BEAM-8089:
---

For my use case, I have a single worker, and since that runs on a single VM, 
would it be possible to implement this?

I have disabled autoscaling for my use case, so the Dataflow job will always 
run on a single VM.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Chamikara Jayalath (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916033#comment-16916033
 ] 

Chamikara Jayalath commented on BEAM-8089:
--

True, it seems like this is supported in a limited way (wildcards are not 
supported, for example).

 

I think Beam will have a hard time supporting this, since most Beam runners are 
distributed and use multiple nodes to write data (to files) in parallel, so 
there is no "single" local disk. This is why we use a distributed storage 
location to which all workers have access for writing individual files (a 
directory in GCS in this case) and execute a single BQ load job for all the 
files from there.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Harshit Dwivedi (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16916022#comment-16916022
 ] 

Harshit Dwivedi commented on BEAM-8089:
---

But the BQ documentation says that this can be done:

[https://cloud.google.com/bigquery/docs/loading-data-local]



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Chamikara Jayalath (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16915992#comment-16915992
 ] 

Chamikara Jayalath commented on BEAM-8089:
--

BQ cannot execute load jobs from local files. Files have to be in GCS.

 

So I think this is working as intended.



[jira] [Commented] (BEAM-8089) Error while using customGcsTempLocation() with Dataflow

2019-08-26 Thread Jira


[ 
https://issues.apache.org/jira/browse/BEAM-8089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16915624#comment-16915624
 ] 

Ismaël Mejía commented on BEAM-8089:


Reassigned to you, [~chamikara]; maybe you can take a look or find someone to 
do so.
