[ https://issues.apache.org/jira/browse/BEAM-6514?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16752966#comment-16752966 ]

Rumeshkrishnan commented on BEAM-6514:
--------------------------------------

For more details, see:
https://github.com/GoogleCloudPlatform/DataflowJavaSDK/issues/609
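
As a workaround until the SDK cleans up on failure, the temporary dataset can
be deleted manually after a non-successful run. A minimal sketch using the
google-cloud-bigquery Java client (the dataset name below is a placeholder;
Beam generates its temp dataset name internally, so it has to be looked up
first, e.g. in the BigQuery UI):

{code:java}
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;

public class CleanupOnFailure {
  public static void runAndCleanUp(Pipeline pipeline) {
    PipelineResult result = pipeline.run();
    PipelineResult.State state = result.waitUntilFinish();

    // If the job did not succeed, delete the leftover temp dataset ourselves.
    if (state != PipelineResult.State.DONE) {
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
      // "temp_dataset_beam_6514" is a placeholder; substitute the real temp
      // dataset name before deleting.
      bigquery.delete(
          "temp_dataset_beam_6514",
          BigQuery.DatasetDeleteOption.deleteContents());
    }
  }
}
{code}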

> Dataflow Batch Job Failure is leaving Datasets/Tables behind in BigQuery
> ------------------------------------------------------------------------
>
>                 Key: BEAM-6514
>                 URL: https://issues.apache.org/jira/browse/BEAM-6514
>             Project: Beam
>          Issue Type: Bug
>          Components: io-java-gcp
>            Reporter: Rumeshkrishnan
>            Assignee: Chamikara Jayalath
>            Priority: Major
>
> Dataflow leaves temporary datasets and tables behind in BigQuery when a 
> pipeline is cancelled or when it fails. I cancelled one job and another 
> failed at run time; in both cases a dataset and table were left behind in 
> BigQuery.
>  # The `cleanupTempResource` method deletes the temporary tables and dataset 
> only after the batch job succeeds.
>  # If the job fails mid-run or is cancelled explicitly, the temporary 
> dataset and tables remain. The temporary table is created with a one-day 
> expiration, per the code in the `getTableToExtract` function in 
> BigQueryQuerySource.java (a sketch follows below this quoted description).
>  # I understand that keeping the temp tables and dataset around on failure 
> can help with debugging.
>  # Could we have an optional pipeline or job parameter that cleans up the 
> temporary dataset and tables when the job is cancelled or fails?
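
On point 2 in the quoted description: a one-day expiration means leftover temp
tables do eventually self-delete, but table expiration does not remove the
now-empty dataset. As a hedged illustration (not the actual Beam code; all
names here are placeholders), this is roughly how such an expiration can be
attached to a table with the google-cloud-bigquery client:

{code:java}
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardSQLTypeName;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;
import java.util.concurrent.TimeUnit;

public class TempTableExpiration {
  public static void createExpiringTempTable() {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    Schema schema = Schema.of(Field.of("value", StandardSQLTypeName.STRING));
    TableInfo tempTable =
        TableInfo.newBuilder(
                TableId.of("my_temp_dataset", "my_temp_table"),
                StandardTableDefinition.of(schema))
            // The table self-deletes after one day, even if the job never
            // reaches its success-path cleanup.
            .setExpirationTime(
                System.currentTimeMillis() + TimeUnit.DAYS.toMillis(1))
            .build();
    bigquery.create(tempTable);
  }
}
{code}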



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
