ramesh-aqfer commented on issue #13990:
URL: https://github.com/apache/iceberg/issues/13990#issuecomment-3261548028

   Thanks @talatuyarer for the details.
   
   > BigQueryMetastoreCatalog : The workflow is to create and manage your 
Iceberg tables directly within BigQuery using standard SQL DDL. These tables 
are fully read/write, show up in the UI, and get all the native BigQuery 
features like fine-grained access control. You then use the 
BigQueryMetastoreCatalog from open-source engines like Spark or Flink to read 
these BigQuery-managed tables.
   
   Our use case is the opposite: we process the data and want to write the final
output to an Iceberg table, which our customers will later query in BigQuery.
From BigQuery's perspective this is an externally managed Iceberg table. Based
on your clarification, my understanding is that we should use the BigLake REST
catalog to register our updates in BigLake, but we need to wait for the
BigLake - BigQuery integration before our customers can query this data in
BigQuery. Is that correct?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

