damccorm opened a new issue, #21155:
URL: https://github.com/apache/beam/issues/21155

   
https://stackoverflow.com/questions/69863008/call-the-bigquery-stored-procedure-in-dataflow-pipeline
   
   I have written a stored procedure in BigQuery and am trying to call it from a 
Dataflow pipeline. This works for `SELECT` queries but not for the stored 
procedure:
   
   ```python
   import apache_beam as beam

   pipeLine = beam.Pipeline(options=options)
   rawdata = (
       pipeLine
       | beam.io.ReadFromBigQuery(
           query="CALL my_dataset.create_customer()",
           use_standard_sql=True)
   )

   pipeLine.run().wait_until_finish()
   ```
   
   
   Stored procedure:
   
   ```sql
   CREATE OR REPLACE PROCEDURE my_dataset.create_customer()
   BEGIN
       SELECT *
       FROM `project_name.my_dataset.my_table`
       WHERE customer_name LIKE "%John%"
       ORDER BY created_time
       LIMIT 5;
   END;
   ```
   
   
   I am able to create the stored procedure and call it from the BigQuery 
console. But in the Dataflow pipeline, it throws an error:
   > "code": 400,
   > "message": "configuration.query.destinationEncryptionConfiguration cannot be set for scripts",
   > "domain": "global",
   > "reason": "invalid",
   > "status": "INVALID_ARGUMENT"
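
   The error indicates that BigQuery treats a `CALL` statement as a script, and scripts reject the destination-table/encryption configuration that `ReadFromBigQuery` sets when exporting query results. A possible workaround (an assumption on my part, not confirmed in this issue) is to pass the procedure's underlying `SELECT` to `ReadFromBigQuery` instead of the `CALL`. A minimal sketch, with the Beam usage kept in a hypothetical `build_pipeline` helper so the snippet itself does not require GCP credentials:

   ```python
   def inline_procedure_query():
       """Return the SELECT wrapped by my_dataset.create_customer().

       BigQuery runs CALL statements as scripts, and scripts cannot carry
       the destination/encryption settings ReadFromBigQuery applies, so the
       plain SELECT is submitted instead of the CALL.
       """
       return (
           "SELECT * "
           "FROM `project_name.my_dataset.my_table` "
           'WHERE customer_name LIKE "%John%" '
           "ORDER BY created_time "
           "LIMIT 5"
       )


   def build_pipeline(options):
       """Hypothetical usage; requires apache-beam[gcp] and GCP credentials."""
       import apache_beam as beam

       pipe_line = beam.Pipeline(options=options)
       _ = (
           pipe_line
           | beam.io.ReadFromBigQuery(
               query=inline_procedure_query(),
               use_standard_sql=True)
       )
       return pipe_line
   ```

   This only helps when the procedure wraps a single result-producing `SELECT`, as it does here; a procedure with side effects or multiple statements would still need to run as a script outside the read connector.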
   
    
   
   Imported from Jira 
[BEAM-13194](https://issues.apache.org/jira/browse/BEAM-13194). Original Jira 
may contain additional context.
   Reported by: ahalya_h.

