mataralhawiti commented on issue #23029:
URL: https://github.com/apache/beam/issues/23029#issuecomment-2520519131

   > > > > I'm facing the same issue where it tries to infer the schema during
   > > > > pipeline submission from the local machine (which doesn't have access
   > > > > to the DB server).
   > > >
   > > > Hi Matar, did you ever manage to find a fix for this? We are running
   > > > into the exact same issue.
   > >
   > > Hey @RhysGrimshaw, as of now the only option is to open the connection
   > > between the machine submitting the job (your local machine if you submit
   > > manually, or your Dataflow VMs' subnet) and the DB server. For example,
   > > in my case I opened the connection from our Dataflow subnet to the DB
   > > server.
   >
   > Thank you for getting back to me. How do you go about running a Python
   > script directly from the subnet? The only options I seem to have available
   > are from a template/builder, or by reusing a Dataflow job. But since my job
   > fails before it gets to Dataflow (due to it trying to infer the schema from
   > a local machine connection), I can't build a new job from this.
   
   @RhysGrimshaw For testing purposes, I opened the connection from my local
   machine to the DB server. Production-wise, we submit our jobs automatically
   via Cloud Scheduler & Cloud Composer. So the point to remember is that your
   Dataflow VM (or whichever machine submits the job) needs to be able to access
   the DB server for the initial schema inference.
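
   To illustrate what "initial schema inference" means in practice, here is a
   minimal sketch (not a drop-in pipeline) using the cross-language
   `apache_beam.io.jdbc.ReadFromJdbc` transform; the project, subnetwork, JDBC
   URL, credentials, and table name below are all hypothetical placeholders.
   The transform is expanded while the pipeline graph is built, and that
   expansion connects to the database to infer the row schema, so it is the
   machine running this script (laptop, Composer worker, etc.) that needs DB
   connectivity, not only the Dataflow workers.

   ```python
   # Sketch only -- all names, hosts, and credentials are placeholders.
   import apache_beam as beam
   from apache_beam.io.jdbc import ReadFromJdbc
   from apache_beam.options.pipeline_options import PipelineOptions

   options = PipelineOptions(
       runner='DataflowRunner',
       project='my-project',                                    # placeholder
       region='us-central1',                                    # placeholder
       temp_location='gs://my-bucket/tmp',                      # placeholder
       subnetwork='regions/us-central1/subnetworks/my-subnet',  # placeholder
   )

   with beam.Pipeline(options=options) as pipeline:
       # ReadFromJdbc is a cross-language (Java) transform. It is expanded here,
       # while the pipeline graph is being constructed, and that expansion opens
       # a JDBC connection to infer the output row schema. So the machine
       # executing this script must be able to reach the DB host, even though
       # the actual read later runs on Dataflow workers.
       rows = pipeline | 'ReadOrders' >> ReadFromJdbc(
           table_name='orders',                                 # placeholder
           driver_class_name='org.postgresql.Driver',
           jdbc_url='jdbc:postgresql://db.internal:5432/mydb',  # placeholder
           username='beam_user',                                # placeholder
           password='change-me',                                # placeholder
       )
       rows | 'Print' >> beam.Map(print)
   ```

   Submitting this from Cloud Composer works in our setup because the Composer
   workers sit in a subnet that can reach the DB server; running the same
   script from a laptop without that access fails before the job ever reaches
   Dataflow, which is the failure @RhysGrimshaw described.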

