Hi all,

I have a question about sinking a very high-volume stream to BigQuery.

I will be reading messages from a Pub/Sub topic and writing them to
BigQuery. In this streaming job I am worried about hitting the BigQuery
streaming inserts limit of 1 GB per second on the streaming API.
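
For reference, here is roughly the pipeline I have in mind. This is only a
minimal sketch: the project, topic, and table names are placeholders, and I
assume a pre-existing destination table with a single STRING column called
"payload".

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.StreamingOptions;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubsubToBigQuery {
  public static void main(String[] args) {
    StreamingOptions options =
        PipelineOptionsFactory.fromArgs(args).as(StreamingOptions.class);
    options.setStreaming(true);

    Pipeline p = Pipeline.create(options);
    p.apply("ReadFromPubsub",
            PubsubIO.readStrings().fromTopic("projects/my-project/topics/my-topic"))
        // A real pipeline would parse each message into typed columns; a
        // single pass-through column keeps the sketch self-contained.
        .apply("ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via((String msg) -> new TableRow().set("payload", msg)))
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table")
                .withCreateDisposition(CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(WriteDisposition.WRITE_APPEND));
    p.run();
  }
}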

I am firstly unsure whether Beam even uses that API here, or whether it
writes files to a temporary location and commits them with load jobs at
intervals. That brings me to another question: do I have to do windowing
myself to stay under the 1 GB per second limit?
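
From the BigQueryIO javadoc it looks like the write method can be chosen
explicitly via withMethod(...), with STREAMING_INSERTS and FILE_LOADS as the
options. If load jobs are the safer path at this volume, I would try swapping
the write step in the sketch above for something like the following; the
triggering frequency and shard count are guesses on my part, not tuned
values:

        // Same write as above, but requesting periodic load jobs instead of
        // streaming inserts. Also requires: import org.joda.time.Duration;
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table")
                .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
                // In streaming mode, FILE_LOADS needs an explicit flush interval.
                .withTriggeringFrequency(Duration.standardMinutes(5))
                // Number of files staged per flush; an illustrative value.
                .withNumFileShards(100)
                .withCreateDisposition(CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(WriteDisposition.WRITE_APPEND));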

Please advise. Thanks
