Good morning,

In my Google Cloud big data project we need to read batch data from the VTEX
platform every 15 minutes and write the JSON responses to our Cloud Storage
data lake.
VTEX makes the data available through HTTP GET requests to its API:
https://documenter.getpostman.com/view/487146/vtex-oms-api/6tjSKqi
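For context, a plain HTTP call against the OMS API looks roughly like the
sketch below; the account name, credentials, endpoint and query parameters are
only placeholders based on my reading of that documentation:

    import requests

    # Placeholder account and credentials; the real values come from VTEX.
    VTEX_ACCOUNT = "myaccount"
    HEADERS = {
        "X-VTEX-API-AppKey": "my-app-key",
        "X-VTEX-API-AppToken": "my-app-token",
        "Accept": "application/json",
    }

    # Orders list endpoint, as I understand it from the OMS documentation.
    url = f"https://{VTEX_ACCOUNT}.vtexcommercestable.com.br/api/oms/pvt/orders"

    response = requests.get(url, headers=HEADERS, params={"per_page": 100}, timeout=30)
    response.raise_for_status()

    # The orders appear to come wrapped in a "list" field of the JSON response.
    orders = response.json().get("list", [])
    print(len(orders))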

I have not found a good way to do this reading with the standard Apache Beam
IO components. My idea is to develop an HttpIO component, using the
Source/Sink APIs, that reads from the HTTP endpoint and writes a JSON file to
Cloud Storage. After that, another pipeline will process the file and write it
to BigQuery.
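A very rough sketch of the alternative I have been considering in the Python
SDK, using a plain ParDo to make the request instead of a full custom
Source/Sink (the endpoint, credentials and bucket path below are placeholders):

    import json

    import requests
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    class FetchOrdersFn(beam.DoFn):
        """Calls the VTEX OMS API and emits each order as a JSON string."""

        def __init__(self, base_url, app_key, app_token):
            self.base_url = base_url
            self.app_key = app_key
            self.app_token = app_token

        def process(self, _):
            headers = {
                "X-VTEX-API-AppKey": self.app_key,
                "X-VTEX-API-AppToken": self.app_token,
                "Accept": "application/json",
            }
            resp = requests.get(self.base_url, headers=headers, timeout=30)
            resp.raise_for_status()
            for order in resp.json().get("list", []):
                yield json.dumps(order)


    def run():
        options = PipelineOptions()  # project, runner, temp_location, etc.
        with beam.Pipeline(options=options) as p:
            (
                p
                # A single dummy element just to trigger the HTTP call.
                | "Start" >> beam.Create([None])
                | "FetchOrders" >> beam.ParDo(
                    FetchOrdersFn(
                        "https://myaccount.vtexcommercestable.com.br/api/oms/pvt/orders",
                        "my-app-key",
                        "my-app-token",
                    )
                )
                # Write newline-delimited JSON to the data lake (placeholder path).
                | "WriteToGCS" >> beam.io.WriteToText(
                    "gs://my-datalake/vtex/orders", file_name_suffix=".json"
                )
            )


    if __name__ == "__main__":
        run()

The idea would be to trigger this batch pipeline every 15 minutes from an
external scheduler, but I am not sure whether that is good practice or whether
a proper custom Source is the right way to go.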

Can you please help me with this matter? Is that the best way to go? Do you
have any idea how to develop this new component?

Thank you.

Hugs,

Daniel Salerno de Arruda
