I'm developing a GAE application that needs to extract, transform, and load 
data from Highwinds into BigQuery every half hour. It also has to respond 
to queries against the processed data, i.e. the data in BigQuery.

I have a default module that handles the queries (automatic scaling, 
instance class F1) and another module named heavy-workers that is 
responsible for the ETL task (manual scaling, instance class B8).
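For reference, the heavy-workers module is configured roughly like this (a sketch; the runtime, filename, and handler script are assumptions, not my exact config):

```yaml
# heavy-workers.yaml (assumed filename)
module: heavy-workers
runtime: python27
api_version: 1
instance_class: B8
manual_scaling:
  instances: 1

handlers:
- url: /.*
  script: etl.app   # hypothetical WSGI entry point for the ETL job
```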

The ETL task runs in a background job that connects to the FTP server, 
iterates through the files, and processes each one. Processing a file 
consists of downloading it, parsing it, and transforming it into a 
BigQuery-supported format. I'm using the streaming insert API to load the 
data into BigQuery, but I read that this is not the best alternative for 
batch processing. I don't understand how to do it otherwise, and would 
really appreciate some help here.
Source: 
https://cloud.google.com/developers/articles/bigquery-in-practice/#h.757bq5t4k2m8
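To make the batch-load alternative concrete: the idea from that article, as I understand it, is to stage each half-hourly batch in Google Cloud Storage as newline-delimited JSON and then submit a single load job per batch, instead of streaming row by row. A minimal sketch of the two pure pieces — the serialization and the REST `jobs.insert` request body — where the record dicts and table names are hypothetical (the actual GCS upload and `jobs.insert` call would go through your API clients):

```python
import json


def records_to_ndjson(records):
    """Serialize parsed records into newline-delimited JSON,
    the file format BigQuery load jobs accept."""
    return "".join(json.dumps(rec, sort_keys=True) + "\n" for rec in records)


def build_load_job_body(gcs_uri, project_id, dataset_id, table_id):
    """Request body for the BigQuery REST API jobs.insert endpoint,
    describing one load job that appends a staged NDJSON file."""
    return {
        "configuration": {
            "load": {
                "sourceUris": [gcs_uri],
                "sourceFormat": "NEWLINE_DELIMITED_JSON",
                "destinationTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": table_id,
                },
                # Append each batch to the existing table; load jobs are
                # free and better suited to bulk data than streaming inserts.
                "writeDisposition": "WRITE_APPEND",
            }
        }
    }
```

So per half-hour run there would be one GCS object and one load job, rather than thousands of streaming-insert requests.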

Another issue I ran into: when running multiple background threads that 
call ndb.put_multi, I get "Process terminated because the backend was 
stopped".
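My current guess (an assumption, not something I've confirmed) is that manually scaled instances get only a short grace period before being stopped, so a large ndb.put_multi from several threads can be cut off mid-flight. Writing in small, checkpointable chunks would keep each RPC short, so a restart loses at most one chunk:

```python
def chunked(entities, size=100):
    """Yield fixed-size slices so each ndb.put_multi call stays small
    and a backend shutdown loses at most one in-flight chunk."""
    for i in range(0, len(entities), size):
        yield entities[i:i + size]


# Hypothetical usage inside a worker thread:
# for batch in chunked(parsed_entities):
#     ndb.put_multi(batch)           # one short RPC per batch
#     record_checkpoint(batch[-1])   # hypothetical resume marker
```

Is that the right way to think about it, or is something else killing the backend?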
