I have been working on implementing a pipeline that pushes data from
App Engine into BigQuery. I am defining my mapper pipeline class in the
same manner as in this article
https://developers.google.com/bigquery/articles/datastoretobigquery#definepipeline
except that I am also passing a "filters" param to the input_reader.
Specifically, I am passing this:
params = {
    "input_reader": {
        "entity_kind": entity_type,
        "filters": [('pid', '=', pid), ('tags', '=', '2000')]
    },
    "output_writer": {
        "filesystem": "gs",
        "gs_bucket_name": settings.GOOGLE_CLOUD_STORAGE_BUCKET_NAME,
        "output_sharding": "input"
    }
}
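For context, here is a minimal sketch of what I suspect is happening: pipeline params get serialized to JSON somewhere along the way, and JSON has no tuple type, so tuples come back as lists. (The entity kind and pid values below are placeholders, not my real data.)

```python
import json

# Hypothetical reproduction: pipeline params round-tripped through JSON.
# JSON has no tuple type, so every tuple is deserialized as a list.
params = {
    "input_reader": {
        "entity_kind": "MyEntity",  # placeholder for entity_type
        "filters": [("pid", "=", 123), ("tags", "=", "2000")],
    }
}

round_tripped = json.loads(json.dumps(params))
first_filter = round_tripped["input_reader"]["filters"][0]
print(type(first_filter).__name__)  # prints "list", not "tuple"
```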
When I do that, however, lines 532 and 533 of mapreduce/input_readers.py
raise an exception stating that "Filter should be a tuple", and the
traceback shows my first filter, but represented as a list instead of a
tuple. Commenting out those two lines in input_readers.py allows the
pipeline to proceed and everything works as expected. Am I doing
something wrong here, or is there a bug that is converting my tuples
into lists?
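In case it helps anyone hitting the same error, a less invasive workaround than commenting out the validation might be to coerce the filters back to tuples before the reader checks them. This is just a sketch under my assumption that the tuples are being lost in a serialization round trip; coerce_filters is a hypothetical helper, not part of the mapreduce library.

```python
def coerce_filters(filters):
    """Hypothetical helper: rebuild tuples from the lists that a JSON
    round trip produces, so the reader's tuple check passes."""
    return [tuple(f) for f in filters]

# After deserialization the filters arrive as lists:
deserialized = [["pid", "=", 123], ["tags", "=", "2000"]]
fixed = coerce_filters(deserialized)
print(all(isinstance(f, tuple) for f in fixed))  # prints True
```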
Thanks,
Jeff.
--
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.