Hey guys, I'm trying to migrate to the new GCS Client Library to update my Datastore > GCS CSV file > BigQuery task. It's a cron that runs every night and pushes all new data over to BigQuery. I've always had issues with timeouts and file errors (hello 107!), and I'm hoping the new client library can take care of them. When exporting a large amount of data, we always run up against the 30-second AppEngineFile limit and need to reconnect. Previously, I just scheduled a new Task, passed along the filename and web-safe Cursors, and picked it up from there.
Looking at GcsOutputChannel, the docs say the object itself is Serializable and can be reused after serialization, as long as the file hasn't been finalized. Do I have to write my own serialization methods for this, or is there something like the Cursor.toWebSafeString()/fromWebSafeString() functions to take care of that? (I don't see anything in the JavaDocs.) If I have to write my own, would I use some sort of ObjectOutputStream > ByteArrayOutputStream, add that to my TaskOptions, then pull it back in the other way? Do I need to worry about encoding? Would firing it into Memcache with a unique key be a better approach than going the ObjectOutputStream route? Thanks! E
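For what it's worth, here is a rough sketch of the ObjectOutputStream > ByteArrayOutputStream approach I have in mind. The helper names (`toWebSafeString`/`fromWebSafeString`) are my own, not library API, and I'm demonstrating the round trip with a plain String rather than a real GcsOutputChannel; the idea is that any Serializable object (which GcsOutputChannel is claimed to be) would go through the same path. URL-safe Base64 should take care of the encoding worry when stuffing the bytes into a Task parameter (note: `java.util.Base64` needs Java 8+):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;

public class ChannelSerializer {

    // Serialize any Serializable object (e.g. an unfinalized GcsOutputChannel)
    // into a URL-safe Base64 string that can ride along as a Task parameter.
    public static String toWebSafeString(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        // URL-safe alphabet avoids '+' and '/' so no extra escaping is needed.
        return Base64.getUrlEncoder().encodeToString(bos.toByteArray());
    }

    // Decode the Base64 string back and deserialize the original object.
    public static Object fromWebSafeString(String s)
            throws IOException, ClassNotFoundException {
        byte[] bytes = Base64.getUrlDecoder().decode(s);
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the channel state handed from one Task to the next.
        String state = "resume-state";
        String encoded = toWebSafeString(state);
        String decoded = (String) fromWebSafeString(encoded);
        System.out.println(decoded.equals(state));
    }
}
```

In the follow-up Task you'd cast the result of `fromWebSafeString(...)` back to GcsOutputChannel and keep writing. Memcache with a unique key would work the same way, just storing the raw `byte[]` instead of a Base64 string, with the usual caveat that Memcache entries can be evicted.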
