Hi Tim,

I want to thank Zweitze for putting in his two cents; his first-hand
experience with such a system is very valuable.  To address your
questions:

1) I am not aware of any problems or gotchas with report deletion.

2) See above.

3) A new report will only replace a completed report, not a running
one.

4) Report generation is a read-only process, so there should be no
locking or concurrency issues.

5) Unfortunately I can't provide any details about the internal
resource allocations done by the API.

6) While I think your strategy is sound, you may want to consider
running fewer than 15 threads.  At that level there is a higher risk
of hitting a strange corner case, especially in a multi-threaded
environment, and unless you absolutely need the extra bandwidth it
may be best to operate somewhere under the system's maximum
threshold.
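To keep concurrency comfortably below the limit, a bounded worker pool
is enough. Here is a minimal sketch; the body of fetch_report is a
placeholder for your schedule/poll/download/delete sequence, not a real
AdWords API client call:

```python
import concurrent.futures

MAX_WORKERS = 8  # stay comfortably below the 15-report ceiling

def fetch_report(report_id):
    # Placeholder: a real worker would schedule the report, poll until
    # it completes, download it, delete it, and process the result.
    return "report-%d-data" % report_id

def fetch_all(report_ids):
    # The executor caps the number of in-flight reports, so the
    # system's "max 15" threshold is never approached.
    with concurrent.futures.ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        return list(pool.map(fetch_report, report_ids))
```

Because pool.map preserves input order, results come back in the same
order the report IDs were submitted, regardless of which thread
finished first.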

Best,
- Eric Koleda, AdWords API Team


On Dec 5, 6:30 pm, Zweitze <[email protected]> wrote:
> I'm almost following that scheme, downloading and processing dozens of
> reports a day.
>
> Before I answer your questions, note that I don't use an MCC login;
> instead I use the account logins (we ask the customer to add an email
> address to their account). We have some reasons to do this, but
> that's beyond the scope of this post - the reasons are not technical.
>
> But there's an upside: when you access the ReportService with the
> account login (and not the MCC login), reports are created in the
> Reports list of the respective account. Each account may have 15
> reports, so you won't run into the "max 15" limitation - I can run
> many more reports simultaneously. But those accounts are still
> accessible by the customer, and he may not like to see *his* old
> reports removed by the daily sync. So my code deletes the reports
> anyway.
>
> > So given a worker thread which:
> > schedules a single report
> > waits for its completion
> > gets the report
> > deletes it from google
> > processes the resulting download
> > continues to next report to be done. (or exits)
>
> That's pretty much how I do this, but I delete the report after the
> processing step. The reason is that sometimes the download is chopped
> off (mainly because a retrieved download URL is only valid for a
> short period; after that the URL does not become invalid, but
> Google's web server simply claims the download is complete and sends
> no more bytes). Therefore the report is processed right after
> downloading; when that fails because of an XML error, a second
> attempt to download and process is made. When everything is processed
> (or has failed twice), the report is deleted.
>
> > 1) when you delete a report it really isn't deleted.
>
> Actually, you cannot guarantee that a report will be deleted - the
> web service may fail, internet connectivity may have hiccups, etc.
>
> > 2) when you delete a report it takes time to delete.  or it takes time
> > for its deletion to be registered across the system.
>
> When the call returns, the report has been deleted. It may not be
> known on every replicated AdWords server at that exact moment, but it
> is known on the server your computer is communicating with.
>
> > 3) when you delete a report and issue the 16th report request the
> > system may get confused and kill a running report anyway.
>
> I don't think that will happen. When 15 reports are in progress and
> you request a 16th, that call will probably get AdWords error 32
> ("A report job cannot be scheduled because the user has too many
> jobs already running.")
>
> > 4) it turns out that a running report will block other reports from
> > being executed, so calling reports concurrently effectively becomes
> > consecutive.
>
> As with the previous question: when fifteen reports are in progress,
> you cannot request #16.
>
> > 5) it turns out that each user has a finite amount of processing
> > resources, and that calling reports concurrently slows all of them
> > down, so that the execution time is the same as consecutive reports.
>
> I never noticed this, but you may see variations in processing time,
> depending on the time of day.
>
> > 6) any other issues
>
> Well, if you create a system that accesses the AdWords API servers in
> a massively parallel fashion, Google may take measures against you.
> Somewhere (maybe in the T&C) Google states that you shouldn't have
> more than 5 (or some number like that) API calls in flight at the
> same time. Note that they are talking about SOAP function calls, not
> scheduled reports. But your code will be making many requests for the
> report state, to find out whether it's finished. Make sure you don't
> call that function every 100 ms per thread, but - say - once a
> minute. I asked what happens when you cross that boundary too often,
> and they said I would be contacted first, before anything drastic was
> done. But you should check for yourself.
>
> > Do you have a test case for this- which you have run and which
> > confirms the above strategy does actually function correctly?
>
> I also did a performance test, probing the limits of my database
> server. At 40 parallel threads my database ran into performance
> problems - but there was no problem with AdWords. In production, on a
> faster computer, I run 8 parallel threads and performance is good
> enough.
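The poll-then-retry-then-delete flow Zweitze describes can be sketched
as follows. All callables here (download, parse, delete, get_status)
are hypothetical stand-ins for the real ReportService calls, not
actual client library functions:

```python
import time

def wait_for_completion(get_status, poll_interval=60):
    # Poll the job status roughly once a minute, not every 100 ms,
    # to stay well under the concurrent-call limits mentioned above.
    while get_status() != "Completed":
        time.sleep(poll_interval)

def process_report(download, parse, delete, max_attempts=2):
    # Download and parse the finished report, retrying once if the
    # payload was chopped off (e.g. an expired URL returning a
    # truncated body). The report is deleted only after processing,
    # so a failed parse can still trigger a fresh download first.
    result = None
    for _ in range(max_attempts):
        try:
            result = parse(download())
            break
        except ValueError:  # e.g. an XML error from a truncated file
            continue
    delete()  # delete last, whether parsing succeeded or not
    return result
```

Deleting after processing, rather than before, is what makes the
second download attempt possible when the first payload turns out to
be truncated.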

--

You received this message because you are subscribed to the Google Groups 
"AdWords API Forum" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/adwords-api?hl=en.

