It's over 20 MB... I have gzip compression enabled on my server, which
works dandy on web pages and sitemaps, but I might just have to set up
an FTP upload for this feed, which is going a bit beyond the realm of
Cake.
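For reference, a pre-generated feed can be compressed on disk with PHP's zlib extension without ever holding the 20 MB of XML in memory. This is only a sketch; the file names are assumptions, and the stand-in feed is just there so the script runs end to end.

```php
<?php
// Sketch: stream-compress a pre-generated feed file with the zlib
// extension, copying it through in small chunks so the full XML
// never sits in memory. File names are assumptions.
$src = 'feed.xml';
$dst = 'feed.xml.gz';

if (!file_exists($src)) {
    // Tiny stand-in feed so the sketch runs end to end.
    file_put_contents($src, '<products>' . str_repeat('<product/>', 5000) . '</products>');
}

$in  = fopen($src, 'rb');
$out = gzopen($dst, 'wb9'); // maximum compression level

while (!feof($in)) {
    gzwrite($out, fread($in, 8192)); // copy in 8 KB chunks
}

fclose($in);
gzclose($out);
```

The resulting .xml.gz can then be pushed by FTP or served as a static download, outside of Cake's request cycle.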

On Jul 9, 2:22 pm, GravyFace <[email protected]> wrote:
> What's the size of your product feed as XML on disk?
>
> On Thu, Jul 9, 2009 at 2:02 PM, JamesF<[email protected]> wrote:
>
> > That's a good tip if the end user is downloading the XML... but in my
> > case it's a product feed going to vast.com, which wants raw XML.
>
> > On Jul 9, 8:03 am, GravyFace <[email protected]> wrote:
> >> Not really sure what the end-user/consumer of this XML looks like, but
> >> I'd recommend also compressing it and providing the consumer of the
> >> "feed" (more like a batch) with a .zip file they can download and
> >> consume -- 20k records is much too bulky for any kind of near-time
> >> consumption, in my opinion, because the requestor's parser will most
> >> likely be waiting for the request to complete (i.e. download the
> >> entire XML file) before parsing it anyway -- might as well save some
> >> bandwidth and transfer it reliably.
>
> >> Also, if you go this route, make sure you set the headers to
> >> Content-Disposition: attachment to force the browser to prompt
> >> save/open instead of trying to open the file in the browser.
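As a sketch of that header suggestion: in plain PHP (or a Cake controller, before any output is echoed) the headers could look like the following. The filename and content type are assumptions for a zipped feed.

```php
<?php
// Sketch: headers that force a save/open prompt for a zipped feed
// instead of letting the browser try to render it.
// The filename is an assumption; emit these before any body output.
function downloadHeaders($filename) {
    return array(
        'Content-Type'        => 'application/zip',
        'Content-Disposition' => 'attachment; filename="' . $filename . '"',
    );
}

foreach (downloadHeaders('product-feed.zip') as $name => $value) {
    header($name . ': ' . $value); // no effect in CLI, effective in a web request
}
```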
>
> >> You can, however, provide an updated feed of "daily product updates"
> >> for those who have already downloaded (what sounds like) your entire
> >> product catalog.
>
> >> On Thu, Jul 9, 2009 at 2:48 AM, Bert Van den Brande<[email protected]> 
> >> wrote:
>
> >> > Hi,
>
> >> > I've run into the same problem as you did and ended up writing a Cake
> >> > Shell script that periodically gets executed by a cronjob.
> >> > The script writes the .xml file to a location on the webserver,
> >> > making it accessible through a Cake web page.
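The cron-driven approach described above could be sketched as a standalone script like the one below, which is easy to adapt into a Cake Shell. `fetchBatch()` is a stand-in for a paginated `$this->Product->find('all', ...)` call; all names here are assumptions, not the poster's actual code.

```php
<?php
// Sketch: write the feed to disk in small batches so memory stays flat.
// fetchBatch() is a stand-in for a paginated model find() call.
function fetchBatch($page, $limit) {
    static $products = null;
    if ($products === null) {
        $products = array();
        for ($i = 1; $i <= 250; $i++) { // stand-in data source
            $products[] = array('name' => 'Product ' . $i);
        }
    }
    return array_slice($products, ($page - 1) * $limit, $limit);
}

$fh = fopen('feed.xml', 'w');
fwrite($fh, "<?xml version=\"1.0\"?>\n<products>\n");

// Page through the data 100 records at a time; an empty batch ends the loop.
for ($page = 1; $batch = fetchBatch($page, 100); $page++) {
    foreach ($batch as $p) {
        fwrite($fh, '  <product><name>' . htmlspecialchars($p['name']) . "</name></product>\n");
    }
}

fwrite($fh, "</products>\n");
fclose($fh);
```

A crontab entry along the lines of `0 3 * * * /path/to/cake/console/cake feed` (path is an assumption) would then regenerate the file nightly, and the web server can serve it as a static file.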
>
> >> > Don't have the code at hand here, but if you're interested I can dig it 
> >> > up.
>
> >> > Friendly greetings,
> >> > Bert Van den Brande
>
> >> > On Wed, Jul 8, 2009 at 5:38 PM, JamesF<[email protected]> wrote:
>
> >> >> Hello all,
>
> >> >> I have been using Cake's XmlHelper to render my XML output quite
> >> >> successfully, but I need to create a rather large file (20,000
> >> >> records) for a product feed export. If I limit my results to 100 at a
> >> >> time they render fine, but obviously a brick wall is hit at higher
> >> >> numbers: PHP generates out-of-memory errors and the like.
>
> >> >> My idea was to render XmlHelper's output (the view) to a file instead
> >> >> of live; then maybe the execution could be staggered and not time out.
> >> >> I don't want to run up against PHP's script execution time limit.
>
> >> >> Maybe it's time to bust out requestAction with a for loop that runs
> >> >> through 100 records per iteration. I can set all that up, but am still
> >> >> unsure about how to output the view to a file.
>
> >> >> Has anyone else run into a solution for this?
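On the "output the view into a file" question above, one generic approach is PHP's output buffering: capture what a view would echo and append it to a file instead of sending it to the client. This is only a sketch; the `echo` stands in for a rendered XmlHelper view chunk.

```php
<?php
// Sketch: capture would-be view output with output buffering and
// append it to a file rather than the HTTP response.
ob_start();
echo "<product><name>Example</name></product>\n"; // stand-in for a rendered view chunk
$chunk = ob_get_clean();

file_put_contents('feed.xml', $chunk, FILE_APPEND);
```

Run once per 100-record batch, this avoids ever holding the whole 20k-record document in memory.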
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"CakePHP" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/cake-php?hl=en
-~----------~----~----~----~------~----~------~--~---