Is this what you're thinking about?

http://blog.digg.com/?p=621



On May 30, 4:44 pm, Christiaan Baartse <[email protected]> wrote:
> I'm really curious how well this is supported among the different
> browsers.
> This made me think about a news article about Opera released some time
> ago: http://my.opera.com/WebApplications/blog/show.dml/438711
>
> AFAIK Firefox has a similar implementation for something almost the
> same: http://wehrlos.strain.at/httpreq/client.html
>
> Is "Transfer-Encoding: chunked" really supported in all browsers? If
> it is, I really don't get why the Opera implementation would've been
> so special when something like this has existed since... 1999.
> If so, I'd really have to go find out more about it!
>
> On 30 mei, 22:26, Andrew Ingram <[email protected]> wrote:
>
> > I'm talking about Transfer-Encoding: chunked; it's a single connection.
> > The server can periodically 'flush' the response stream to send all
> > content that has been generated up to that point. The idea is that if
> > each chunk is self-contained, i.e. a single entity of a response that
> > would return a list of entities, the AJAX library can handle the first
> > entities before the complete response has even finished being
> > generated by the server.
>
> > Technically this could involve even less connection overhead than
> > long-polling Comet, because you could keep the connection open after
> > delivering each 'update' rather than requiring the client to create a
> > new connection each time.
>
> > http://en.wikipedia.org/wiki/Chunked_transfer_encoding
>
> > Now I could be completely mistaken and chunked encoding might actually
> > require multiple connections, but I don't believe that to be the case.
>
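Right, it's all one connection. You can see the effect with a plain
XMLHttpRequest: readyState 3 fires as data arrives, and responseText
already holds everything received so far. A rough sketch (the /items
URL is a placeholder, and it assumes the server flushes
newline-delimited pieces):

// readyState 3 ("loading") can fire repeatedly while the response is
// still streaming in; responseText is the partial body at that point.
const xhr = new XMLHttpRequest();
xhr.open("GET", "/items"); // placeholder URL
xhr.onreadystatechange = () => {
  if (xhr.readyState === 3) {
    // Chunk boundaries aren't exposed to script; you just see the text
    // grow, so entities need their own delimiter (e.g. newlines).
    console.log("received so far:", xhr.responseText.length, "bytes");
  } else if (xhr.readyState === 4) {
    console.log("complete:", xhr.responseText.length, "bytes");
  }
};
xhr.send();

The catch, as far as I know, is older IE: the chunks still arrive on
one connection, but script can't read responseText until the request
has completed, so you lose the early access.
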
> > On May 30, 3:31 pm, Ricardo <[email protected]> wrote:
>
> > > Creating lots of connections would probably add a large overhead,
> > > making it slower than waiting for the whole processing to end: for
> > > each connection you have to factor in the two-way latency plus the
> > > server response time. A better, already usable approach is HTTP
> > > Streaming/Comet:
>
> > > http://ajaxpatterns.org/HTTP_Streaming
>
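Comet generally comes in two flavours: streaming (one long-lived
response, which is what flushed chunked output gives you) and
long-polling, where every update costs a fresh request/response round
trip. A rough sketch of the long-polling loop, with a made-up /updates
endpoint that the server holds open until it has something to say:

// Long-polling flavour of Comet: each update arrives on its own
// request, so each one pays the round-trip overhead mentioned above.
function longPoll(url: string, onUpdate: (data: unknown) => void): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4) {
      if (xhr.status === 200) {
        onUpdate(JSON.parse(xhr.responseText)); // one update per response
      }
      // Immediately issue the next request; real code would back off
      // on errors instead of reconnecting straight away.
      longPoll(url, onUpdate);
    }
  };
  xhr.send();
}

longPoll("/updates", (u) => console.log("update:", u)); // placeholder URL
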
> > > On May 29, 7:36 pm, Andrew Ingram <[email protected]> wrote:
>
> > > > Hi all,
>
> > > > I'm not even sure if this is possible with JavaScript at the moment,
> > > > but it would make a powerful feature if it were.
>
> > > > When returning a list of resources as the response to a request,
> > > > it's relatively trivial to configure the app (in Django at least)
> > > > to flush the stream after each resource and provide a semi-real-time
> > > > feed of results, i.e. you don't have to wait for the last item to be
> > > > calculated before the first one is returned. This uses
> > > > Transfer-Encoding: chunked.
>
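For the server side, here's the same idea sketched with Node's http
module instead of Django, just to keep everything in one language (the
endpoint, data and delay are invented). Leaving out Content-Length
means the body goes out as Transfer-Encoding: chunked on a single
connection, and each write() is sent as soon as the item is ready:

import * as http from "node:http";

// Stand-ins for entities that take a while to compute.
const items = [
  { id: 1, name: "first" },
  { id: 2, name: "second" },
  { id: 3, name: "third" },
];

http.createServer((_req, res) => {
  // No Content-Length header, so Node uses chunked transfer encoding.
  res.writeHead(200, { "Content-Type": "application/x-ndjson" });
  let i = 0;
  const timer = setInterval(() => {
    if (i < items.length) {
      // One self-contained entity per line, written the moment it exists.
      res.write(JSON.stringify(items[i++]) + "\n");
    } else {
      clearInterval(timer);
      res.end(); // terminating zero-length chunk
    }
  }, 500); // pretend each item takes 500 ms to generate
}).listen(8080);

The Django version is the same shape, as Andrew describes: flush after
each resource instead of building the whole list first.
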
> > > > I was thinking that if jQuery could somehow recognise this type of
> > > > response, it could iterate over the individual resources as they
> > > > come over the wire, and the callback would be given individual items
> > > > rather than the full response. This would make AJAXy functionality
> > > > even more responsive, because you can start handling parts of the
> > > > response before the server has even finished generating the later
> > > > parts.
>
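That helper would basically be a small buffering wrapper. Something
like this (not a real jQuery API, just a sketch; it assumes the server
ends every entity with a newline, as in the Node sketch above):

// Hypothetical helper: calls onItem once per complete line as the
// response streams in, instead of one callback with the whole body.
function getStreaming(url: string, onItem: (item: unknown) => void): void {
  const xhr = new XMLHttpRequest();
  let seen = 0; // how much of responseText has already been handed out

  xhr.open("GET", url);
  xhr.onprogress = () => {
    const text = xhr.responseText;
    let newline = text.indexOf("\n", seen);
    while (newline !== -1) {
      // Everything up to the newline is one self-contained entity.
      onItem(JSON.parse(text.slice(seen, newline)));
      seen = newline + 1;
      newline = text.indexOf("\n", seen);
    }
  };
  xhr.send();
}

// Usage: handle each item the moment it comes over the wire.
getStreaming("/items", (item) => console.log("got item:", item));

The library would just need to own that buffering and delimiting,
because the chunk boundaries themselves are never visible to script.
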
> > > > Maybe this is already possible, but I couldn't find any
> > > > documentation or mention of it.
>
> > > > Any thoughts on this idea?
>
> > > > Regards,
> > > > Andrew Ingram