On Jun 23, 10:26 am, "T.J. Crowder" <t...@crowdersoftware.com> wrote:
> Hi,
> Weird, the way things seem to run in themes in the group.  I just
> posted this to another thread[1] here:
> * * * *
> Most browsers, and most servers, place a limit on the number of
> simultaneous connections between the same endpoints, and typically
> that limit is two simultaneous connections.  I don't immediately have
> a reference for that, but if you search on "simultaneous HTTP
> connection limit" or "concurrent HTTP connection limit" or some such,
> you'll find info on it.  Note, again, that this is done at the browser
> level and also, sometimes, at the server level, and it varies by
> vendor (and configuration).
> * * * *
> [1]http://groups.google.com/group/prototype-scriptaculous/browse_thread/...
> > I have been looking for some kind of cancel
> > functionality to quit out of the long running requests when a user
> > clicks on a navigation link, but I have been unable to find anything.
> > Can someone help?
> The underlying XMLHttpRequest object may support the (relatively) new
> abort method (which I think is well-supported now).  I don't think
> Prototype has a documented means of aborting requests or of accessing
> the underlying XHR object, but it's a pretty open secret:  The XHR is
> stored as the "transport" property of the Ajax.Request object.  So on
> browsers that support it, you can abort requests:
> Creating the request and keeping a reference to it:
>     req = new Ajax.Request(...);
> ...then using that reference to cancel if supported:
>     try {
>         req.transport.abort();
>     } catch (e) {
>         // Handle the failure, if any and if you want to
>     }
> (Naturally if you want to you can test first to see if 'transport' is
> there and if 'transport.abort' is there.)
> HTH,
> --
> T.J. Crowder
> tj / crowder software / com
> Independent Software Engineer, consulting services available
> On Jun 23, 6:02 pm, franklinhu <frankli...@yahoo.com> wrote:
> > I have a page which uses prototype.js and it contains several ajax
> > requests that retrieve lists to be displayed to the user. This page
> > also has links which take the user to different parts of the
> > application. I am finding that these links do not take an action until
> > the long running requests are finished. This gives the appearance that
> > the links are broken as the user clicks on them. After the requests
> > are finished, then the page is able to navigate to the link that was
> > clicked, but this can take several seconds for my application.
> > I thought that the whole idea behind Ajax was that you could do things
> > in the background, and that you wouldn't have to wait for things to
> > finish. However, what I am encountering is that while the page does
> > display quickly, the user can't use it until it is finished rendering
> > which is almost as annoying as having to wait for the entire page to
> > render without Ajax. I have been looking for some kind of cancel
> > functionality to quit out of the long running requests when a user
> > clicks on a navigation link, but I have been unable to find anything.
> > Can someone help?
> > Here is the code I am using to call the requests.
> > My main page calls a JavaScript routine from the body tag's onLoad
> > attribute.
> > <body onLoad="displayDIVData()">
> > function displayDIVData()
> > {
> >         if (bAvailTrips) getAvailableTrips();
> > }
> > This function calls another function.
> > function getAvailableTrips()
> > {
> >         var AvailTripRequest = new Ajax.Updater('TripAvailableDivBody',
> >             UniqueURL('GetAvailableTrips.asp'),
> >             {asynchronous:true, evalScripts:true, method:'get'});
> > }
> > -fhuajax-

I tried the transport.abort function. It immediately clears out the div
where the results are supposed to go; however, the browser still appears
to be processing the URL that was passed to it and won't navigate until
the request has returned.
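For what it's worth, the abort call can be made defensive. Here is a
minimal sketch (the helper name cancelRequest is mine, not part of
Prototype) that feature-tests transport.abort and detaches
onreadystatechange before aborting, which in some browsers keeps stale
callbacks from firing after the abort:

```javascript
// Hypothetical helper: abort a Prototype Ajax.Request defensively.
// Detaching onreadystatechange first keeps late callbacks from firing
// after the abort in some browsers. Returns true if an abort was attempted.
function cancelRequest(req) {
  if (!req || !req.transport) return false;           // nothing to abort
  var xhr = req.transport;
  if (typeof xhr.abort !== 'function') return false;  // abort() unsupported
  try {
    xhr.onreadystatechange = function () {};          // detach handlers
    xhr.abort();
    return true;
  } catch (e) {
    return false;                                     // some older browsers throw
  }
}
```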

I have found that external links work as expected, so the blocking seems
to happen on the JavaScript side. I noticed I had debugging turned on,
which made IIS single-threaded. Once I turned off debugging, I could
continue to process requests in another browser window while the first
one was retrieving content via Ajax. Within the same browser window,
however, navigation requests are still suppressed and must wait for the
Ajax call to complete.
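Given the two-connections-per-host limit mentioned above, one workaround
is to run the list loaders one at a time so a connection stays free for
navigation. A rough sketch only (the queue and function names are mine;
it relies on Prototype's documented onComplete option):

```javascript
// Minimal request queue: runs the loaders one at a time so at most one
// connection is held, leaving another free for navigation.
// The URLs and div ids passed in are whatever your page uses.
var pendingLoaders = [];

function enqueueLoader(url, divId) {
  pendingLoaders.push({ url: url, divId: divId });
  if (pendingLoaders.length === 1) runNextLoader(); // nothing in flight yet
}

function runNextLoader() {
  var job = pendingLoaders[0];
  if (!job) return; // queue drained
  new Ajax.Updater(job.divId, job.url, {
    method: 'get',
    evalScripts: true,
    onComplete: function () {
      pendingLoaders.shift(); // this job is done
      runNextLoader();        // start the next one, if any
    }
  });
}
```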

Based on your response, it sounds like this is the way Ajax is supposed
to work: if you have a long-running request to the server, you are
unable to do internal navigation until that request has been satisfied.
Is that right? I still find that hard to believe. I was hoping that I
had just called Ajax in the wrong place or used the wrong parameters.

You received this message because you are subscribed to the Google Groups 
"Prototype & script.aculo.us" group.