Hi Manish,

Just to echo Orwin's comments, what you're asking for should be handled at the application level rather than at the protocol (HTTP) level. It sounds like what you want is paging. My suggestion would be to control the position and size of the result set by passing additional parameters in the request. For example, you could specify something like the following: http://myhost/path/request?start=100&length=100, which would fetch the second 100 records.
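If it helps, here's a rough sketch of what a paged fetch loop could look like with Commons HttpClient. The start/length parameter names, the page size, and the total record count are just assumptions to match the example URL above; your server-side paging API may use different names:

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.NameValuePair;
import org.apache.commons.httpclient.methods.GetMethod;

public class PagedFetch {
    public static void main(String[] args) throws Exception {
        HttpClient client = new HttpClient();
        int pageSize = 100;   // records per request (assumed)
        int total = 2000;     // assumed upper bound; a real app would stop when a page comes back short
        for (int start = 0; start < total; start += pageSize) {
            GetMethod get = new GetMethod("http://myhost/path/request");
            get.setQueryString(new NameValuePair[] {
                new NameValuePair("start", String.valueOf(start)),
                new NameValuePair("length", String.valueOf(pageSize))
            });
            try {
                client.executeMethod(get);
                // Each page can be shown to the user as soon as it arrives,
                // instead of waiting for all records in one long request.
                System.out.println(get.getResponseBodyAsString());
            } finally {
                get.releaseConnection();
            }
        }
    }
}

Because each request only asks for a small slice, no single request should run long enough to hit the read timeout.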

Mike

On Aug 24, 2004, at 12:41 AM, Manish Moorjani wrote:

Hi Mike,
Thanks for the quick response.
What I mean by partial response is as follows:

When I try accessing the URL directly, the application can do two things:
1) Return the entire response in one go
2) Keep flushing as the data is fetched (say there are 1000 records in total; it displays the first 100 records while the other 900 are still being fetched)


How do I implement option 2 using HttpClient? That is, can I get some part of the response from the URL, display it on screen, and then make a request for the remaining part so that a timeout doesn't take place?


Something like: can I set the response length fetched to, say, 1000 bytes, and then on the next request get the response starting after those 1000 bytes?

The suggestions you gave are good, but I don't want to increase the request timeout, because then I would have to wait that much longer for a URL that is not responding at all!


I hope I made it clear.

Regards,
Manish Moorjani
"If you are still amazed by something you accomplished yesterday then today has been a waste"


Hello Manish,

What you're doing here looks good. It sounds like you're getting a socket read timeout, which is caused by a long delay while reading the response. I'm not sure what you mean by "read partial responses". The only options I can think of are setting SO_TIMEOUT to a higher value, or working on the server side to improve its performance.
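
For reference, a minimal sketch of raising the socket read timeout, assuming the HttpClient 2.0 API (the setter names changed in later releases, and the five-minute value is just an example):

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

public class LongReadTimeout {
    public static void main(String[] args) throws Exception {
        HttpClient client = new HttpClient();
        // SO_TIMEOUT: how long a blocked socket read may wait for more data
        client.setTimeout(5 * 60 * 1000);
        // Separate cap on how long to wait for the connection itself
        client.setConnectionTimeout(30 * 1000);

        GetMethod get = new GetMethod("http://myhost/path/request");
        try {
            client.executeMethod(get);
            System.out.println(get.getResponseBodyAsString());
        } finally {
            get.releaseConnection();
        }
    }
}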

Mike

On Aug 22, 2004, at 8:27 AM, Manish Moorjani wrote:

Hi,

I am using HttpClient for automatic login into third-party applications and then fetching the page and displaying it to the end user.

The page I am trying to fetch queries the database based on a search
key and fetches more than 2000 records to display on the screen.

I am using the following code to fetch the response:

StringBuffer sReply = new StringBuffer();
InputStream rbas = method.getResponseBodyAsStream();
InputStreamReader isr = new InputStreamReader(rbas, "UTF-8");
BufferedReader br = new BufferedReader(isr);
String temp;
// Buffer the entire body line by line before displaying it
while ((temp = br.readLine()) != null) {
    sReply.append(temp).append("\r\n");
}
String strReply = sReply.toString();
// Close the reader (this also closes the wrapped streams), then release the connection
br.close();
method.releaseConnection();

But as the response is large, the request sometimes times out before the response is obtained.
Is it possible to read partial responses and display them on the screen?
Or is the only option to set the timeout to the maximum possible value?

One more thing: the records are fetched quite fast when I access the URL directly.

Regards,
Manish Moorjani
Infosys Technologies Ltd.
Phone: 91-44-24509530/40/50 Extn. 80395
"If you are still amazed by something you accomplished yesterday then
today has been a waste"
