All,
I'm using HttpClient 4.0.1 to access a remote web server (Tomcat
6.0.29). When I perform a GET on "large" documents (such as a large CSS
or JavaScript file), the content gets "cut off". My basic pattern is:
entity = response.getEntity();
.
.
.
ins = entity.getContent();
org.apache.http.Header hdr = response.getFirstHeader("Content-Type");
if (hdr == null) {
    isText = false;
} else {
    isText = hdr.getValue().startsWith("text");
}
org.apache.http.Header[] headers = response.getAllHeaders();
.
.
.
if (isText) {
    req.setAttribute(ProxySys.AUTOIDM_STREAM_WRITER, true);
    BufferedReader in = new BufferedReader(new InputStreamReader(ins));
    PrintWriter out = resp.getWriter();
    HttpFilterChain chain = new HttpFilterChain(holder, this);
    StringBuffer lineBuff = new StringBuffer();
    String line;
    while ((line = in.readLine()) != null) {
        lineBuff.setLength(0);
        lineBuff.append(line);
        out.println(lineBuff.toString());
    }
    entity.consumeContent();
    //out.flush();
    //out.close();
} else {
    req.setAttribute(ProxySys.AUTOIDM_STREAM_WRITER, false);
    byte[] buffer = new byte[1024];
    //InputStream in = entity.getContent();
    int len;
    OutputStream out = resp.getOutputStream();
    while ((len = ins.read(buffer)) != -1) {
        out.write(buffer, 0, len);
    }
    entity.consumeContent();
    //out.flush();
    //out.close();
}
.
.
.
out.flush();
out.close();
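For reference, here is the byte-for-byte copy from my binary branch pulled out as a self-contained method (class and stream names here are placeholders, not HttpClient API). This path never cuts anything off, which is part of why I suspect the text branch:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ByteCopy {
    // Copy every byte from the entity stream to the response stream,
    // regardless of whether the content is text or binary.
    static long copy(InputStream ins, OutputStream out) throws IOException {
        byte[] buffer = new byte[1024];
        long total = 0;
        int len;
        while ((len = ins.read(buffer)) != -1) {
            out.write(buffer, 0, len);
            total += len;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] body = new byte[5000];          // larger than one buffer
        for (int i = 0; i < body.length; i++) {
            body[i] = (byte) (i % 251);
        }
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(body), sink);
        System.out.println(copied);            // 5000: nothing is lost
    }
}
```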
Any reason why that could be happening? Binary objects (such as
images) load fine.
Any help would be greatly appreciated.
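One thing I can at least demonstrate offline: the readLine()/println() loop in my text branch does not reproduce the body byte-for-byte. readLine() strips whatever line terminator the origin sent, and println() re-terminates every line (including a final line that had no newline) with the platform separator, so the byte count changes. If a Content-Length copied from the origin is still in effect, the client would stop reading at the old length. Self-contained sketch (class name LineCopyDemo is made up; it only mimics my loop, not the servlet):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;

public class LineCopyDemo {
    // Re-create the text branch: read the body line by line and
    // write each line back with println().
    static byte[] copyByLines(byte[] body) throws IOException {
        BufferedReader in = new BufferedReader(new InputStreamReader(
                new ByteArrayInputStream(body), StandardCharsets.ISO_8859_1));
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        PrintWriter out = new PrintWriter(
                new OutputStreamWriter(sink, StandardCharsets.ISO_8859_1));
        String line;
        while ((line = in.readLine()) != null) {
            out.println(line);   // appends the platform line separator
        }
        out.flush();
        return sink.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // A body whose final line has no trailing newline.
        byte[] original = "body{margin:0}\nh1{color:red}"
                .getBytes(StandardCharsets.ISO_8859_1);
        byte[] copied = copyByLines(original);
        // println() re-terminates every line, so the byte count differs
        // from the original on any platform.
        System.out.println(original.length + " -> " + copied.length);
    }
}
```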
Thanks
Marc
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]