Hi Francois-Xavier,
Subclassing InputStream as you suggested worked very well, thank you so much
for the idea.
Here is what I did to produce CSV (using the Super CSV library) from Hibernate's
ScrollableResults and upload it to a servlet on the fly:
public class ScrollableResultsInputStreamProducer<E> extends InputStream {

    private static final Logger logger =
            Logger.getLogger(ScrollableResultsInputStreamProducer.class);

    private final Session session;
    private final ScrollableResults cursor;
    private final Transformer<E> transformer;
    private ByteArrayInputStream in;
    private boolean eof;

    public ScrollableResultsInputStreamProducer(Session session,
            ScrollableResults cursor, Transformer<E> transformer) throws IOException {
        this.session = session;
        this.cursor = cursor;
        this.transformer = transformer;
        // Start with the CSV header; rows follow lazily as the stream is read.
        this.in = new ByteArrayInputStream(transformer.transformHeader());
    }

    @Override
    public int read() throws IOException {
        if (eof) {
            return -1;
        }
        // Serve the currently buffered chunk until it is exhausted.
        if (in != null) {
            int data = in.read();
            if (data > -1) {
                return data;
            }
            in.close();
            in = null;
        }
        // Fetch the next entity from the cursor and transform it into the next chunk
        // (assumes transform() never returns an empty array).
        if (!cursor.next()) {
            eof = true;
            return -1;
        }
        in = new ByteArrayInputStream(transformer.transform((E) cursor.get(0)));
        // Clear the session so scrolled entities do not pile up in the first-level cache.
        session.clear();
        return in.read();
    }

    @Override
    public void close() throws IOException {
        IOUtils.closeQuietly(in);
        try {
            cursor.close();
        } catch (Exception ex) {
            logger.warn(ex.getMessage(), ex);
        }
    }
}
public interface Transformer<E> {

    byte[] transform(E entity) throws IOException;

    byte[] transformHeader() throws IOException;
}
public abstract class CsvTransformer<E> implements Transformer<E> {

    private final CellProcessor[] processors;

    public CsvTransformer() {
        // One lenient processor per column that converts nulls to empty strings.
        processors = new CellProcessor[getHeader().size()];
        CellProcessor notNullProcessor = new ConvertNullTo("");
        for (int i = 0; i < processors.length; i++) {
            processors[i] = notNullProcessor;
        }
    }

    public byte[] transform(E entity) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Writer writer = new OutputStreamWriter(out, "UTF-8");
        CsvListWriter csvWriter = new CsvListWriter(writer, CsvPreference.STANDARD_PREFERENCE);
        // A single entity may expand to several CSV rows.
        List<List<String>> rows = getRows(entity);
        for (List<String> row : rows) {
            csvWriter.write(row, processors);
        }
        csvWriter.close();
        return out.toByteArray();
    }

    public byte[] transformHeader() throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Write a UTF-8 BOM so that e.g. Excel detects the encoding correctly.
        out.write(ByteOrderMark.UTF_8.getBytes());
        out.flush();
        Writer writer = new OutputStreamWriter(out, "UTF-8");
        CsvListWriter csvWriter = new CsvListWriter(writer, CsvPreference.STANDARD_PREFERENCE);
        csvWriter.writeHeader(getHeader().toArray(new String[getHeader().size()]));
        csvWriter.close();
        return out.toByteArray();
    }

    protected abstract List<String> getHeader();

    protected abstract List<List<String>> getRows(E entity);
}
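A concrete transformer is then just a matter of implementing getHeader() and
getRows(). For illustration, here is a minimal sketch for a hypothetical Person
entity (class and field names are made up):

public class PersonCsvTransformer extends CsvTransformer<Person> {

    @Override
    protected List<String> getHeader() {
        return Arrays.asList("Id", "Name", "Email");
    }

    @Override
    protected List<List<String>> getRows(Person person) {
        // One entity maps to exactly one CSV row here.
        List<String> row = Arrays.asList(
                String.valueOf(person.getId()),
                person.getName(),
                person.getEmail());
        return Collections.singletonList(row);
    }
}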
public class Uploader {

    private static final Logger logger = Logger.getLogger(Uploader.class);

    private final HttpClient client;
    private final HttpPost request;
    private final InputStream in;
    private HttpResponse response;

    public Uploader(String fileName, Date expiry, InputStream in) {
        client = new DefaultHttpClient();
        request = new HttpPost("http://localhost/foo/Servlet?filename="
                + fileName + "&expiry=" + expiry.getTime());
        this.in = new BufferedInputStream(in);
        // Length -1 = unknown content length, so the entity is sent with
        // chunked transfer encoding while the producer stream is being read.
        request.setEntity(new InputStreamEntity(this.in, -1));
    }

    public String upload() throws IOException {
        String token = null;
        try {
            response = client.execute(request);
            token = EntityUtils.toString(response.getEntity());
        } finally {
            close();
        }
        return token;
    }

    protected void close() {
        try {
            EntityUtils.consume(response.getEntity());
        } catch (Exception ex) {
            logger.warn(ex.getMessage(), ex);
        }
        IOUtils.closeQuietly(in);
        try {
            client.getConnectionManager().shutdown();
        } catch (Exception ex) {
            logger.warn(ex.getMessage(), ex);
        }
    }
}
public class UploadServlet extends BaseServlet {

    protected void processRequest(HttpServletRequest httpRequest,
            HttpServletResponse httpResponse, Map<RequestParam, String> params)
            throws Exception {
        /* Stream to database or whatever ...
        String token = dao.create(new Upload(httpRequest.getInputStream(),
                params.get(RequestParam.FILE_NAME), params.get(RequestParam.EXPIRY)));
        sendResponse(token, httpResponse); */
    }

    protected RequestParam[] getMandatoryParameters() {
        return new RequestParam[] { RequestParam.FILE_NAME, RequestParam.EXPIRY };
    }
}
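And this is roughly how it is all wired together; the entity, the query, the file
name and the expiry below are of course just placeholders:

// Hypothetical wiring; Person, the HQL query and the servlet URL are examples only.
Session session = sessionFactory.openSession();
ScrollableResults cursor = session.createQuery("from Person")
        .setReadOnly(true)
        .scroll(ScrollMode.FORWARD_ONLY);

InputStream producer = new ScrollableResultsInputStreamProducer<Person>(
        session, cursor, new PersonCsvTransformer());
try {
    Uploader uploader = new Uploader("persons.csv",
            new Date(System.currentTimeMillis() + 24L * 60 * 60 * 1000), producer);
    // Blocks until the servlet has consumed the whole stream.
    String token = uploader.upload();
    logger.info("Upload finished, token=" + token);
} finally {
    IOUtils.closeQuietly(producer);
    session.close();
}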
Hope this might be useful for someone.
Br, Stefan
-----Original Message-----
From: Francois-Xavier Bonnet [mailto:[email protected]] On behalf of
François-Xavier Bonnet
Sent: Friday, 15 February 2013 12:19
To: HttpClient User Discussion
Cc: Frenzel Stefan
Subject: Re: AW: HttpClient.execute blocks until EOF of InputStreamBody instead
of sending chunks on the fly
If you don't want to spawn new threads, you cannot use HttpAsyncClient.
The simplest way then is to produce your CSV file in memory and pass a
ByteArrayInputStream to HttpClient.
But if the data is too big to be buffered in memory, you can write your own
InputStream subclass that produces the data as HttpClient reads it.
It would be something like this:
InputStream producer = new InputStream() {
    private ByteArrayInputStream data;
    private boolean eof = false;
    private int count = 0;

    @Override
    public int read() throws IOException {
        if (eof)
            return -1;
        // Serve the current buffer first.
        if (data != null) {
            int result = data.read();
            if (result > -1)
                return result;
            else {
                data.close();
                data = null;
            }
        }
        // Buffer exhausted: produce the next line, or signal EOF.
        String newData = nextLine();
        if (newData == null) {
            eof = true;
            return -1;
        }
        data = new ByteArrayInputStream(newData.getBytes("UTF-8"));
        return data.read();
    }

    private String nextLine() {
        if (count < 10) {
            System.out.println("Writing line " + count);
            String result = "new Line " + count + "\n";
            count++;
            return result;
        }
        return null;
    }
};
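Such a producer could then be handed to HttpClient with an InputStreamEntity of
unknown length, something like this (the URL is just an example):

HttpClient client = new DefaultHttpClient();
HttpPost post = new HttpPost("http://localhost/foo/Servlet"); // example URL
// -1 = unknown length, so the body is streamed with chunked transfer encoding
// and the producer is read incrementally while the request is being sent.
post.setEntity(new InputStreamEntity(producer, -1));
HttpResponse response = client.execute(post);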
On 15/02/2013 11:15, Frenzel Stefan wrote:
> Hi Francois-Xavier,
>
> Thank you very much for your answer. Switching to HttpAsyncClient did the trick.
> It's quite easy to use, good work!
> However, my client runs inside an EJB. According to the EJB 3.0 specification,
> spawning new threads is not allowed; that is exclusively the container's job.
> Any suggestions or best practices?
>
>
> Br, Stefan
>
>
> -----Original Message-----
> From: Francois-Xavier Bonnet [mailto:[email protected]] On behalf
> of François-Xavier Bonnet
> Sent: Friday, 15 February 2013 09:33
> To: HttpClient User Discussion
> Cc: Frenzel Stefan
> Subject: Re: HttpClient.execute blocks until EOF of InputStreamBody
> instead of sending chunks on the fly
>
> Hi Stefan,
>
> You cannot use piped streams to write and read in the same thread, since each
> read/write operation is blocking. Here, client.execute(post) blocks while
> trying to read the InputStream.
> You should either create separate threads for read and write or consider
> using HttpAsyncClient.
>
> Francois-Xavier
>
>
> On 15/02/2013 09:05, Frenzel Stefan wrote:
>> Hey guys,
>>
>> I am pretty new to HttpComponents and was wondering whether it is possible to
>> stream data from an input stream of unknown length to an upload servlet.
>> I have already tried to get it working with the ClientChunkEncodedPost example
>> bundled with HttpComponents 4.2.3 on the client side and ServletFileUpload on
>> the server side.
>> However, HttpClient.execute waits until EOF, which will never be reached at
>> this stage.
>>
>> This is what I have:
>>
>> 1. Client:
>>
>> pout = new PipedOutputStream();
>> pin = new PipedInputStream(pout, 8192);
>> writer = new OutputStreamWriter(pout, "UTF-8");
>> csvWriter = new CsvListWriter(writer, CsvPreference.EXCEL_PREFERENCE);
>> csvWriter.writeHeader(header);
>>
>> client = new DefaultHttpClient();
>> client.getParams().setParameter(CoreProtocolPNames.PROTOCOL_VERSION,
>>         HttpVersion.HTTP_1_1);
>> post = new HttpPost("http://localhost/Foo/Servlet");
>>
>> MultipartEntity entity = new MultipartEntity(HttpMultipartMode.BROWSER_COMPATIBLE);
>>
>> /* csvWriter has not been written to at this stage, so neither has pin */
>> entity.addPart("file", new InputStreamBody(pin, "text/csv", fileName));
>> post.setEntity(entity);
>>
>> /* here it blocks until EOF */
>> HttpResponse response = client.execute(post);
>>
>>
>> 2. Server:
>>
>> protected void processRequest(HttpServletRequest httpRequest,
>>         HttpServletResponse httpResponse, Map<RequestParam, String> params)
>>         throws Exception {
>>     InputStream in = null;
>>     try {
>>         if (!ServletFileUpload.isMultipartContent(httpRequest)) {
>>             throw new Exception("Not multipart");
>>         }
>>         ServletFileUpload upload = new ServletFileUpload();
>>         FileItemIterator it = upload.getItemIterator(httpRequest);
>>         while (it.hasNext()) {
>>             FileItemStream item = it.next();
>>             System.out.println(item.getFieldName());
>>             in = item.openStream();
>>             if (item.isFormField()) {
>>                 /* ... */
>>             } else {
>>                 /* Write upload to database */
>>             }
>>         }
>>     } finally {
>>         IOUtils.closeQuietly(in);
>>     }
>> }
>>
>> Any help would be appreciated. Thanks!
>>
>>
>> Br, Stefan
>>
>>
>
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]