I have a large Lucene index from which I'm trying to extract term
vectors. I get a StackOverflowError, which I believe is caused by the
recursion in LuceneIterator.computeNext(). I could increase the stack
size, but with big enough data there could always be a problem.

I have a modified version that uses a loop instead of the recursion,
and it seems to work OK. Should I put a patch somewhere?
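For anyone following along, here's a minimal, self-contained sketch of the transformation I mean (the class and data here are hypothetical, not Mahout's actual code): an iterator that skips entries without term vectors by looping rather than by having computeNext() call itself, so stack depth stays constant no matter how many consecutive entries must be skipped.

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

// Hypothetical sketch: an iterator over "documents" where 0 means
// "no term vector, skip". A recursive computeNext() that calls itself
// on each skip overflows the stack on long runs of skips; this
// loop-based version uses constant stack space.
public class SkippingIterator implements Iterator<Integer> {
    private final int[] docs;
    private int pos = 0;
    private Integer next;

    public SkippingIterator(int[] docs) {
        this.docs = docs;
        advance();
    }

    // Replaces the recursive pattern
    //     if (docs[pos] == 0) return computeNext();
    // with an iteration over the skippable entries.
    private void advance() {
        next = null;
        while (pos < docs.length) {
            int d = docs[pos++];
            if (d != 0) {       // found a usable document
                next = d;
                return;
            }
            // d == 0: skip it and keep looping instead of recursing
        }
    }

    @Override public boolean hasNext() { return next != null; }

    @Override public Integer next() {
        if (next == null) throw new NoSuchElementException();
        Integer result = next;
        advance();
        return result;
    }
}
```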

On a related note, I'd quite like to be able to write the vectors
straight to S3 without writing to a local file first - is there a
practical way to do this?
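One pattern I've been considering (a sketch, not a confirmed answer): producer/consumer piped streams, where the writer pushes vectors into a PipedOutputStream and a second thread hands the matching PipedInputStream to the uploader (e.g. the AWS SDK's putObject variant that accepts an InputStream). The consumer below just counts bytes so the example is runnable without AWS; the upload call itself is the part I'm unsure about.

```java
import java.io.*;
import java.util.concurrent.*;

// Hypothetical sketch: stream data to a consumer (standing in for an
// S3 upload) with no local temp file, via piped streams. The producer
// writes on the caller's thread; the consumer reads concurrently.
public class PipedUpload {
    public static long streamToConsumer(byte[] data) throws Exception {
        PipedOutputStream out = new PipedOutputStream();
        PipedInputStream in = new PipedInputStream(out);

        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Consumer thread: drains the stream as it is written.
        // In practice this is where the InputStream would be passed
        // to the S3 client's put call.
        Future<Long> consumed = pool.submit(() -> {
            long count = 0;
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                count += n;
            }
            in.close();
            return count;
        });

        // Producer: write the "vectors" straight into the pipe.
        out.write(data);
        out.close();  // end-of-stream signal for the consumer

        long total = consumed.get(10, TimeUnit.SECONDS);
        pool.shutdown();
        return total;
    }
}
```

One caveat with this design: the pipe's internal buffer is small, so the producer blocks until the consumer keeps up, which is usually what you want for bounded memory use.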
