I'm getting several OutOfMemoryError crashes when crawling large sites like this one
and gutenberg.com

JAVA_HEAP_MAX=-Xmx3528m
JAVA_PERM_HEAP=-XX:MaxPermSize=128m

Anyone care to share their settings for doing a depth-1 crawl on a huge site?
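For what it's worth, a sketch of settings to experiment with. The "Chunk::new ... Out of swap space?" message is a native (C-heap) allocation failure inside the JVM, not a Java-heap exhaustion, so raising -Xmx can actually make it worse. The values below are guesses to tune for your machine, and NUTCH_OPTS is assumed to be picked up by bin/nutch alongside the JAVA_HEAP_MAX/JAVA_PERM_HEAP variables shown above:

```shell
# Sketch only -- values are guesses, adjust for available RAM.
# A smaller max heap leaves more address space for the JVM's own
# native arenas (JIT compiler, thread stacks), which is what ran out here.
JAVA_HEAP_MAX=-Xmx2048m              # try a smaller max heap first
JAVA_PERM_HEAP=-XX:MaxPermSize=128m  # unchanged from the settings above

# If many fetcher threads are running, shrink the per-thread native stack:
NUTCH_OPTS="$NUTCH_OPTS -Xss256k"
```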

fetching http://www.azlyrics.com/lyrics/3oh3/coloradosunrise.html
fetching http://www.azlyrics.com/lyrics/unloco/facedown.html
fetching http://www.azlyrics.com/lyrics/unfinishedthought/starfighter.html
#
# An unexpected error has been detected by Java Runtime Environment:
#
# java.lang.OutOfMemoryError: requested 145320 bytes for Chunk::new. Out of swap space?
#
#  Internal Error (allocation.cpp:218), pid=29949, tid=12
#  Error: Chunk::new