Hello!

I use htdig to index the sites of a rather large university.
Unfortunately, the index grows beyond 2 GB even when I limit the valid
extensions, the maximum file size, and the excerpt length - and since the
server used for this has about 100 GB of disk space, I don't think limiting
databases to 2 GB is acceptable anyway.

The file system is, as I have been assured, capable of handling files bigger
than 2 GB - it's an ext3 Linux file system from SuSE Linux 8.0 (kernel
2.4.18). So it's probably htdig's "fault" that the rundig process runs into
file size limit problems? Is that something one can fix by just editing a few
lines of code, or by replacing all long integers with long long ones?

Peter Asemann

-- 
GMX - The communication platform on the Internet.
http://www.gmx.net



_______________________________________________
htdig-general mailing list <[EMAIL PROTECTED]>
To unsubscribe, send a message to <[EMAIL PROTECTED]> with a 
subject of unsubscribe
FAQ: http://htdig.sourceforge.net/FAQ.html