I'm testing an index of 30 million pages; it requires 1.5 GB of RAM to search under Tomcat 5. I plan on building an index of multiple billion pages, but if memory scales linearly, then even with 16 GB of RAM I won't be able to serve an index larger than about 320 million pages. How can I distribute the memory requirements across multiple machines? Or is there another servlet container (like Resin) that requires less memory to run? Has anyone else run into this?

Thanks,
-Jay Pound
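The 320-million-page ceiling follows from simple linear extrapolation of the numbers in the post (a back-of-envelope sketch, assuming memory use scales linearly with index size; the bytes-per-page figure is derived, not measured):

```python
GB = 2 ** 30

# Observed: searching a 30-million-page index takes about 1.5 GB of RAM.
observed_pages = 30_000_000
observed_ram = 1.5 * GB
bytes_per_page = observed_ram / observed_pages  # roughly 54 bytes per page

# Linear extrapolation: how many pages would fit in a 16 GB machine?
pages_at_16gb = (16 * GB) / bytes_per_page
print(round(pages_at_16gb))  # ~320 million pages
```

So a single 16 GB box tops out around 320 million pages under this assumption, which is why the question turns to spreading the index (and its memory footprint) across multiple search machines.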
