http://archie.kumc.edu/robots.txt
Didn't intend to reply off-list last time.

Jason

>>> On 9/3/2009 at 9:36 AM, in message <[email protected]>, wally
>>> grotophorst <[email protected]> wrote:
> Yes, I did have this problem on our old system (OSX Tiger server with
> DSpace running under JBOSS), but I haven't been "in charge" of our
> DSpace installation for a couple of years. Our current repository
> librarian just left, so I'm doing it again until I can find a new one.
>
> I'll try the sitemap thing and look for the patch as well.
>
> Do you know where a battle-tested robots.txt might be found? What's the
> name of your server? I'll look at that one.
>
> Thanks for getting back to me so quickly. This thing is making me nuts...
>
> - Wally
>
>> Wally,
>>
>> We had this problem after upgrading. Two things made the difference for us:
>>
>> 1) using a variation of the crontab script at
>>    http://wiki.dspace.org/index.php/Idle_In_Transaction_Problem#Workaround:_Killing_.22Idle_in_Transaction.22_processes_with_crontab
>>    (looks like you have experience with this :) and
>> 2) editing robots.txt to block crawlers and implementing the sitemap
>>    feature in XMLUI.
>>
>> There is also a Cocoon patch that might help. See the thread
>> http://www.nabble.com/AJP-Errors-td23981286.html
>> Hopefully, Sean or Mark can say whether or not it solves the connection
>> pool problem.
>>
>> Jason
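
For reference, a minimal robots.txt along the lines described above might
look something like the sketch below. The paths assume a stock XMLUI install
served at the site root (adjust them if your DSpace lives under /xmlui or
another context path), the host in the Sitemap line is a placeholder, and the
wildcard Disallow rules are only honored by crawlers that support them (such
as Googlebot); the file at archie.kumc.edu above is the battle-tested
reference.

User-agent: *
Disallow: /browse
Disallow: /search
Disallow: /handle/*/browse
Disallow: /handle/*/search

# Point crawlers at the generated sitemaps instead of letting them
# walk every dynamic browse and search page.
Sitemap: http://your.dspace.host/sitemap

The Sitemap line assumes the XMLUI sitemap feature is turned on and the
sitemaps are regenerated regularly; 1.5.x-era installs ship a
generate-sitemaps script under [dspace]/bin that can be run nightly from cron
so the sitemap index stays current.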

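Since the wiki page tends to move around: the gist of the crontab workaround
is simply to kill PostgreSQL backends that have been sitting "idle in
transaction" for too long. The sketch below is a hypothetical Python version
of that idea, not the wiki's shell script; it assumes PostgreSQL 8.4+ (for
pg_terminate_backend), the psycopg2 driver, and a database named dspace, and
the threshold and connection settings are made-up placeholders.

#!/usr/bin/env python
# kill_idle_in_transaction.py -- hypothetical sketch of the crontab
# workaround; run it from cron, e.g.:
#   */10 * * * * /usr/local/bin/kill_idle_in_transaction.py
# Terminates PostgreSQL backends that have been "idle in transaction"
# longer than MAX_IDLE_MINUTES. Needs a role allowed to call
# pg_terminate_backend (a superuser on 8.4).
import psycopg2

MAX_IDLE_MINUTES = 10                    # placeholder threshold
DSN = "dbname=dspace user=postgres"      # placeholder connection settings

conn = psycopg2.connect(DSN)
cur = conn.cursor()

# On 8.x-era pg_stat_activity a stuck connection shows up with
# current_query = '<IDLE> in transaction'; newer PostgreSQL releases
# renamed these columns (pid/state), so the query would need adjusting.
cur.execute("""
    SELECT procpid
      FROM pg_stat_activity
     WHERE current_query = '<IDLE> in transaction'
       AND now() - query_start > %s * interval '1 minute'
""", (MAX_IDLE_MINUTES,))

for (pid,) in cur.fetchall():
    print("terminating idle-in-transaction backend %d" % pid)
    cur.execute("SELECT pg_terminate_backend(%s)", (pid,))

cur.close()
conn.close()

As the thread notes, this only keeps the connection pool from filling up;
whether the Cocoon patch in the Nabble thread fixes the underlying leak is
the open question.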
