Hi
I recently opened two slightly different copies of a bibliography with just over 3900 entries, all of which have external URLs attached, at the same time.  With both of these open, one CPU core was pegged at 100% and BD was beachballing for minutes at a time (on a 2 GHz Intel Core Duo MacBook with 1.5 GB RAM, running 10.5.4 and BD 1.3.18).  I was able to copy the content of one into the other and search for duplicates by title (with a delay of several minutes between every action), but after deleting the 7750 or so duplicates, which I'd hoped would speed things up, BD instead beachballed indefinitely and I force quit after about 10 minutes.

I'm hoping to deal with this by splitting that bibliography up into a few different files (something along the lines of the sketch below).  But I'm more concerned about my main working bibliography, which is about 1900 entries and which I'd really prefer to keep in a single file for everyday use.  I've noticed some performance degradation going from 1200 or so entries to 1900, but nothing like what I experienced with 3900, so it seems like there's a non-linear relationship involved.  I have one more file cabinet of 300+ articles to add to that bibliography, so I'm hoping it doesn't get much worse.
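
In case it's useful to anyone, here's roughly what I mean by splitting, as a minimal, untested Python sketch rather than anything polished.  It just treats any line beginning with "@" as the start of a new entry, so @string/@preamble/@comment blocks aren't handled specially (any string macros would need to be copied into each piece by hand), and the default of 1000 entries per file is arbitrary.

#!/usr/bin/env python3
"""Split a large .bib file into smaller pieces at entry boundaries."""
import sys


def split_bib(path, entries_per_file=1000):
    with open(path, encoding="utf-8") as f:
        lines = f.readlines()

    # Group lines into entries; any line starting with "@" begins a new one.
    # Leading comments before the first "@" just get attached to the first entry.
    entries, current = [], []
    for line in lines:
        if line.lstrip().startswith("@") and current:
            entries.append("".join(current))
            current = []
        current.append(line)
    if current:
        entries.append("".join(current))

    # Write consecutive chunks of entries_per_file entries each.
    stem = path.rsplit(".", 1)[0]
    for i in range(0, len(entries), entries_per_file):
        chunk = entries[i:i + entries_per_file]
        out = f"{stem}-part{i // entries_per_file + 1}.bib"
        with open(out, "w", encoding="utf-8") as f:
            f.writelines(chunk)
        print(f"wrote {out} ({len(chunk)} entries)")


if __name__ == "__main__":
    split_bib(sys.argv[1], int(sys.argv[2]) if len(sys.argv) > 2 else 1000)

Saved as, say, split-bib.py and run as "python3 split-bib.py big.bib 1000", it would write big-part1.bib, big-part2.bib, and so on next to the original, and I'd then open each of those as a separate BD document.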

I don't know whether these issues can be addressed without a major rewrite, but I would be more interested in performance-focused improvements than in new features in future versions.

Derick