AFAIK there is no simple way to automatically remove duplicates. The
problem is that duplicate entries are not necessarily identical
entries, so a decision has to be made as to which of the duplicates
is retained, and an automatic system cannot make that choice (other
than making it at random). So the easiest approach is to put the
selected duplicates in a temporary (static) group and delete them
manually. Make sure you use Option-Delete to delete the items;
otherwise they are merely removed from the group, not from the file.
If you sort by an appropriate column it should not be too hard to
spot the duplicates.
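(For the narrow special case where the duplicates share the same cite
key — e.g. the same entry imported twice from different .bib files — a
rough Python sketch could pre-filter the merged file before importing
it into BibDesk. This is outside BibDesk entirely; the first-one-wins
policy is exactly the arbitrary choice mentioned above, and the naive
regex parsing assumes each entry starts with `@type{key,` at the
beginning of a line.)

```python
import re

def dedupe_bib(bib_text):
    """Keep only the first entry seen for each citation key.

    Naive sketch: splits on lines beginning with '@' rather than
    truly parsing BibTeX, and keeps whichever duplicate comes first
    (an arbitrary choice, as noted above)."""
    # Split before every line that starts with '@' (zero-width split).
    chunks = re.split(r'(?m)^(?=@)', bib_text)
    seen = set()
    kept = []
    for chunk in chunks:
        m = re.match(r'@\w+\s*\{\s*([^,\s]+)', chunk)
        if m is None:
            # Preamble or comments: pass through untouched.
            if chunk.strip():
                kept.append(chunk)
            continue
        key = m.group(1).lower()
        if key not in seen:
            seen.add(key)
            kept.append(chunk)
    return ''.join(kept)

sample = """@article{smith2007,
  title = {A Paper},
}
@article{smith2007,
  title = {A Paper (second copy)},
}
@book{jones2005,
  title = {A Book},
}
"""

print(dedupe_bib(sample))
```

Entries whose keys differ but whose contents overlap would still slip
through, which is where the manual group-and-sort method above is
needed.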

Christiaan

On 3 Nov 2007, at 10:01 AM, Alex Hamann wrote:

> It seems my answer was a little premature (as Adrian pointed out in
> an email to me yesterday). The solution I provided does indeed
> select all duplicates, but it is no good for simply deleting the
> second copy. I am bringing this back up since I may run into a
> similar situation pretty soon. Has anybody found an easy way to
> delete only the copies while leaving one entry in place?
>
> a.
>
>
> Am 01.11.2007 um 09:21 schrieb Alex Hamann:
>
>> Database --> select duplicates
>>
>> alex
>>
>>
>> Am 01.11.2007 um 07:00 schrieb Adrian Butscher:
>>
>>> Hello,
>>>
>>> I would like to consolidate a large number of different .bib files.
>>> If I import all of them into a single BibDesk file, then I have
>>> many, many duplicates.  Is there a way of removing all duplicates
>>> in one go?
>>>
>>> Thanks.
>>>
>>> Adrian Butscher
>>> Stanford University
>>>
>>> -------------------------------------------------------------------- 
>>> -
>>> -
>>> ---
>>> This SF.net email is sponsored by: Splunk Inc.
>>> Still grepping through log files to find problems?  Stop.
>>> Now Search log events and configuration files using AJAX and a
>>> browser.
>>> Download your FREE copy of Splunk now >> http://get.splunk.com/
>>> _______________________________________________
>>> Bibdesk-users mailing list
>>> [email protected]
>>> https://lists.sourceforge.net/lists/listinfo/bibdesk-users
>>>
>>
>>
>>
>> =============================================
>> please avoid sending me word attachements; see
>> http://www.gnu.org/philosophy/no-word-attachments.html
>> for details and background
>>
>>
>>
>> --------------------------------------------------------------------- 
>> -
>> ---
>> This SF.net email is sponsored by: Splunk Inc.
>> Still grepping through log files to find problems?  Stop.
>> Now Search log events and configuration files using AJAX and a
>> browser.
>> Download your FREE copy of Splunk now >> http://get.splunk.com/
>> _______________________________________________
>> Bibdesk-users mailing list
>> [email protected]
>> https://lists.sourceforge.net/lists/listinfo/bibdesk-users
>>
>

