Of the options given, I think a "picture content checksum" would work
best.  You can't rely on the EXIF data either, because F-Spot itself lets
you change the date/time on a photo.
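
As an illustration of the idea (not F-Spot code, just a minimal Python
sketch assuming the Pillow imaging library; the file names are made up):
hashing the decoded pixels instead of the file bytes survives metadata
rewrites, which is exactly the scenario described in the quoted message
below.

    import hashlib
    from PIL import Image

    def file_checksum(path):
        # Hash of the raw file bytes -- changes as soon as tags/EXIF
        # are written back into the file.
        with open(path, "rb") as f:
            return hashlib.sha1(f.read()).hexdigest()

    def content_checksum(path):
        # Hash of the decoded pixel data only -- unaffected by metadata
        # edits, so a re-import of a tagged copy still matches.
        img = Image.open(path).convert("RGB")
        return hashlib.sha1(img.tobytes()).hexdigest()

    # Hypothetical files: the same photo before and after tags were
    # written into it.  file_checksum() differs, content_checksum() matches.
    print(content_checksum("photo.jpg") == content_checksum("photo_tagged.jpg"))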

SignMan359

On Sun, 2007-09-23 at 16:50 +0200, Sébastien Barthélemy wrote:

> On Sunday, 23 September 2007 at 00:43 +0200, Lorenzo Milesi wrote:
> > Bill Moseley wrote:
> > > Version: 0.3.5-0ubuntu2
> > >
> > >
> > > I have duplicate images in my collection that point to the same file.
> > > Is there a way to clean those out of the database?
> > >   
> > There is a bug for this, lost somewhere in time! :)
> > http://bugzilla.gnome.org/show_bug.cgi?id=169646
> > Ciao
> 
> Hello
> 
> I'm wondering what checksum-based duplicate detection should/would cover.
> 
> In the following scenario:
> 1. I import a picture.
> 2. I tag it, with "write metadata to file" enabled.
> 3. I re-import the initial picture by mistake.
> 
> After step 2, the file on disk has changed, so comparing file checksums
> at step 3 won't detect the duplicate.
> 
> 
> This is indeed an argument in favour of a "never modify the picture"
> policy.
> 
> Another solution would be to store a checksum of the initial file.
> 
> Yet another solution would be to use a "picture content checksum",
> independent of the file, the resolution, etc. (gqview can find
> duplicates according to a similarity measure; I don't know how this
> works).
> 
> Another useful hint would be EXIF metadata: two pictures taken at the
> exact same time with the same camera may be duplicates.
> 
> 
> What do you think about this? Will it be handled?
> 
> cheers
> 
> SB
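
On the similarity-measure idea above: I don't know how gqview does it
either, but one common approach is a perceptual "average hash": scale the
image way down, threshold each pixel against the mean brightness, and
compare the resulting bit strings.  A rough sketch (again Python with
Pillow; the threshold and file names are invented for illustration), not
a statement of how F-Spot or gqview should actually do it:

    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to size x size grayscale and threshold each pixel
        # against the mean brightness, giving a 64-bit fingerprint.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / float(len(pixels))
        return [1 if p > mean else 0 for p in pixels]

    def hamming_distance(a, b):
        return sum(x != y for x, y in zip(a, b))

    # Two files with only a few differing bits are likely the same
    # picture, even at different resolutions or after re-compression.
    if hamming_distance(average_hash("a.jpg"), average_hash("b.jpg")) <= 5:
        print("probably duplicates")

Unlike an exact content checksum, this tolerates resizing and
re-compression, at the cost of occasional false positives.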
_______________________________________________
F-spot-list mailing list
[email protected]
http://mail.gnome.org/mailman/listinfo/f-spot-list
