On Sunday, 23 September 2007 at 00:43 +0200, Lorenzo Milesi wrote:
> Bill Moseley wrote:
> > Version: 0.3.5-0ubuntu2
> >
> > I have duplicate images in my collection that point to the same file.
> > Is there a way to clean those out of the database?
>
> There is a bug lost somewhere in time! :)
> http://bugzilla.gnome.org/show_bug.cgi?id=169646
> Ciao
Hello,

I'm wondering what the checksum-based duplicate detection should/would cover. Consider the following scenario:

1. I import a picture.
2. I tag it, with "write metadata to file" enabled.
3. I re-import the initial picture by mistake.

After step 2 the file on disk has changed, so comparing checksums at step 3 won't detect the duplicate.

This is indeed an argument in favour of a "never modify the picture" policy. Another solution would be to store a checksum of the initial file. Yet another would be a "picture content checksum", independent of the file, the resolution, etc. (gqview can find duplicates according to a similarity measure; I don't know how that works). A rough sketch of the idea is in the PS below.

Another useful hint would be EXIF metadata: two pictures taken at the exact same time with the same camera may be duplicates.

What do you think about this? Will it be handled?

cheers
SB
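PS: to make the idea concrete, here is a rough, untested sketch of a "picture content checksum" combined with the EXIF-timestamp hint. This is not F-Spot code; it assumes Python with the Pillow library, and all function names are mine.

import hashlib
from PIL import Image  # assumes the Pillow library is available

def content_checksum(path):
    """Hash the decoded pixel data instead of the raw file bytes,
    so rewriting tags/EXIF into the file does not change the hash."""
    with Image.open(path) as img:
        pixels = img.convert("RGB").tobytes()  # normalise the mode first
    return hashlib.sha1(pixels).hexdigest()

def exif_timestamp(path):
    """Return the EXIF DateTimeOriginal value, or None if absent."""
    with Image.open(path) as img:
        exif = img.getexif()
    # 0x8769 is the Exif IFD pointer, 0x9003 is DateTimeOriginal
    return exif.get_ifd(0x8769).get(0x9003)

def may_be_duplicates(path_a, path_b):
    """Same pixel hash, or same camera timestamp, hints at a duplicate."""
    if content_checksum(path_a) == content_checksum(path_b):
        return True
    ts_a, ts_b = exif_timestamp(path_a), exif_timestamp(path_b)
    return ts_a is not None and ts_a == ts_b

Note that this only ignores metadata rewrites; a resized or re-encoded copy would still hash differently, which is where a similarity measure like gqview's would be needed.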
