On 2017-01-17 12:25 PM, Patrick Shanahan wrote:
* I. Ivanov <[email protected]> [01-17-17 11:52]:

On 2017-01-16 02:15 AM, Tobias Ellinghaus wrote:
Am Sonntag, 15. Januar 2017, 22:22:12 CET schrieb I. Ivanov:
Hi Guys,

I stumbled on some odd behavior with DT 2.2.1 on Ubuntu 16.04. I did the
following:

1. Had a folder on the local drive with 200+ images, all corrected;
everything was good.
2. Removed the images from the collection.
3. Copied the images to a NAS share (my archive location).
4. Imported them and noticed that not all thumbnails looked right.
Did you ever copy back an older library.db? What you describe can happen when
image ids get "reused" internally, i.e., when there was another image before
with the same id which put its thumbnail into the cache.
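The reuse scenario described above can be sketched in a few lines. This is hypothetical illustration code, not darktable's actual implementation: it only shows how a cache keyed by image id alone can hand back a stale thumbnail when an id is reassigned without the old entry being invalidated.

```python
# Hypothetical sketch (not darktable's code) of an id-keyed thumbnail cache.
class ThumbCache:
    def __init__(self):
        self._store = {}  # image id -> thumbnail data

    def put(self, image_id, thumb):
        self._store[image_id] = thumb

    def get(self, image_id):
        # Lookup is by id only: an entry left over from a deleted image
        # that happened to have the same id is returned unchanged.
        return self._store.get(image_id)

cache = ThumbCache()
cache.put(42, "thumbnail of old_photo.raw")  # original image got id 42

# The image is removed from the library and id 42 is later reassigned
# to a newly imported file, but the cache entry was never invalidated:
print(cache.get(42))  # still the old image's thumbnail
```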

[...]
I only have one library. I process the images while they are on the local HDD.
Once they are done, I remove them from there, move them to the NAS (it is
actually a USB drive connected to the router) and re-import the images from
this "archive" location.

I use this workflow to gain speed: the local drive is an SSD, and the
"NAS" is slower.
Why not use dt to "move" the files and continue maintaining them in dt's
library.db?  Seems like a lot of extra work the way you describe.

I have several sub-folders, and when I copy them everything goes together: RAW files + high-res JPEG exports + low-res JPEGs, and in some cases even movies. Currently I just grab the complete folder. If I did "move" in dt, I would still have to copy everything else manually.
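For what it's worth, the "grab the complete folder" step amounts to a single recursive copy of the session folder. The sketch below is self-contained for illustration (it builds a throwaway directory layout); in real use the source would be the SSD session folder and the destination the NAS mount point:

```shell
set -e
work=$(mktemp -d)

# Stand-in for a session folder: RAWs at the top, exports in a sub-folder.
mkdir -p "$work/session/exports"
touch "$work/session/IMG_0001.raw" "$work/session/exports/IMG_0001.jpg"
mkdir -p "$work/archive"

# One recursive copy carries the RAWs and every export sub-folder together.
cp -a "$work/session" "$work/archive/"

ls "$work/archive/session"
```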

____________________________________________________________________________
darktable user mailing list
to unsubscribe send a mail to [email protected]
