On 04/09/2015 02:44 PM, Rob Z. Smith wrote:
> Not that I want to discourage anyone building facilities into dt if they
> wish, but if there is a system problem where the NAS drive is too slow and
> local SSD has to be used for speed, I tend to think addressing it at the
> system level rather than within the dt application is best. Personally I
> store my o/s, the dt application code, home directory and dt database etc.
> on SSD and use a NAS to hold just the images. This architecture works well -
> I can read and write raw images to the NAS at about 4 images per second even
> on my pretty humble system. If it was too slow handling the images
> themselves I think I would first get a better NAS and network link, and if
> that was still not fast enough I might front up the NAS with SSD using
> bcache or similar - that way I would get the SSD speed benefits on a NAS
> volume for all applications using it, not just dt.
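[A sketch of the "front up the NAS with SSD" idea from the quote above. Note that bcache only works on block devices, so for a plain NFS share the closest equivalent is FS-Cache via cachefilesd. All host names, paths, and device names here are placeholders, and the package/service names assume a Debian-style system - adapt to taste.]

```shell
# Hypothetical setup: cache an NFS-mounted NAS share on a local SSD.
# Placeholders throughout - not from the original mail.

# 1. Install and enable the FS-Cache userspace daemon.
sudo apt-get install cachefilesd
# In /etc/cachefilesd.conf, point the cache at a directory on the SSD:
#   dir /ssd/fscache
sudo systemctl enable --now cachefilesd

# 2. Mount the NFS share with the 'fsc' option so reads are cached locally.
sudo mount -t nfs -o fsc nas.example.com:/export/images /mnt/images

# If the NAS volume is exposed as a block device instead (e.g. iSCSI),
# bcache itself applies: make the SSD a cache device and back it with
# the NAS block device.
#   sudo make-bcache -C /dev/ssd_partition -B /dev/nas_block_device
```

Either way the caching happens below the application, so every program using the volume benefits, which is Rob's point.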
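[On the question of how to compare storage backends: darktable's debug output (the [dev_process_preview] line quoted further down) appears to time pixel pipeline processing, not disk I/O, so a rough way to compare a local HD, local SSD, and an NFS mount is to time a sequential read of the same raw file from each location. The sketch below is illustrative and not part of darktable; for a fair comparison the page cache should be dropped between runs, e.g. with `echo 3 | sudo tee /proc/sys/vm/drop_caches`.]

```python
import os
import tempfile
import time


def time_read(path, bufsize=1 << 20):
    """Read a file sequentially in 1 MB chunks; return (seconds, bytes_read)."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(bufsize)
            if not chunk:
                break
            total += len(chunk)
    return time.perf_counter() - start, total


if __name__ == "__main__":
    # Demo on a temporary ~32 MB file standing in for a raw image.
    # In practice, point this at the same .NEF/.CR2 on each storage backend.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(os.urandom(32 * 1024 * 1024))
        name = tmp.name
    try:
        secs, nbytes = time_read(name)
        print(f"{nbytes / 1e6:.0f} MB in {secs:.3f} s "
              f"-> {nbytes / secs / 1e6:.0f} MB/s")
    finally:
        os.unlink(name)
```

Running it against the same file on each backend (with caches dropped) gives comparable MB/s figures for the load step of the workflow.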
Hi Rob,

your point of view is really interesting. I thought this new feature was worthwhile because many people keep their pictures on a NAS or a remote server. Reading your reply, though, I'm curious how much a local copy can actually speed up the workflow.

The step where dt is sometimes slow for me is scrolling through a large collection (thumbnails), but I don't think a local copy helps there (am I wrong?). When I switch to darkroom the image is loaded into memory, and loading from a local copy on SSD should be faster - but the point is: how much faster? Is there a way to get performance feedback on image loading, so I can compare a local HD, a local SSD, remote NFS, and other remote protocols?

If I run darktable -d perf I can read the line:

[dev_process_preview] pixel pipeline processing took 0.078 secs (0.112 CPU)

Is that the total time needed to load the image into memory?

Don't forget that in the proposed use case, when I start a session I have to copy the images locally and then sync them back at the end, so that time needs to be counted too.

Last, I've never used bcache, but it sounds like a good tip for expert users :)

Ivan
_______________________________________________
Darktable-users mailing list
Darktable-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/darktable-users