On 20.05.2016 at 17:20, Dr. Marc Arnold Bach wrote:
Hi,
that is interesting ... but isn't DT trusting its DB first and the xmp second?
The DB needs to be copied as well, otherwise you wouldn't see any changes after
coming back.
Yes and no. With the xmp-based workflow I referred to either scanning the
folders when dt is started (there's a config option for this) or to
reimporting every film roll before editing. I would guess the former is
more convenient, though one could probably find a more dynamic
integration with a little Lua scripting; I'm not sure about this.
With this approach you would not have to copy the db; the only drawback
is that styles are not synced between computers, since these are stored
in the db only. But I am confident this will change some day :-) Of
course, you would have to manage conflicts when files are edited in
parallel on both computers.
Best regards
Chris
Marc
2016-05-20 10:45 GMT+02:00 Christian Mandel <[email protected]>:
On 20.05.2016 at 10:23, Dr. Marc Arnold Bach wrote:
Hi,
I am trying to fit my workflow to my personal needs.
I have four desired work patterns:
A) I use a weak laptop to import photos, and to sort, tag and rate them
in the living room.
B) I go downstairs to a powerful PC with a better display to work with DT's
RAW modules or Gimp.
C) I am offsite, somewhere far away, and can't wait to work with newly
taken raws... I do task A and start B with bad performance on a laptop. Later
I import the results together with untouched pics into the central archive and
maybe start A and B again.
D) I create local copies of old raws on the laptop and take them with me,
working with them maybe together with workflow C's new pictures. Later I
update the sidecars and the home DB by importing again.
A and B are done:
To be able to work with the laptop and PC on the same data, I decided to put
pics and DB on a SAMBA share on a fast file server in my house. I mount the
filesystem on the current machine, all changes go to the NAS, the DB is
fine... If I unmount and boot the other machine, it will find the same DB,
same raws, same paths...
C and D are tricky... with the DB on a disconnected network share, darktable
does not start, because the mountpoint folder /mnt/fotos is owned by root =>
read-only.
No DB can be created, and it crashes...
Even if I created a local DB, discarded it after a while and imported the
raws + sidecars at home again... how do I deal with local copies?
I cannot believe that I am the only one in the world who wants to use a
laptop offline while staying in sync with a local workstation that has access
to an advanced storage solution for centralized backup.
A professional takes pics of employees somewhere, shows first results to the
customer, plays around with the files on the train to use the time. He arrives
at home, rates in the garden, and later, after the kids are in bed, he uses
the workstation to finish the job...
Lightroom (which I used before) was inflexible as well, but at least it
supported a DB per film roll and did not insist that each folder is a
separate film roll...
In exchange it rejected network-based DBs and was slow.
Any ideas? Using sidecars as the info provider is possible but really slow...
I would need a sync between a local DB and a central one...
Darktable has limited support for editing "detached" files. However, the
complex workflow that you describe may require a system designed around
exactly that. I suggest having a look at
http://git-annex.branchable.com/. Of course this is an xmp-based
workflow, but I would guess you have no other option. With git-annex, you
could have your sidecars version-controlled in git, and the raws and output
files (also version-controlled) in git-annex. This is a kind of seamless
integration. Git-annex can take care of distributing your files among your
devices, and you can do things like copying a bunch of files for editing
during a trip and letting the system put them back later. All of your files
will always be visible to dt on all devices, and if file contents are
accessed that are not present locally, the system will try to get them from
wherever a duplicate of the file lies. It can also use several cloud
providers and can store files encrypted.
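To make this concrete, here is a minimal shell sketch of the split described
above: the tiny sidecar files go straight into git, while raw content would be
handled by git-annex. The git-annex calls are shown as comments since they
require git-annex to be installed, and the file names are made up for
illustration.

```shell
# Sidecars in plain git; raw content in git-annex (annex calls shown as
# comments, file names hypothetical).
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email "[email protected]" && git config user.name "You"

# With git-annex, the raw files would additionally be annexed:
#   git annex init "laptop"        # register this machine as a repository
#   git annex add IMG_0001.cr2     # content goes into the annex, not git
#   git annex get IMG_0001.cr2     # fetch the content before travelling
#   git annex drop IMG_0001.cr2    # free local space; a copy stays remote

# The sidecar itself is small and is versioned directly in git:
printf '<x:xmpmeta/>\n' > IMG_0001.cr2.xmp
git add IMG_0001.cr2.xmp
git commit -q -m "import film roll"
git log --oneline
```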
I always wanted to give it a try for photo management but never had the
time, and since I have only one computer, it is enough for me to keep the xmp
files in a git repository.
One could dream about darktable interacting more smartly with changed xmp
files, git and git-annex. If you have some programming skills, you could try
using the internal Lua scripting, e.g. to commit new versions into git
whenever an xmp file is written. I always wanted to try this but I have very
limited time at the moment and very limited programming skills
as well.
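As a rough sketch of that idea (assuming the sidecars already live in a git
repository; a file watcher such as inotifywait could trigger this after every
write, here it is simply run once with made-up file names):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email "[email protected]" && git config user.name "You"
printf 'v1\n' > IMG_0001.cr2.xmp
git add -A && git commit -q -m "initial sidecars"

# simulate darktable rewriting a sidecar during an edit
printf 'v2\n' > IMG_0001.cr2.xmp

# stage all changed sidecars and commit only if something actually changed;
# hooked up to a watcher (e.g. "inotifywait -m -e close_write") this would
# record a new version after every xmp write
git add -- '*.xmp'
git diff --cached --quiet || git commit -q -m "xmp update $(date '+%F %T')"
git log --oneline
```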
Best regards
Chris
Currently I can only think of using DT in a separate offline account and
copying sidecars around.....
Regards
Marc
____________________________________________________________________________
darktable user mailing list
to unsubscribe send a mail to
[email protected]