Re: [darktable-user] Duplicated version number created for existing image

2023-07-27 Thread Dusenberg

No problem, Guillermo, and thanks for your help. It does seem a big
issue, particularly looking at the db, and I do hope some more
developer eyes see the post. In the meantime, do you know in which files
I would find the relevant 'create duplicate' functionality in the source
code, so I can dig into the issue some more? I know nothing about how
the dt code is structured.

Regards
Dusenberg

On 27/07/2023 11:46, Guillermo Rozas wrote:

Hi,
sorry for the late reply, I was a bit busy. Yes, pixls.us
<http://pixls.us> will probably give you more 'developer eyes' to
check the problem. It's probably a bug, so I would be prepared to file
one on GitHub.
Best regards,
Guillermo

On Thu, Jul 27, 2023, 06:21 Dusenberg  wrote:

I have posted this on pixls.us <http://pixls.us> now that it is back up.
Dusenberg

On 26/07/2023 20:39, Dusenberg wrote:

Guillermo

Since my last post, I have extracted data from the dt db image
table for the image and versions concerned from three backup
instances of the dt database going back 3 years:
    a) 2020-12-14 - pre-dt3.4.0, closest I can get to the
original shot date
    b) 2023-07-22 - dt4.2.1, before new duplicate created
    c) 2023-07-24 - dt4.2.1, after new duplicate created showing
duplicate version ''

Hope this sheds more light on the issue.

pre-dt3.4.0-library.db from backup archive 2020-12-14
--
select key, value from db_info;
key    value
version    30

select id, group_id, film_id,filename,version,max_version from
images where filename = "20200325_BonallyWoods_NIK_1413.NEF"
order by id;

id    group_id    film_id    filename    version    max_version
3574    3574    135 20200325_BonallyWoods_NIK_1413.NEF    0    7
3578    3574    135 20200325_BonallyWoods_NIK_1413.NEF    1    7
3582    3574    135 20200325_BonallyWoods_NIK_1413.NEF    2    7
3583    3574    135 20200325_BonallyWoods_NIK_1413.NEF    3    7
3789    3574    135 20200325_BonallyWoods_NIK_1413.NEF    4    7
3790    3574    135 20200325_BonallyWoods_NIK_1413.NEF    5    7
3791    3574    135 20200325_BonallyWoods_NIK_1413.NEF    6    7
3792    3574    135 20200325_BonallyWoods_NIK_1413.NEF    7    7


library.db-snp-20230722155434 from backup archive BEFORE new
duplicate created

--
select key, value from db_info;
key    value
version    37

select id, group_id, film_id,filename,version,max_version from
images where filename = "20200325_BonallyWoods_NIK_1413.NEF"
order by id;

id    group_id    film_id    filename version    max_version
3574    3574    135 20200325_BonallyWoods_NIK_1413.NEF    0    7
3578    3574    135 20200325_BonallyWoods_NIK_1413.NEF    1    7
3582    3574    135 20200325_BonallyWoods_NIK_1413.NEF    2    7
3583    3574    135 20200325_BonallyWoods_NIK_1413.NEF    3    7
3789    3574    135 20200325_BonallyWoods_NIK_1413.NEF    4    7
3790    3574    135 20200325_BonallyWoods_NIK_1413.NEF    5    7
3791    3574    135 20200325_BonallyWoods_NIK_1413.NEF    6    7
3792    3574    135 20200325_BonallyWoods_NIK_1413.NEF    7    7


library.db-snp-20230724163328 from current backup AFTER new
duplicate created

---
select key, value from db_info;
key    value
version    37

select id, group_id, film_id,filename,version,max_version from
images where filename = "20200325_BonallyWoods_NIK_1413.NEF"
order by id;

id      group_id  film_id  filename                            version  max_version
3574    3574      135      20200325_BonallyWoods_NIK_1413.NEF  0        0
3578    3574      135      20200325_BonallyWoods_NIK_1413.NEF  1        3
3582    3574      135      20200325_BonallyWoods_NIK_1413.NEF  2        3
3583    3574      135      20200325_BonallyWoods_NIK_1413.NEF  3        3   <- duplicate
3789    3574      135      20200325_BonallyWoods_NIK_1413.NEF  4        3
3790    3574      135      20200325_BonallyWoods_NIK_1413.NEF  5        3
3791    3574      135      20200325_BonallyWoods_NIK_1413.NEF  6        3
3792    3574      135      20200325_BonallyWoods_NIK_1413.NEF  7        3
9744    3574      135      20200325_BonallyWoods_NIK_1413.NEF  3        3   <- duplicate

The dt database has an index, "images_filename_index" ON "images"
( "filename", "version" );  which means that a 'duplicate' is
related to a particular image filename. There are two
implications from this which may be relevant to the issue I raised:

a) The index is not unique; it allows duplicates. Therefore
the database allows (and cannot trap) insertion of a new image

Re: [darktable-user] Duplicated version number created for existing image

2023-07-27 Thread Dusenberg

I have posted this on pixls.us now that it is back up.
Dusenberg

On 26/07/2023 20:39, Dusenberg wrote:

Guillermo

Since my last post, I have extracted data from the dt db image table
for the image and versions concerned from three backup instances of
the dt database going back 3 years:
    a) 2020-12-14 - pre-dt3.4.0, closest I can get to the original
shot date
    b) 2023-07-22 - dt4.2.1, before new duplicate created
    c) 2023-07-24 - dt4.2.1, after new duplicate created showing
duplicate version ''

Hope this sheds more light on the issue.

pre-dt3.4.0-library.db from backup archive 2020-12-14
--
select key, value from db_info;
key    value
version    30

select id, group_id, film_id,filename,version,max_version from images
where filename = "20200325_BonallyWoods_NIK_1413.NEF" order by id;

id    group_id    film_id    filename version    max_version
3574    3574    135    20200325_BonallyWoods_NIK_1413.NEF 0    7
3578    3574    135    20200325_BonallyWoods_NIK_1413.NEF 1    7
3582    3574    135    20200325_BonallyWoods_NIK_1413.NEF 2    7
3583    3574    135    20200325_BonallyWoods_NIK_1413.NEF 3    7
3789    3574    135    20200325_BonallyWoods_NIK_1413.NEF 4    7
3790    3574    135    20200325_BonallyWoods_NIK_1413.NEF 5    7
3791    3574    135    20200325_BonallyWoods_NIK_1413.NEF 6    7
3792    3574    135    20200325_BonallyWoods_NIK_1413.NEF 7    7


library.db-snp-20230722155434 from backup archive BEFORE new duplicate
created
--
select key, value from db_info;
key    value
version    37

select id, group_id, film_id,filename,version,max_version from images
where filename = "20200325_BonallyWoods_NIK_1413.NEF" order by id;

id    group_id    film_id    filename version    max_version
3574    3574    135    20200325_BonallyWoods_NIK_1413.NEF 0    7
3578    3574    135    20200325_BonallyWoods_NIK_1413.NEF 1    7
3582    3574    135    20200325_BonallyWoods_NIK_1413.NEF 2    7
3583    3574    135    20200325_BonallyWoods_NIK_1413.NEF 3    7
3789    3574    135    20200325_BonallyWoods_NIK_1413.NEF 4    7
3790    3574    135    20200325_BonallyWoods_NIK_1413.NEF 5    7
3791    3574    135    20200325_BonallyWoods_NIK_1413.NEF 6    7
3792    3574    135    20200325_BonallyWoods_NIK_1413.NEF 7    7


library.db-snp-20230724163328 from current backup AFTER new duplicate
created
---
select key, value from db_info;
key    value
version    37

select id, group_id, film_id,filename,version,max_version from images
where filename = "20200325_BonallyWoods_NIK_1413.NEF" order by id;

id      group_id  film_id  filename                            version  max_version
3574    3574      135      20200325_BonallyWoods_NIK_1413.NEF  0        0
3578    3574      135      20200325_BonallyWoods_NIK_1413.NEF  1        3
3582    3574      135      20200325_BonallyWoods_NIK_1413.NEF  2        3
3583    3574      135      20200325_BonallyWoods_NIK_1413.NEF  3        3   <- duplicate
3789    3574      135      20200325_BonallyWoods_NIK_1413.NEF  4        3
3790    3574      135      20200325_BonallyWoods_NIK_1413.NEF  5        3
3791    3574      135      20200325_BonallyWoods_NIK_1413.NEF  6        3
3792    3574      135      20200325_BonallyWoods_NIK_1413.NEF  7        3
9744    3574      135      20200325_BonallyWoods_NIK_1413.NEF  3        3   <- duplicate

The dt database has an index, "images_filename_index" ON "images" (
"filename", "version" );  which means that a 'duplicate' is related to
a particular image filename. There are two implications from this
which may be relevant to the issue I raised:

a) The index is not unique; it allows duplicates. Therefore the
database allows (and cannot trap) insertion of a new image table row
with a version number that already exists for the given filename
value. The fact I have a duplicate version for a filename suggests the
dt code also does not trap this.

b) This index assumes filenames are unique across the whole dt
database, which probably is not realistic given how cameras from the
same manufacturer can generate common filenames.

While a unique id is given to each imported image by the dt db to
ensure images with the same filename are permitted and can be handled,
it seems the 'duplicate image' functionality does not recognise this
potential.
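
As a cross-check, a small script like the one below will list every
(filename, version) pair that occurs more than once. It is only a sketch
built from the table and column names in the dumps above, so run it
against a copy of library.db with darktable closed:

#!/usr/bin/env python3
"""List (filename, version) pairs that occur more than once in library.db.
Sketch only - run against a copy of the database while darktable is closed."""
import sqlite3
import sys

def find_duplicate_versions(db_path):
    con = sqlite3.connect(db_path)
    try:
        # group the images table by (filename, version) and keep the groups
        # that contain more than one row
        return con.execute(
            "SELECT filename, version, COUNT(*) AS n, GROUP_CONCAT(id) AS ids "
            "FROM images GROUP BY filename, version HAVING COUNT(*) > 1 "
            "ORDER BY filename, version"
        ).fetchall()
    finally:
        con.close()

if __name__ == "__main__":
    for filename, version, n, ids in find_duplicate_versions(sys.argv[1]):
        print(f"{filename}  version {version}: {n} rows (ids {ids})")

On the 2023-07-24 snapshot above it should report the single pair
(20200325_BonallyWoods_NIK_1413.NEF, 3) with ids 3583 and 9744.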

Regards
Dusenberg

On 26/07/2023 09:59, Dusenberg wrote:

Guillermo,

Answers to your questions:

a) xmp's are named '<filename>_<$VERSION>.<RAW extension>.xmp'
    eg:
original raw: '20200325_BonallyWoods_NIK_1413.NEF'
version 3 xmp: '20200325_BonallyWoods_NIK_1413_03.NEF.xmp'

b) I've checked the files and the new duplicate '3' has overwritten
the existing xmp for the previous version 3. Also all xmp files in
that group now have new modified dates - 24 July 2023,

Re: [darktable-user] Duplicated version number created for existing image

2023-07-26 Thread Dusenberg

Guillermo

Since my last post, I have extracted data from the dt db image table for
the image and versions concerned from three backup instances of the dt
database going back 3 years:
    a) 2020-12-14 - pre-dt3.4.0, closest I can get to the original shot
date
    b) 2023-07-22 - dt4.2.1, before new duplicate created
    c) 2023-07-24 - dt4.2.1, after new duplicate created showing
duplicate version ''

Hope this sheds more light on the issue.

pre-dt3.4.0-library.db from backup archive 2020-12-14
--
select key, value from db_info;
key    value
version    30

select id, group_id, film_id,filename,version,max_version from images
where filename = "20200325_BonallyWoods_NIK_1413.NEF" order by id;

id    group_id    film_id    filename version    max_version
3574    3574    135    20200325_BonallyWoods_NIK_1413.NEF    0 7
3578    3574    135    20200325_BonallyWoods_NIK_1413.NEF    1 7
3582    3574    135    20200325_BonallyWoods_NIK_1413.NEF    2 7
3583    3574    135    20200325_BonallyWoods_NIK_1413.NEF    3 7
3789    3574    135    20200325_BonallyWoods_NIK_1413.NEF    4 7
3790    3574    135    20200325_BonallyWoods_NIK_1413.NEF    5 7
3791    3574    135    20200325_BonallyWoods_NIK_1413.NEF    6 7
3792    3574    135    20200325_BonallyWoods_NIK_1413.NEF    7 7


library.db-snp-20230722155434 from backup archive BEFORE new duplicate
created
--
select key, value from db_info;
key    value
version    37

select id, group_id, film_id,filename,version,max_version from images
where filename = "20200325_BonallyWoods_NIK_1413.NEF" order by id;

id    group_id    film_id    filename version    max_version
3574    3574    135    20200325_BonallyWoods_NIK_1413.NEF    0 7
3578    3574    135    20200325_BonallyWoods_NIK_1413.NEF    1 7
3582    3574    135    20200325_BonallyWoods_NIK_1413.NEF    2 7
3583    3574    135    20200325_BonallyWoods_NIK_1413.NEF    3 7
3789    3574    135    20200325_BonallyWoods_NIK_1413.NEF    4 7
3790    3574    135    20200325_BonallyWoods_NIK_1413.NEF    5 7
3791    3574    135    20200325_BonallyWoods_NIK_1413.NEF    6 7
3792    3574    135    20200325_BonallyWoods_NIK_1413.NEF    7 7


library.db-snp-20230724163328 from current backup AFTER new duplicate
created
---
select key, value from db_info;
key    value
version    37

select id, group_id, film_id,filename,version,max_version from images
where filename = "20200325_BonallyWoods_NIK_1413.NEF" order by id;

id      group_id  film_id  filename                            version  max_version
3574    3574      135      20200325_BonallyWoods_NIK_1413.NEF  0        0
3578    3574      135      20200325_BonallyWoods_NIK_1413.NEF  1        3
3582    3574      135      20200325_BonallyWoods_NIK_1413.NEF  2        3
3583    3574      135      20200325_BonallyWoods_NIK_1413.NEF  3        3   <- duplicate
3789    3574      135      20200325_BonallyWoods_NIK_1413.NEF  4        3
3790    3574      135      20200325_BonallyWoods_NIK_1413.NEF  5        3
3791    3574      135      20200325_BonallyWoods_NIK_1413.NEF  6        3
3792    3574      135      20200325_BonallyWoods_NIK_1413.NEF  7        3
9744    3574      135      20200325_BonallyWoods_NIK_1413.NEF  3        3   <- duplicate

The dt database has an index, "images_filename_index" ON "images" (
"filename", "version" );  which means that a 'duplicate' is related to a
particular image filename. There are two implications from this which
may be relevant to the issue I raised:

a) The index is not unique; it allows duplicates. Therefore the
database allows (and cannot trap) insertion of a new image table row
with a version number that already exists for the given filename value.
The fact I have a duplicate version for a filename suggests the dt code
also does not trap this.

b) This index assumes filenames are unique across the whole dt database,
which probably is not realistic given how cameras from the same
manufacturer can generate common filenames.

While a unique id is given to each imported image by the dt db to ensure
images with the same filename are permitted and can be handled, it seems
the 'duplicate image' functionality does not recognise this potential.
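
To illustrate point (a): a unique index would make the database itself
refuse the second 'version 3' row. The snippet below is only an
illustration on a throwaway in-memory schema, not a proposed change to
darktable - and the (film_id, filename, version) scoping is an assumption
on my part, chosen so that identical filenames in different film rolls
stay legal (see point (b)):

#!/usr/bin/env python3
"""Show, on a scratch copy of the schema, how a unique index would trap the
duplicate-version insert described above. Illustration only - darktable's
real schema and upgrade path are not touched."""
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE images (id INTEGER PRIMARY KEY, film_id INTEGER, "
            "filename TEXT, version INTEGER)")
# unlike darktable's non-unique images_filename_index, this index rejects
# a second row with the same film roll, filename and version
con.execute("CREATE UNIQUE INDEX images_film_filename_version "
            "ON images (film_id, filename, version)")

con.execute("INSERT INTO images VALUES (3583, 135, "
            "'20200325_BonallyWoods_NIK_1413.NEF', 3)")
try:
    con.execute("INSERT INTO images VALUES (9744, 135, "
                "'20200325_BonallyWoods_NIK_1413.NEF', 3)")
except sqlite3.IntegrityError as err:
    print("second 'version 3' rejected:", err)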

Regards
Dusenberg

On 26/07/2023 09:59, Dusenberg wrote:

Guillermo,

Answers to your questions:

a) xmp's are named '<filename>_<$VERSION>.<RAW extension>.xmp'
    eg:
original raw: '20200325_BonallyWoods_NIK_1413.NEF'
version 3 xmp: '20200325_BonallyWoods_NIK_1413_03.NEF.xmp'

b) I've checked the files and the new duplicate '3' has overwritten
the existing xmp for the previous version 3. Also all xmp files in
that group now have new modified dates - 24 July 2023, when I
created the new duplicate.

Also I didn't mention that the NIK_1413 RAW and duplicat

Re: [darktable-user] Duplicated version number created for existing image

2023-07-26 Thread Dusenberg

Guillermo,

Answers to your questions:

a) xmp's are named '<filename>_<$VERSION>.<RAW extension>.xmp'
    eg:
original raw: '20200325_BonallyWoods_NIK_1413.NEF'
version 3 xmp: '20200325_BonallyWoods_NIK_1413_03.NEF.xmp'

b) I've checked the files and the new duplicate '3' has overwritten the
existing xmp for the previous version 3. Also all xmp files in that
group now have new modified dates - 24 July 2023, when I created the
new duplicate.
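
That overwrite follows directly from the naming rule in (a): the sidecar
path is derived only from the raw filename and the version number, so a
second 'version 3' necessarily maps onto the same xmp file as the first.
A rough reconstruction of the rule as I understand it from the example
above (no suffix for version 0, a zero-padded two-digit suffix otherwise,
default sidecar settings assumed):

from pathlib import Path

def sidecar_path(raw_path: str, version: int) -> Path:
    """Sidecar name as implied by the example above: version 0 keeps the
    plain '<raw>.xmp' name, duplicates get a zero-padded '_NN' suffix."""
    raw = Path(raw_path)
    if version == 0:
        return raw.with_name(raw.name + ".xmp")
    return raw.with_name(f"{raw.stem}_{version:02d}{raw.suffix}.xmp")

# Both image id 3583 (the original version 3) and id 9744 (the new
# duplicate that was also numbered 3) resolve to the same sidecar file:
print(sidecar_path("20200325_BonallyWoods_NIK_1413.NEF", 3))
# 20200325_BonallyWoods_NIK_1413_03.NEF.xmp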

Also I didn't mention that the NIK_1413 RAW and duplicates are in a
single group.

Thanks for your time.
Dusenberg

On 26/07/2023 03:41, Guillermo Rozas wrote:

That sounds strange. How are the xmp files named? If the duplicate
uses a previously used version number, does it also overwrite the
corresponding xmp sidecar?
Regards,
Guillermo

On Tue, Jul 25, 2023 at 1:06 PM Dusenberg  wrote:

Hi Guillermo

Yes the original and all duplicates were in the database before
making the new duplicate.

Regards
Dusenberg

On 25/07/2023 15:03, Guillermo Rozas wrote:

Hi,
were the original and all the duplicates present in the database
before making the duplicate?
    Regards,
Guillermo

On Tue, Jul 25, 2023 at 5:57 AM Dusenberg 
wrote:

Originally posted to darktable-dev list in error.

dt 4.2.1 (OBS), Linux Mint 21, Ubuntu 22.04 jammy

I have an image from March 2020 developed in darktable. I
went back to it today to try another edit on it (it's a
monochrome rendition that I just can't get 'right').

However, today when I created a duplicate of this 2020 image
in dt 4.2.1, it was given version number '3' - which already
exists for that image (there are seven pre-existing
duplicates). I see that dt has also given the new duplicate a
different 'image id' to the original RAW image. I've never
seen this before, although it's not often I go back in time
like this.

My workflow is that I always create a new version (duplicate)
of the base RAW for a different edit so I can trace back any
final output that may result. My filenaming system is
'___' where
filename is composed of ''.  Original camera images are renamed during
download onto my workstation via a bespoke script (ie outside
dt).  I use variables in the dt export module to ensure any
output follows this format.  This provides unique
identification of every image and its derivatives across my
libraries, even when intermediate tiffs are involved in say,
focus stacks.

This is critical for me - I can't have two different edits of
a RAW with the same filename!  Why has it happened and what
can I do about it?

Thanks


darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org



Re: [darktable-user] Duplicated version number created for existing image

2023-07-25 Thread Dusenberg

Hi Guillermo

Yes the original and all duplicates were in the database before making
the new duplicate.

Regards
Dusenberg

On 25/07/2023 15:03, Guillermo Rozas wrote:

Hi,
were the original and all the duplicates present in the database
before making the duplicate?
Regards,
Guillermo

On Tue, Jul 25, 2023 at 5:57 AM Dusenberg  wrote:

Originally posted to darktable-dev list in error.

dt 4.2.1 (OBS), Linux Mint 21, Ubuntu 22.04 jammy

I have an image from March 2020 developed in darktable. I went
back to it today to try another edit on it (it's a monochrome
rendition that I just can't get 'right').

However, today when I created a duplicate of this 2020 image in dt
4.2.1, it was given version number '3' - which already exists for
that image (there are seven pre-existing duplicates). I see that
dt has also given the new duplicate a different 'image id' to the
original RAW image. I've never seen this before, although it's not
often I go back in time like this.

My workflow is that I always create a new version (duplicate) of
the base RAW for a different edit so I can trace back any final
output that may result. My filenaming system is
'___' where
filename is composed of ''.  Original camera images are renamed during download
onto my workstation via a bespoke script (ie outside dt).  I use
variables in the dt export module to ensure any output follows
this format.  This provides unique identification of every image
and its derivatives across my libraries, even when intermediate
tiffs are involved in say, focus stacks.

This is critical for me - I can't have two different edits of a
RAW with the same filename!  Why has it happened and what can I do
about it?

Thanks

darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org



[darktable-user] Duplicated version number created for existing image

2023-07-25 Thread Dusenberg

Originally posted to darktable-dev list in error.

dt 4.2.1 (OBS), Linux Mint 21, Ubuntu 22.04 jammy

I have an image from March 2020 developed in darktable. I went back to
it today to try another edit on it (it's a monochrome rendition that I
just can't get 'right').

However, today when I created a duplicate of this 2020 image in dt
4.2.1, it was given version number '3' - which already exists for that
image (there are seven pre-existing duplicates). I see that dt has also
given the new duplicate a different 'image id' to the original RAW
image. I've never seen this before, although it's not often I go back in
time like this.

My workflow is that I always create a new version (duplicate) of the
base RAW for a different edit so I can trace back any final output that
may result. My filenaming system is '___' where filename is composed of
''.  Original camera
images are renamed during download onto my workstation via a bespoke
script (ie outside dt).  I use variables in the dt export module to
ensure any output follows this format.  This provides unique
identification of every image and its derivatives across my libraries,
even when intermediate tiffs are involved in say, focus stacks.

This is critical for me - I can't have two different edits of a RAW with
the same filename!  Why has it happened and what can I do about it?

Thanks


darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org



Re: [darktable-user] Tokina lens not detected

2022-02-08 Thread Dusenberg

  
  
I had the same problem with a Tokina AT-X 24-70mm f/2.8 PRO FX on a D810.
I got the answer here:
https://dev.exiv2.org/projects/exiv2/wiki/Lens_Recognition_in_Exiv2_v026_(and_later)/

The solution needs exiv2 0.26+, which at the time meant it wouldn't work
on Ubuntu/Mint, and I eventually moved to Leap 15.

It works perfectly for darktable, but not for Exiftool and other apps
that don't use exiv2, or which only look in specific exif fields; eg most
image viewers and Digikam. Here's a test I did with a .exiv2 file I set
up, mapping the id number the camera gives for the Tokina lens to the
correct description:
jar@photoworkstation:~/Desktop> exiv2 --version | grep exiv2
exiv2 0.27.2

jar@photoworkstation:~/Desktop> cat ~/.exiv2
[Nikon]
137=Tokina AT-X 24-70mm f/2.8 PRO FX

jar@photoworkstation:~/Desktop> exiv2 -pv --grep lens/i NIK_1486.NEF
0x0083 Nikon3   LensType      Byte       1  6
0x0084 Nikon3   Lens          Rational   4  240/10 700/10 28/10 28/10
0x008b Nikon3   LensFStops    Undefined  4  72 1 12 0
0x000c NikonLd3 LensIDNumber  Byte       1  137   <- not translated by exiv2!
0x000d NikonLd3 LensFStops    Byte       1  72
**DOESN'T WORK.

BUT... using the -pa option to exiv2:
jar@photoworkstation:~/Desktop> exiv2 -pa --grep lens/i NIK_1486.NEF
Exif.Nikon3.LensType        Byte       1  D G
Exif.Nikon3.Lens            Rational   4  24-70mm F2.8
Exif.Nikon3.LensFStops      Undefined  4  6
Exif.NikonLd3.LensIDNumber  Byte       1  Tokina AT-X 24-70mm f/2.8 PRO FX
Exif.NikonLd3.LensFStops    Byte       1  F6.0
**DOES WORK.

Good luck

  
On 08/02/2022 18:13, Jean-Luc CECCOLI wrote:

Hello,

Yes, the data is specific to a sensor size.
If you want your lens to be recognized, you need to create the data.
If you happen to find a lens with exactly the same characteristics, you
could copy the data adapting the fields related to your lens.
It was my case with the Nikkor 20mm f/2.8 AIS similar to the AF one.

Rgrds,

J.-Luc
  
  
Sent: 7 February 2022 at 22:09
From: Michael Rasmussen
To: darktable-user@lists.darktable.org
Subject: Re: [darktable-user] Tokina lens not detected

On Mon, 7 Feb 2022 20:55:27 + Ludger Bolmerg wrote:

Yes, I was using a FX camera but I assumed that would not matter.
Probably I was wrong. I see the same entry in my lensfun file.

Without claiming to be an expert in these matters I am quite convinced
that the lens corrections are specific to the sensor dimension, which is
why the crop factor is part of the file.
  
  -- 
  Hilsen/Regards
  Michael Rasmussen
  
  Get my public GnuPG keys:
  michael  rasmussen  cc
  https://pgp.key-server.io/pks/lookup?search=0xD3C9A00E
  mir  datanom  net
  https://pgp.key-server.io/pks/lookup?search=0xE501F51C
  mir  miras  org
  https://pgp.key-server.io/pks/lookup?search=0xE3E80917
  --
  /usr/games/fortune -es says:
  Not intended to diagnose, treat, cure or prevent any disease.
  
  

  




darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






Re: [darktable-user] Windows or Linux?

2020-07-23 Thread Dusenberg

  
  
Re your [ASIDE]: I recently built a similarly spec'd photo workstation
with a Ryzen 3900X 12-core, 32GB 3200MHz DDR4, a Radeon RX 5500XT 8GB
GDDR6 GPU and 2 x NVMe M.2 SSDs, on an ASRock X570 Phantom Gaming 4 -
all PCIe 4 - running OpenSUSE Leap 15.1. It ABSOLUTELY FLIES!! Edit
actions in DT on 36Mpix files that used to take a long time on my Linux
Mint laptop (with SSD), and which really got in the way of creativity,
now happen in real time, and exports of 16-bit tifs take 3-5 secs.
Working on files on the NAS over a 1GbE network has little performance
impact as the database and cache are on the local SSD, except for
export, which takes around twice as long. I spend much less time in
front of the computer, and get better results. Well worth it. Good luck.
  
  
  

On 21/07/2020 23:09, Top Rock Photography wrote:

I also run Dt on a ten years old system, (AMD Phenom II X6), but with a
few upgrades, (4GB nVidia GTX 760, and a 4×2TB HDD RAID5 SATA III
storage, 1TB SSD SATA III system drive, 32 GB 1,333MHz DDR3 RAM), and it
runs fairly fast under Ubuntu 20.04. Also, 66MHz PCI 2.1 bus. The GPU is
PCIe 3.0 capable, but the MB is only PCIe 2.0 capable.

I recently did a clean Dt ver 3.0.2 install, and imported 68,000+ raw
images, and it took about an hour. I hear horror stories of Lr users
taking several hours to import about 1,000 images. Northrup would say
that he would come home from a shoot, set his Lr to import the files,
and they may be done by the morning, (having run for 7-8 hours).

Now that is a Lr CC on Windows comparison. I have no idea how Dt on
Windows will do. I do know that ten years ago, my Linux system would run
circles around any similarly spec'ed windows machine, but Windows has
come a long way since then. The fact that Dt uses HW acceleration,
including GPU, and that it does multithreading, (things that Lr does not
do), puts it ahead.

[ASIDE] I hope to upgrade my computer soon, to a PCIe 4.0 MB, a 12 to 24
core Ryzen, (24 to 48 threads), and 32GB ECC 2,666MHz DDR4 RAM, but I
will probably keep the same video card for now, (as well as SSD and HDD
RAID). Maybe, possibly, but not likely anytime soon, get an M.2 system
drive, and a 6 to 8 GB PCIe 4.0 video card. [/ASIDE]

Sincerely,
Karim Hosein
Top Rock Photography
754.999.1652
  

  




darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






[darktable-user] Last edit date?

2020-07-01 Thread Dusenberg

  
  
Anyone know how to display the date/time of the last
  edit on an image in DT? I like to go back over my edits and tweak
  them, so I have a lot of versions, and it isn't always the biggest
  version number which was edited last.  So being able to see when I
  last edited an image is quite important.  Can't find any reference
  to this anywhere.



darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






Re: [darktable-user] Lens not recognized in Darktable 3.0.2

2020-06-19 Thread Dusenberg

  
  
I had the same problem on my Nikon when I bought a
  Tokina 24-70mm F2.8 AT-X Pro for it. The issue is that the camera
  doesn't recognise the foreign lens and so does not populate the
  necessary exif fields with the lens data. It's not a Darktable
  issue.
  
  This is my understanding of the situation. 
  
The camera gets an id number from an attached lens ('141' in your
case, '137' in mine) and uses it to look up the lens data in its
internal table, which it then includes in the exif maker data in a raw
file. Darktable (and many other photo apps) use exiv2 to query the
exif data in a raw file, and exiv2 returns the data as set by the
camera. However, when a Nikon (and maybe other makes) identifies a
foreign lens it seems to allocate an arbitrary id and marks it
'Unknown (8D 54 68 68 24 24 87 02)' in exif. This is what exiv2
reports to the application. This means Darktable has no data which it
can use to process the lens - so it can't look up the lens in lensfun
for correction info.
  
It's a camera manufacturer issue that manifests in exiv2, and is
discussed here: https://dev.exiv2.org/boards/3/topics/2782
and described by an exiv2 developer here:
http://dev.exiv2.org/projects/exiv2/wiki/Lens_Recognition_in_Exiv2_v026_(and_later)/

While you will be able to manually select the lens in Darktable's
Lens Correction module, I unfortunately couldn't, because the lens
wasn't in lensfun. At first I tried to use other lenses' correction
data but this wasn't satisfactory, so I calibrated the lens, which
is now in the latest lensfun db. I still had to manually select
the lens in Darktable of course - which is a real pain if you
process lots of images, because you can't use an automatic preset
to apply Lens Correction.

You will note the exiv2 fix is only available from exiv2 0.26 on.
As of early 2020, I found hardly any Linux distros that had/were
planning to upgrade to that version - OpenSUSE was one, I think
Fedora the other. Therefore there may not be much you can do about
it. You may be able to upgrade exiv2 to 0.26 on your platform and
recompile Darktable to use it, but that will break any graphics
packages that depend on the pre-upgrade exiv2 version - which may
or may not be an issue for you.
   
I also lost the ability to add lens info to final image files. I
scripted a workaround for this using exiftool: it looks for the bad
id ('137' for me) in the LensID exif tag and then writes the Tokina
data into the LensMaker and LensModel tags. This won't help the
Darktable issue, however.
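
For anyone wanting to do something similar, a rough sketch of the idea is
below - it is not the original script, so treat it as untested. The id
value and lens strings are the ones from my setup; adapt them, make sure
exiftool is on the PATH, and try it on copies of your files first.

#!/usr/bin/env python3
"""If exiftool reports the Tokina's raw LensIDNumber (137 on my Nikon),
write readable LensMake/LensModel EXIF tags into the exported image."""
import subprocess
import sys

BAD_ID = "137"
LENS_MAKE = "Tokina"
LENS_MODEL = "AT-X 24-70mm f/2.8 PRO FX"

def fix_lens_tags(path: str) -> None:
    # read just the numeric lens id (-s3 prints the bare value)
    lens_id = subprocess.run(
        ["exiftool", "-s3", "-LensIDNumber", path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    if lens_id == BAD_ID:
        # write the generic EXIF lens description tags in place
        subprocess.run(
            ["exiftool", "-overwrite_original",
             f"-LensMake={LENS_MAKE}", f"-LensModel={LENS_MODEL}", path],
            check=True,
        )

if __name__ == "__main__":
    for p in sys.argv[1:]:
        fix_lens_tags(p)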
  
  Fortunately, I recently replaced my laptop with a workstation on
  which I chose to run OpenSUSE which as default has exiv2 0.27, so
  I can now use the workaround described in the exiv2 link, and the
  problem is solved.
  
  Good luck

On 15/06/2020 21:39, Michael Rasmussen wrote:


  Hi all,

Weird problem in Darktable where lens is not found although there are
lens corrections available in Darktable for the specific lens.

Darktable recognizes the lens as '141'

Hardware:
Nikon D600
Tokina 100mm F2.8 MACRO AT-X M100 PRO D

Software

dpkg -s libimage-exiftool-perl
Package: libimage-exiftool-perl
Status: install ok installed
Priority: optional
Section: perl
Installed-Size: 20932
Maintainer: Debian Perl Group
 Architecture: all
Version: 12.00-1

this is darktable 3.0.2+9~g5ac2260e3
copyright (c) 2009-2020 johannes hanika
darktable-...@lists.darktable.org

compile options:
  bit depth is 64 bit
  normal build
  SSE2 optimized codepath enabled
  OpenMP support enabled
  OpenCL support enabled
  Lua support enabled, API version 5.0.2
  Colord support enabled
  gPhoto2 support enabled
  GraphicsMagick support enabled
  OpenEXR support enabled

Lens info from exiftool
Lens Type   : D
Lens: 100mm f/2.8
Lens ID Number  : 141
Lens ID : Unknown (8D 54 68 68 24 24 87 02)
Lens Spec   : 100mm f/2.8 D

As can be seen Darktable uses 'Lens ID Number'.

Rawtherapee 5.8 correctly identifies the lens.






darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






[darktable-user] Parametric Masks - Lost settings on re-open of image

2020-06-13 Thread Dusenberg

  
  
darktable version 3
  
  Last year I edited a group of raw images and created a focus
  stacked final output of them. One of the edits was to make the
  after-sunset sky pinker. I did this using Color Balance in HSL and
  changing the Highlights hue and saturation. I had to use a
  parametric mask to limit the change to just the original orange
  hues.  I recently viewed the final image and decided the sky
  colour was still not right so I re-opened the set to make the
  changes. 
  
  I created a new version of each retaining their original edits. 
  Going into Color Balance on the background image to modify the
  adjustments I previously made, I found the sliders for hue and
  saturation were non-default but the parametric mask settings were
  all at their defaults.  If I turn the mask off there is a major
  change to the image so Darktable must still know what the original
  mask settings are.  I remember spending quite a while setting that
  mask up but I can't remember what I eventually did - which channel
  I applied it to, and how I adjusted the input or output levels to
  get it right. 
  
Anyone know how I can get darktable to resurrect the mask settings
- or at least how I can find out what I did, so I can tweak those
settings? I really don't want to start over, as I may never get
back to where I was.
  



darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






[darktable-user] Lens Correction issues

2019-12-05 Thread Dusenberg

  
  
Hi
  
  Over last few months I've finally created a style which I can
  apply to all my landscape shots and which produces what I think
  are excellent images which usually need little or no manual
  attention.  A big thanks to all the developers - DT is a great
  piece of software.
  
  However I do have two issues with the Lens Correction module which
  I can't resolve and was hoping someone out there may know the
  answers.  I run DT 2.6.2 on a laptop with Linux Mint 19.1 Cinnamon
  (Ubuntu 18.4).
  
  Issue 1: 
  My Nikon D810 automatically applies lens distortion correction so
  the raw files are already corrected for the lens in use.  I have
  Nikon AF-S lenses so this works consistently and accurately.
  However the camera does not correct for TCA.  For example, my
  24-70 lens generally needs a red correction of 1.00020-1.00040
  while my 70-200 needs 1.00010-1.00030. I have to do this manually
  for every shot which is a pain and 'something else to do'. What I
  want is for DT to apply Lens Correction as part of my style so
  that lens and shot data of the current image is used but then only
  the TCA correction is applied with a preset value of say
  red:1.00030 for the 24-70 lens. I can then amend this if necessary
  during my normal workflow review of each image. I have tried and
  failed to do this.  The problem I find is that if I open LC in
  Darkroom, set it to TCA Only enter a correction and save as a
  preset, when I use that preset whether alone or as part of a
  style, LC no longer looks up the lens and shot data for the
  current image, but applies all the settings present during the
  creation of the preset - which means the wrong lens, f/stop,
  distance. I may be missing something obvious but I've no idea what
  it is.  Does anyone know how to do what I need?
  
  Issue 2:
I have a Nikkor AF-S 18-35mm f/3.5–4.5G ED lens which doesn't
  appear in Lensfun supported lenses list. However for TCA
  correction, the Nikkor AF-S 16-35mm f/4G ED entry in LC works
  satisfactorily. The problem is I can't find a way to get LC to use
  the 16-35mm lens entry instead of the as-shot lens.  Again, any
  ideas?
  
  Thanks


darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






Re: Aw: Re: [darktable-user] Export: $(REVISION) numbering inconsistency

2019-09-03 Thread Dusenberg

  
  
Hajo - thanks. I had no idea such an app existed.
  I'll look into it.
  
  Cheers
  Dusenberg

On 03/09/2019 08:18, HaJo Schatz wrote:

"Loose integration", to me in Linux, could mean a 2nd window which has
buttons such as "check in", "revert", "check out", etc. That window is
already implemented -- it's called "revision control GUI" and is a
standalone app :-) ALT-TAB switches quickly between darktable and the GUI.

--
PGP key: http://tinyurl.com/2016PGPKEY

On Thu, Aug 29, 2019 at 6:08 PM Dusenberg <dusenb...@gmx.co.uk> wrote:

Thanks Micha,

your comments are much appreciated. I agree that the 'loose integration'
you outline would be the best approach.

I'm not sure if I want to add to the complexity of my workflow by
introducing another system at this stage. I think maybe as a temporary
fix I'll manually add the version number on export rather than use
$(VERSION), and then work out a more reliable method for the longer term.

Regards

Dusenberg

On 29/08/2019 10:31, Michael Fritze wrote:

Hi,

but please not a deep integration. E.g. I use a program (CAD) where users
kept asking for revision control integration. I have used mercurial with
it for a long time, not integrated. And then they integrated SVN,
including binaries. :-( Later git was added but nothing configurable.

Better would be to have the most important actions (init, add (or add all
XMPs), commit, maybe push) as shortcuts or in a panel, and for more
complex actions a link to a dedicated tool or shell. For mercurial that
would be TortoiseHg Workbench. And all should be configurable so everyone
could use their favorite system and dt can concentrate on its main task.

Just what I think about that.

BR, Micha.

Sent: Wednesday, 28 August 2019 at 21:16
From: "Dusenberg"
To: darktable-user@lists.darktable.org
Subject: Re: [darktable-user] Export: $(REVISION) numbering inconsistency

Thanks Hajo

My first thought on your suggestion is the revision control system would
need integrating with darktable somehow.

I'll think on this a while.

Dusenberg

On 28/08/2019 05:09, HaJo Schatz wrote:

On Tue, Aug 27, 2019 at 11:44 PM Dusenberg <dusenb...@gmx.co.uk> wrote:

My workflow needs to keep track of which raw version each development
stream belongs to, as well as the sequence in a stream of each exported
file and edit sidecar (so I can revert or branch).

Only loosely related but -- with such requirements, my first thought
would probably be to try & use a (local) revision control system. Lets y

Re: Aw: Re: [darktable-user] Export: $(REVISION) numbering inconsistency

2019-08-29 Thread Dusenberg

  
  
Thanks Micha, 
  
  your comments are much appreciated. I agree that the 'loose
  integration' you outline would be the best approach.
  
  I'm not sure if I want to add to the complexity of my workflow by
  introducing another system at this stage. I think maybe as a
  temporary fix I'll manually add the version number on export
  rather than use $(VERSION), and then work out a more reliable
  method for the longer term.
  
  Regards
  
  Dusenberg
  

On 29/08/2019 10:31, Michael Fritze wrote:

Hi,

but please not a deep integration. E.g. I use a program (CAD) where users
kept asking for revision control integration. I have used mercurial with
it for a long time, not integrated. And then they integrated SVN,
including binaries. :-( Later git was added but nothing configurable.

Better would be to have the most important actions (init, add (or add all
XMPs), commit, maybe push) as shortcuts or in a panel, and for more
complex actions a link to a dedicated tool or shell. For mercurial that
would be TortoiseHg Workbench. And all should be configurable so everyone
could use their favorite system and dt can concentrate on its main task.

Just what I think about that.

BR, Micha.

Sent: Wednesday, 28 August 2019 at 21:16
From: "Dusenberg"
To: darktable-user@lists.darktable.org
Subject: Re: [darktable-user] Export: $(REVISION) numbering inconsistency

Thanks Hajo

My first thought on your suggestion is the revision control system would
need integrating with darktable somehow.

I'll think on this a while.

Dusenberg

On 28/08/2019 05:09, HaJo Schatz wrote:

On Tue, Aug 27, 2019 at 11:44 PM Dusenberg <dusenb...@gmx.co.uk> wrote:

My workflow needs to keep track of which raw version each development
stream belongs to, as well as the sequence in a stream of each exported
file and edit sidecar (so I can revert or branch).

Only loosely related but -- with such requirements, my first thought
would probably be to try & use a (local) revision control system. Lets
you annotate changes, revert, branch and whatnot.

Hajo

  
  

  




darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






Re: [darktable-user] Export: $(REVISION) numbering inconsistency

2019-08-28 Thread Dusenberg

  
  
Thanks Hajo
  
  My first thought on your suggestion is the revision control system
  would need integrating with darktable somehow. 
  
  I'll think on this a while.
  
  Dusenberg
  
  

On 28/08/2019 05:09, HaJo Schatz wrote:

On Tue, Aug 27, 2019 at 11:44 PM Dusenberg <dusenb...@gmx.co.uk> wrote:

My workflow needs to keep track of which raw version each development
stream belongs to, as well as the sequence in a stream of each exported
file and edit sidecar (so I can revert or branch).

Only loosely related but -- with such requirements, my first thought
would probably be to try & use a (local) revision control system. Lets
you annotate changes, revert, branch and whatnot.

Hajo





darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






[darktable-user] Export: $(REVISION) numbering inconsistency

2019-08-27 Thread Dusenberg

  
  

Hi

I'm fairly new to darktable, but have set up a reasonable workflow, and
feel I'm producing good outputs. However I do have a problem with file
management caused by DT version numbering which I hope someone can help
me with, as I've tried the DT manual and googling and found nothing.

Issue Summary
I often need to create a number of intermediate tif files during a
development, and also often have more than one version of a raw as I
experiment/change my mind. My workflow needs to keep track of which raw
version each development stream belongs to, as well as the sequence in a
stream of each exported file and edit sidecar (so I can revert or branch).

On export in DT, I always use the filenaming format
$(FILENAME)_$(VERSION) [+ optional suffix], which in theory should mean
the xmp files have the same base filename as the exported file, and in a
subsequent export the version upon which that is based is part of
$(FILENAME), so trace back is possible. Here is an example of files from
part of one of my workflows:

NIK_0371.NEF - original raw file
.
NIK_0371_04.NEF.XMP - first edit of fourth duplicate (automatic version naming by DT)
NIK_0371_05.NEF.XMP - first edit of fifth duplicate (automatic version naming by DT)
NIK_0371_4.tif - intermediate tif from first edit of fourth duplicate (version naming produced by $(VERSION))
NIK_0371_4.tif.xmp - edit sidecar of the tif from first edit of fourth duplicate
NIK_0371_4_0.tif - intermediate tif from the edit of the first intermediate tif
NIK_0371_4_0.tif.xmp - edit sidecar of the intermediate tif from the edit of the first intermediate tif
NIK_0371_4_0_0-Final.tif
..

The issue - exemplified above - is that the filename given automatically
by DT to an xmp file for a duplicate raw has a 2-digit version number,
but the $(VERSION) variable in the export module filename produces a
1-digit version number (for versions less than ten). This means the
files are not sequenced correctly in File Manager - because '_4' and
'_04' are very different in sort order - and it makes it much more
complicated, time-consuming and error-prone to keep track of the stack
of files associated with each raw development stream. This seems like a
basic coding inconsistency, but I wanted to make sure before reporting
it as a bug.

Does anyone know if there's a way to force $(VERSION) to output 2 digits
for all possible values, so that it matches the version number format
automatically given by DT to xmp files from a duplicate raw?
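
Failing that, as a stop-gap I could repair the mismatch after export with
a small rename pass over the export folder. A sketch of what I have in
mind, assuming the naming pattern above - note that renaming files
already imported into darktable would orphan their database entries, so
it could only be used on files managed outside dt, and on a copy first:

#!/usr/bin/env python3
"""Zero-pad single-digit version fields in exported file names so they
sort next to the two-digit names darktable gives the duplicates' sidecars
(e.g. NIK_0371_4.tif -> NIK_0371_04.tif).  Sketch only: run it on a copy
of the export folder, and only on files not managed inside darktable."""
import re
import sys
from pathlib import Path

# a single digit delimited by '_' on the left and '_', '-' or '.' on the right
FIELD = re.compile(r"_(\d)(?=[_.-])")

def padded(name: str) -> str:
    return FIELD.sub(lambda m: f"_0{m.group(1)}", name)

if __name__ == "__main__":
    # covers the exported .tif files and their .tif.xmp edit sidecars
    for path in Path(sys.argv[1]).glob("*.tif*"):
        new_name = padded(path.name)
        if new_name != path.name:
            path.rename(path.with_name(new_name))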

Thanks 


darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






Re: [darktable-user] Re: [darktable-dev] Re: HDR issue in darktable 2.6.0

2019-02-14 Thread Dusenberg

  
  
Thanks Remco.

That's useful to know about Highlight Reconstruction being always on,
and fits with what I'm seeing.

I'm still at early stages of learning how to use DT so will definitely
explore your comment about using "reconstruct in LCh" - I take a lot of
landscapes.

The custom style I created by profiling the camera using darktable-chart
with an X-Rite ColorChecker target and camera jpeg automatically
disables the standard DT basecurve and applies a tonecurve and color
lookup table. I had hoped it would give sufficiently reliable results,
but it may indeed be clipping too soon like the Sony basecurve as you
indicate - where do I look in DT to find out the point it is clipping at?

Cheers


On 14/02/2019 13:35, Remco Viëtor wrote:


On Thursday 14 February 2019 12:23:03 CET Dusenberg wrote:

  
I have just been puzzling over the 'pink highlight' problem on several raws
I recently shot (not HDR), all of which have blown highlights caused by a 
camera operator silly error :)  I recently created a custom style for my
camera and I thought that was something to do with the problem, so I'm glad
it's a known characteristic for which a solution is available - don't
over-expose!

I also noticed that Highlight Reconstruction was switched on on these raws
and I hadn't done it as I like to start from a blank canvas, so switched it
off. Reading the trail of replies however, it seems the only way to
fix/reduce the pink problem is to use Highlight Reconstruction - so I'm now
wondering if DT automatically switches Highlight Reconstruction on if it
detects blown highlights; does anyone know?

  
  
As far as I know, "highlight reconstruction" is _always_ automatically 
switched on for raw files (and never for jpeg or png files). It just has no 
visible effect if there are no over-exposed areas. The default setting is 
"clip to white", I prefer using "reconstruct in LCh", which usually gives a 
bit more detail in slightly over-exposed areas (clouds are a typical example).

Keep in mind though, that you can get blown areas for two reasons in 
Darktable. First is of course over-exposure, in which case the raw data is 
clipped. 

But the basecurve that is (automatically) applied to raw files can also cause 
blown regions. Darktable picks a curve based on the detected camera (for the 
default case), and e.g. the Sony basecurve clips rather soon (at input values 
of about 90 on a scale 0..100). 

For that reason, I tend to switch to another curve (Canon or Leica, neither of 
which clips the highlights), or use the "filmic" module (which has a learning 
curve). That very often means also a correction in the "exposure" module!

Remco








  


darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org






Re: [darktable-user] Re: [darktable-dev] Re: HDR issue in darktable 2.6.0

2019-02-14 Thread Dusenberg

  
  
I have just been puzzling over the 'pink highlight' problem on several
raws I recently shot (not HDR), all of which have blown highlights
caused by a camera operator silly error :) I recently created a custom
style for my camera and I thought that was something to do with the
problem, so I'm glad it's a known characteristic for which a solution is
available - don't over-expose!

I also noticed that Highlight Reconstruction was switched on on these
raws and I hadn't done it, as I like to start from a blank canvas, so I
switched it off. Reading the trail of replies however, it seems the only
way to fix/reduce the pink problem is to use Highlight Reconstruction -
so I'm now wondering if DT automatically switches Highlight
Reconstruction on if it detects blown highlights; does anyone know?

Thanks
  
On 14/02/2019 06:41, David Vincent-Jones wrote:

A 'pink/magenta' cast is normally an indicator of overexposure (on a
single raw file) ... the red and blue pixels have been flooded on the
sensor. Is it possible that one of your HDR exposures is simply putting
data out-of-normal range?

On 2019-02-13 10:23 p.m., Bruce Williams wrote:

Yes, this is the correct way to use the mailing list! 😀
Sorry if I'm unable to answer your question though.
Cheers,
Bruce Williams.

On Thu., 14 Feb. 2019, 17:20 Mittagskogel Dobratsch wrote:

Hi,

This is my first time using a mailing list so please bear with me if
this is not the correct way to ask a question.

I have had an issue for some time in darktable with HDRs having a
pink/magenta tint. This issue seems to have appeared and been fixed in
the past. For example, user "bva" reports the same problem here:
https://darktable-devel.narkive.com/vBQahOFL/hdr-pink-colors

Pretty much all of the conditions are the same for me; CR2 raw files,
pink tint, heavy mazing.

Thank you,

-mk
  

  

  
  
  

  
  

  darktable user mailing list to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org
  



Re: [darktable-user] Inaccurate color display or color picker?

2019-01-27 Thread Dusenberg



On 26/01/2019 20:03, Normand Fortier wrote:

So far, my understanding is this.

DT works in LAB color space. Most modules work in that space ("The 
local color pickers run in the color space of the individual module, 
which is usually L"; see also 
http://www.darktable.org/usermanual/en/color_management.html). The 
main histogram and the global color picker, at least, display rgb 
values. One would think that those values are converted from LAB 
values, but instead, those modules obtain RGB values after converting 
to the monitor color space.


If I am correct, it means DT modules do not all work on the same 
image: most work in LAB color space but some work on the image after 
conversion to the (RGB) monitor color space. See for example the two 
files appended: hist_mon.png shows the main histogram and the tone 
curve with my monitor profile selected for display, while 
hist_srgb.png shows the same information with srgb profile selected 
for display: the histogram in the tone curve looks identical, while 
the main histogram visibly differs. Note that the tone curve histogram 
is set to display rgb.


To me, this inconsistent behaviour is undesirable: I would expect all 
modules to work on the same underlying image.


The latter is how at least some other programs work.

- Lightroom:
"Lightroom uses a wide gamut RGB space similar to ProPhoto RGB to do 
all the image calculations, and the histogram and RGB percentage 
readouts are based on this native Lightroom RGB space."

http://www.adobepress.com/articles/article.asp?p=1930486

- RawTherapee:
Uses ProPhoto as default working profile 
(https://rawpedia.rawtherapee.com/Color_Management#Working_Profile). 
The UI indicates "If enabled, the working profile is used for 
rendering the main histogram and the Navigator panel, otherwise the 
gamma-corrected output profile is used". If I open my original png 
image with patches, the display of pixel rgb values (in the Navigator 
module) correspond to values written into the patch.


My impression is that it would be possible for modules displaying rgb 
values numerically or graphically to obtain such values through a 
conversion of the LAB values at the appropriate step of the pipeline 
-- this is what the global color picker does when it offers to display 
LAB and RGB values of a given area.


Can anyone provide feedback as to whether the above is correct?

If so, I could write a bug report, although I am not sure of the title 
or what should be requested. Note that there was a discussion around a 
request for rgb curves, that could be relevant:

https://redmine.darktable.org/issues/9559
 


darktable user mailing list
to unsubscribe send a mail to 
darktable-user+unsubscr...@lists.darktable.org




I'm new to darktable, and am finding this mailing list invaluable.

I have the same concerns as described in this thread. My use case is
that I want to match the LAB luminosity of a colorchecker patch in an
unprocessed raw file in DT with the reference luminosity value for the
patch, so I can produce an adequately accurate 'standard' colour profile
for my camera. I'm shooting with studio flood (5k, which is the cc
reference) so I can adjust exposure quite finely by moving lamp
distance. I was intending to use the global colour picker, but now I'm
at a loss, as in my view the camera profile should depend only on the
raw file and the cc references, not the output profile of my laptop
monitor.

Also, I want to use dt to set white balance for a shoot during raw
processing, taken from an image that includes a colorchecker. But now
I'm concerned that this too will be based on my monitor, not the
underlying file values, and so may affect colour in the resulting
tiffs/prints.

Really interested to hear dt dev feedback.

darktable user mailing list
to unsubscribe send a mail to darktable-user+unsubscr...@lists.darktable.org