I'm a little confused here Marcin - are you referring just to the time taken for 
import?  If so, I wonder why this is really a concern. OK, it would be nice for 
it to be faster, or even instantaneous, but import is a one-off operation to load 
the database; the subsequent speed of using the images is much more 
important, I would have thought.

That said, I see your speed problem seems to be solely to do with JPEG import, 
not with raw.  This isn't something I can help with, as I have used dt almost 
exclusively on raws myself and the extremely small number of JPEGs I have 
caused no problems - that said, there have been other comments on the list 
about dt being in some cases much slower on JPEGs than on raws.

With regard to cache size, experimenting to find your own optimum is going to be 
best - but in general computing terms the extra overhead of a large cache is 
usually small compared to the cost of cache misses from too small a cache :-)
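
For what it's worth, the size is just a value in darktable's config file, so it 
is easy to experiment with. The key name below is from my own install and may 
differ in your version, so treat this as a sketch rather than gospel:

    # ~/.config/darktable/darktablerc
    # raise the thumbnail cache from the 53687091-byte default to roughly 10x
    cache_memory=536870912

Edit it while dt is closed (it rewrites its config on exit); the new value is 
picked up on the next start.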

Rgds,
Rob.

-----Original Message-----
From: Marcin Sikora [mailto:[email protected]]
Sent: 17 February 2013 22:54
To: johannes hanika
Cc: Rob Z. Smith; [email protected]
Subject: Re: [Darktable-users] Switching from Darkroom to Light Table is very 
S.L.O.W.

Thank you for the info about cache thumbnail generation. I was wondering how 
it works in DT. How large a cache can be set for DT without hurting its 
performance? In my DT the default cache size is 53687091. If I set it much 
larger, e.g. 100x, will this reduce the performance of DT?


Problem NOT really solved :-( I was happy too early. It was not a problem with 
the system, and it was also not caused by the compilation method.


On the new system I tested too quickly. I had only imported RAWs, without 
JPEGs (DT gives the option of skipping JPEGs during import).


What I have done:

Import only RAWs: 31,000 RAWs.

Import of JPEGs (please notice that these were JPEGs from digital
cameras): 31,000 RAWs + 46,000 JPEGs.

THIS MAKES TROUBLE -> importing the analog picture archive scanned with a Nikon 
COOLSCAN V ED: only 7,000 JPEGs, compared to the digital picture library (77,000 
RAWs + JPEGs).

The times below are measured from starting to closing DT, which is equivalent 
to switching from Light Table to Darkroom.
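
(Measured simply by wrapping the run in the shell's time builtin and quitting 
DT as soon as the light table is up, roughly:)

    time darktable    # start, wait for the light table, quit immediately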


The RAW library (31,000 pictures):

real 0m4.937s
user 0m5.584s
sys  0m1.184s

The digital pictures library, RAWs + JPEGs (77,000 pictures):

real 0m11.040s
user 0m9.581s
sys  0m1.232s

The scanned analog pictures library, JPEGs (7,000 pictures):

real 3m24.549s !!!!!
user 0m56.184s
sys  2m49.059s !!!!


Why is the system time so long? There is plenty of physical memory free. Only 
one processor works at 100% the whole time. Network transfer is not especially 
high.
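
(If anyone wants to check where that kernel time goes, a syscall summary from 
strace should show it - an untested suggestion on my part:)

    # -c: count time per syscall, -f: follow threads; summary goes to stderr
    strace -c -f darktable 2> strace-summary.txt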


So, I imported pictures from the scanned analog pictures archive bit by bit to 
check where the problem is. This is the result.


1150 JPEGs in the library:

real 0m6.589s
user 0m20.493s
sys  0m2.868s

2040 JPEGs in the library:

real 0m4.620s
user 0m10.721s
sys  0m1.748s

2700 JPEGs in the library:

real 0m6.504s
user 0m17.461s
sys  0m2.340s

3950 JPEGs in the library:

real 0m13.017s
user 0m27.554s
sys  0m2.968s

4740 JPEGs in the library:

real 0m18.567s
user 0m28.130s
sys  0m2.984s


>>>>>>> somewhere HERE is the problem <<<<<<<<<<<<< (importing the picture
>>>>>>> set 4740 to 5940 on its own into DT does not cause such a startup delay)


5940 JPEGs in the library:

real 2m39.562s
user 0m53.867s
sys  2m2.444s

7000 JPEGs in the library !!!!!

real 3m24.549s !!!!!
user 0m56.184s
sys  2m49.059s !!!!


Any idea why this delay appears for exactly this set of data?
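
(In shell terms the bisection above was roughly the following; the directory 
names are placeholders, and I rely on darktable importing a folder given on 
the command line:)

    for batch in batch01 batch02 batch03; do
        darktable "$HOME/archive/$batch"   # import the next slice, then quit
        time darktable                     # re-time a plain start/quit cycle
    done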

Here is information about one of the images from the scanned analog pictures 
archive (using the exiv2 command):
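
(The summary below is exiv2's default print output, i.e.:)

    exiv2 2001-04-28_0292.jpg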

File name       : 2001-04-28_0292.jpg
File size       : 8694556 Bytes
MIME type       : image/jpeg
Image size      : 5555 x 3666
Camera make     : Nikon
Camera model    : Nikon COOLSCAN V ED
Image timestamp :
Image number    :
Exposure time   :
Aperture        :
Exposure bias   :
Flash           :
Flash bias      :
Focal length    :
Subject distance:
ISO speed       :
Exposure mode   :
Metering mode   :
Macro mode      :
Image quality   :
Exif Resolution : 5555 x 3666
White balance   :
Thumbnail       : image/jpeg, 6387 Bytes
Copyright       :
Exif comment    :


This problem does not influence DT's usability for RAW picture processing, 
i.e. for digital picture processing. There is something wrong with my archive 
pictures, but I can process them in small portions.


Best regards, and thank you once again for all the replies.



On Fri, Feb 15, 2013 at 11:08 PM, johannes hanika <[email protected]> wrote:
> On Sat, Feb 16, 2013 at 3:00 AM, Rob Z. Smith <[email protected]> wrote:
>>>-----Original Message-----
>>>From: Marcin Sikora [mailto:[email protected]]
>>
>>>Solved :-)
>>
>> Great!
>>
>>>What I have learnt. Please, correct me if I'm wrong.
>>>I believe that when the pictures are imported to the database on the local 
>>>machine, thumbnails are generated. Then there is not much data traffic 
>>>between the original picture data and DT, except when we want to 
>>>develop the picture in Darkroom. Switching back to the Light Table does not 
>>>depend on where the original pictures are stored.
>>
>> Now, having much to learn on dt myself, I am open to correction, but I don't 
>> think it works quite like that.  AFAIK dt doesn't permanently store 
>> thumbnails but generates them 'as required' and keeps them in the 
>> ~/.cache/darktable directory.  I *think* that when you do the initial photo 
>> load it is creating database records and indexing those, rather than creating 
>> thumbnails, so this should be very much a one-off activity.  In subsequent 
>> use, when collections are opened in the light table, thumbnails are required 
>> (and if necessary generated) for the displayed images only.  When you scroll 
>> the light table, different thumbnails are of course then required and 
>> obtained from the .cache directory, or generated if not already there.  This 
>> generally works well, I think, but you get problems if the cache size isn't 
>> large enough to hold thumbnails for all your collection; in that case, when 
>> you scroll the light table, thumbnails are continuously being discarded and 
>> regenerated, and predictably it is slow - hence the general advice to limit 
>> the size of collections you read into the light table or (very much second 
>> best) configure a huge cache area.
>>
>> I hope that helps - and isn't factually wrong :-)
>
> right. also going from dr->lt might re-create that thumbnail again for
> lighttable mode in the correct sizes.
>
>> Rgds,
>> Rob.
>>
> [removed garbage]


