Wolf Faust posted the following in the LPROF sourceforge.net Help forum:

As Hal Engel asked for feedback and bug reports in the "Windows build?" thread, here are some thoughts:

1. Fault tolerance: I haven't used the latest lprof version with the new spline regression code, but I did get some error values from Argyll CMS users using the same regression routines. The extremely low error values reported by ProfileChecker make me a bit sceptical about whether everything really got better. While the new spline regression routine surely brings many advantages, has anybody looked at the fault tolerance of the approximation in practice? That is, how does the new routine behave if the target scan or reference file is slightly faulty for whatever reason: serious dust or scratches in the scan, scanner noise, a reproduction fault of the target, a fault in the measurement? Seeing reports from Argyll CMS users of mean dE < 0.35 on batch-averaged slide film targets, I wonder whether bad noise is being incorporated into the profile. I would recommend looking at what happens if one of the patches in the target scan is bad and how it affects the profile.

2. When Marti (lprof), KWLee (iphoto) and I (ICS) developed our scanner profilers, we shared some test scans in order to compare and test our profilers. I also collected target scans from ~20 scanners to test my profiler with. Now that I can produce targets, I think there are better ways to test profilers. In order to test and compare profilers, I would strongly recommend generating test data that covers the most extreme cases. Let me make a suggestion: I am willing to produce five individually measured 35mm slides, on the new Velvia films with their extreme color gamut. One slide would be a standard IT8 target, and the other four would carry >1000 test patches spread all over the RGB space, also covering tricky areas (highly saturated colors, colors near DMin/DMax, ...).
Let us find 7-8 users of the currently popular scanners (Nikon, Minolta, Epson flatbeds, Imacon, and a drum scanner) and I will send the slides to these users for scanning. This data should be sufficient to test most (though not all) color quality issues of a profiler. I have used this method here for testing a number of issues. If the approximation is smooth, has good fault tolerance, and the 1000 patches show low error values... then I guess one can assume the profiler works very well... but this is not easy to achieve with slide films ;-)

PS: We should find the scanner users within the next 2 weeks, as I will run a Velvia 100 production here and have some free film area left to produce such things...
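The bad-patch experiment suggested in point 1 can be sketched without LPROF itself. Below is a minimal, hypothetical Python illustration: ordinary least squares stands in for the spline regression, the "target" is synthetic (a random linear RGB-to-Lab map plus measurement noise), and dE is plain Euclidean distance in Lab (CIE76). None of this uses LPROF's actual code; it only shows the shape of the test — corrupt one patch in the scan, refit, and compare the reported mean dE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "target": 288 patches with device RGB and reference Lab values.
# A random-ish linear map stands in for real scanner behaviour (assumption).
n = 288
rgb = rng.uniform(0.0, 1.0, size=(n, 3))
true_map = np.array([[100.0,   0.0,   0.0],
                     [ 60.0, -60.0,   0.0],
                     [  0.0,  40.0, -40.0]])
lab_ref = rgb @ true_map.T + rng.normal(0.0, 0.2, size=(n, 3))  # measurement noise

def fit_and_mean_de(rgb, lab):
    """Least-squares fit of RGB -> Lab (stand-in for the spline regression);
    return the mean CIE76 delta E over the training patches."""
    coef, *_ = np.linalg.lstsq(rgb, lab, rcond=None)
    pred = rgb @ coef
    return float(np.mean(np.linalg.norm(pred - lab, axis=1)))

# Clean fit vs. a fit where one patch was ruined in the scan (dust/scratch).
de_clean = fit_and_mean_de(rgb, lab_ref)
rgb_bad = rgb.copy()
rgb_bad[0] = [0.0, 0.0, 0.0]   # patch 0 scanned as black
de_bad = fit_and_mean_de(rgb_bad, lab_ref)

print(f"mean dE, clean scan:    {de_clean:.2f}")
print(f"mean dE, one bad patch: {de_bad:.2f}")
```

A stiff global fit like this one reports a much larger mean dE when a patch is bad, which is the desirable behaviour. The worry raised above is the opposite case: a very flexible regression can absorb the bad patch into the profile and still report a low mean dE, so a small ProfileChecker number on the training patches alone is not proof of a good profile.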
-------------------------------------------------------------
You can view the thread here: http://sourceforge.net/forum/message.php?msg_id=3658592

This struck me as a very good idea and I would like to pursue it. Is anyone here interested in assisting by providing high-quality scans of the custom slides that Wolf is willing to produce for this effort?

_______________________________________________
Lcms-user mailing list
Lcms-user@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/lcms-user