[filmscanners] Re: Why DSLR output looks sharper?

2003-09-11 Thread David J. Littleboy

From: "Paul D. DeRocco" <[EMAIL PROTECTED]>
> From: [EMAIL PROTECTED]
>
> What I want to know is:  which one will make a better 11x14 or
> 12x16" print?

4000 dpi scans of 645 slides look very nice at 12x16. Very. I've never been
able to get a decent 11x14 from 35mm, but people claim it's possible. I
remain dubious. (But, FWIW, IMHO, 35mm ISO 100 slide film edges out 6 MP
DSLR digital.)

> Looking at screen pixels is not the final product, even though it
> does tell us something.

> If you resize the scanned image down to 6MP, then you can do meaningful
> on-screen comparisons. However, there may be actual detail that's lost by
> resizing in this manner (though I doubt there is much).

The game I described was the result of downsampling to various levels and
seeing at what point things looked similar. I'll credit 35mm with about 8MP,
versus the DSLR's 6MP.
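
(For anyone who wants to repeat that game, here is a minimal sketch, assuming
Python with the Pillow library installed; the file names are made up:)

from PIL import Image

def resize_to_megapixels(path, target_mp, out_path):
    # Resize a scan so its total pixel count matches a given megapixel
    # figure, for a fair on-screen comparison against a DSLR file.
    img = Image.open(path)
    w, h = img.size
    scale = (target_mp * 1_000_000 / (w * h)) ** 0.5  # linear scale factor
    new_size = (round(w * scale), round(h * scale))
    # Lanczos resampling keeps edges reasonably clean when downsampling.
    img.resize(new_size, Image.LANCZOS).save(out_path)

# e.g. bring a ~21 MP 4000 dpi 35mm scan down to 6 MP and to 8 MP
resize_to_megapixels("scan_4000dpi.tif", 6, "scan_6mp.tif")
resize_to_megapixels("scan_4000dpi.tif", 8, "scan_8mp.tif")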


> Beyond that, there are other differences between scanned slides and DSLR
> images. I'm of the opinion that digital cameras generally have more accurate
> color, because the chemical processes involved in film involve finicky
> response curves that need to be canceled out by other equally finicky
> curves, while what goes on in a CCD or CMOS sensor is pretty simple and
> linear.


Maybe, but the new Velvia 100F and Astia 100F are pretty neutral in color
rendition. Nice grain, too.

> I am especially interested in comparisons of prints made from the Sigma
> Foveon images versus the new Olympus E-1 DSLR versus scans of film.  Also,
> any other DSLR vs film scan prints.

> The Foveon technology looks interesting, but their current implementation
> doesn't have any diffuser over the sensor, which makes it appear sharper
> than most competitive sensors, but it is prone to aliasing and moire.

Yes, the lack of an _antialiasing filter_ makes the SD-9 rather problematic.
I was looking closely at the SD-9 and 10D versions of one of my favorite
test images, the "house shot", the shot of the brick building in Steve's
Digicams reviews. Straight lines at a slight angle to vertical or
horizontal are an amazing mess in the SD-9 shot. Far worse than I would have
thought. I shoot a lot of urban stuff, and that would really irritate me no
end.
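
(The effect is easy to reproduce synthetically. A rough sketch, assuming
Python with NumPy and SciPy -- not a model of the SD-9's actual sensor --
that subsamples a fine diagonal stripe pattern with and without a low-pass
blur, which is roughly what an antialiasing filter does optically:)

import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic test target: fine stripes at a slight angle, like brick
# courses photographed slightly off-horizontal.
y, x = np.mgrid[0:512, 0:512]
target = 0.5 + 0.5 * np.sin(2 * np.pi * (x + 0.1 * y) / 4.0)

# Naive 4x subsampling, no low-pass filtering -- analogous to a sensor
# with no antialiasing filter: the fine stripes alias into bold false
# patterns (moire).
aliased = target[::4, ::4]

# Blur first, then subsample -- a crude stand-in for an optical
# antialiasing filter: the unresolvable stripes fade to grey instead.
filtered = gaussian_filter(target, sigma=2.0)[::4, ::4]

print("no filter:  ", round(aliased.std(), 3))   # large: strong false detail
print("with filter:", round(filtered.std(), 3))  # near zero: detail correctly removed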

David J. Littleboy
[EMAIL PROTECTED]
Tokyo, Japan


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: Why DSLR output looks sharper?

2003-09-11 Thread Austin Franklin
Berry,

> What I want to know is:  which one will make a better 11x14 or
> 12x16" print?

That depends on what characteristics of an image YOU like.  No one else can
tell you what YOU might think is better (except your wife ;-).

Regards,

Austin


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: Why DSLR output looks sharper?

2003-09-11 Thread Paul D. DeRocco
> From: [EMAIL PROTECTED]
>
> What I want to know is:  which one will make a better 11x14 or
> 12x16" print?
> Looking at screen pixels is not the final product, even though it
> does tell
> us something.

If you resize the scanned image down to 6MP, then you can do meaningful
on-screen comparisons. However, there may be actual detail that's lost by
resizing in this manner (though I doubt there is much).

Beyond that, there are other differences between scanned slides and DSLR
images. I'm of the opinion that digital cameras generally have more accurate
color, because the chemical processes involved in film involve finicky
response curves that need to be canceled out by other equally finicky
curves, while what goes on in a CCD or CMOS sensor is pretty simple and
linear.

> I am especially interested in comparisons of prints made from the Sigma
> Foveon images versus the new Olympus E-1 DSLR versus scans of film.  Also,
> any other DSLR vs film scan prints.

The Foveon technology looks interesting, but their current implementation
doesn't have any diffuser over the sensor, which makes it appear sharper
than most competitive sensors, but it is prone to aliasing and moire.

--

Ciao,   Paul D. DeRocco
Paul    mailto:[EMAIL PROTECTED]


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Re: Why DSLR output looks sharper?

2003-09-11 Thread Berry Ives
on 9/11/03 4:59 PM, David J. Littleboy at [EMAIL PROTECTED] wrote:

>
> "Nagaraj, Ramesh" <[EMAIL PROTECTED]> asks:

> Take a picture using 6MP DSLR at full resolution. Also scan a slide
> using 4000dpi scanner. Open both image files in Adobe and observe at 100%.
>
> Image from DSLR looks to have sharper edges compared to scanner output.
> What is the reason for this?
> Is it because of in-built sharpening of DSLR?
>
> It's because the digital image is sharper, cleaner, and better.
>
> If your 4000 dpi scan is from a sharp slide, sharpen lightly, downsample to
> 2400 dpi, and sharpen lightly again. You should get an image that is almost
> as good as the DSLR image and has a few more pixels as well.
>
> (This works even better if you use the new version of Picture Window Pro's
> Advanced Sharpen function, which includes Neat Image style noise reduction,
> for the "sharpen lightly" steps.)
>
> If your 4000 dpi scan is from a sharp _645_ slide, sharpen lightly,
> downsample to 2800 dpi, sharpen lightly, downsample to 2000 dpi, and sharpen
> lightly again. You'll get an image that is just as good as (if not better
> than) any DSLR image, and has 13MP or so.
>
> David J. Littleboy
> [EMAIL PROTECTED]
> Tokyo, Japan
>

What I want to know is:  which one will make a better 11x14 or 12x16" print?
Looking at screen pixels is not the final product, even though it does tell
us something.

I am especially interested in comparisons of prints made from the Sigma
Foveon images versus the new Olympus E-1 DSLR versus scans of film.  Also,
any other DSLR vs film scan prints.

Berry


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Re: Why DSLR output looks sharper?

2003-09-11 Thread David J. Littleboy

"Nagaraj, Ramesh" <[EMAIL PROTECTED]> asks:
> Take a picture using 6MP DSLR at full resolution. Also scan a slide
> using 4000dpi scanner. Open both image files in Adobe and observe at 100%.
>
> Image from DSLR looks to have sharper edges compared to scanner output.
> What is the reason for this?
> Is it because of in-built sharpening of DSLR?

It's because the digital image is sharper, cleaner, and better.

If your 4000 dpi scan is from a sharp slide, sharpen lightly, downsample to
2400 dpi, and sharpen lightly again. You should get an image that is almost
as good as the DSLR image and has a few more pixels as well.

(This works even better if you use the new version of Picture Window Pro's
Advanced Sharpen function, which includes Neat Image style noise reduction,
for the "sharpen lightly" steps.)

If your 4000 dpi scan is from a sharp _645_ slide, sharpen lightly,
downsample to 2800 dpi, sharpen lightly, downsample to 2000 dpi, and sharpen
lightly again. You'll get an image that is just as good as (if not better
than) any DSLR image, and has 13MP or so.
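
(For those who don't use Picture Window Pro, a minimal sketch of the same
general recipe in Python with Pillow; the unsharp-mask settings are
placeholders for "lightly", not a specific recommendation, and the file
names are made up:)

from PIL import Image, ImageFilter

def light_sharpen(img):
    # A gentle unsharp mask; tune radius/percent to taste.
    return img.filter(ImageFilter.UnsharpMask(radius=1.0, percent=60, threshold=2))

scan = Image.open("slide_4000dpi.tif")   # a sharp 4000 dpi 35mm scan
scan = light_sharpen(scan)

# Downsample as if the scan had been made at 2400 dpi: 2400/4000 = 0.6.
# (For a 645 scan, do this twice: 4000 -> 2800 -> 2000 dpi.)
w, h = scan.size
scan = scan.resize((round(w * 0.6), round(h * 0.6)), Image.LANCZOS)

scan = light_sharpen(scan)
scan.save("slide_2400dpi_equiv.tif")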

David J. Littleboy
[EMAIL PROTECTED]
Tokyo, Japan


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Archiving scans - DVD vs CD

2003-09-11 Thread Mike Brown
I was lucky enough to attended IFA, the Berlin consumer electronics
exhibition, last week & managed to speak to a guy at Verbatim about their
disks and longevity. (They're claiming 100 years on their write-once discs.)
He was very honest in making clear that this was a projected value based on
extended temperature, pressure & humidity storage. He mentioned that
humidity is a particular problem - time to buy sealable storage units &
silica gel maybe (or is silica gel a contaminant???)

Verbatim claim that their "Super AZO" dye makes a big difference (I notice
that some Verbatim CD-R disks in a local store were "Azo" and others "Super
Azo"). He went on to say that DVDs are better than CDs because both top and
bottom surfaces are coated with plastic - reducing the risks from humidity
and atmospheric contaminants.

Interesting conversation but I'd like to see some lab results!


Mike


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: Why DSLR output looks sharper?

2003-09-11 Thread Paul D. DeRocco
> From: Nagaraj, Ramesh
>
>  Take a picture using 6MP DSLR at full resolution. Also scan a slide
>  using 4000dpi scanner. Open both image files in Adobe and
> observe at 100%.
>
>  Image from DSLR looks to have sharper edges compared to scanner output.
>  What is the reason for this?
>  Is it because of in-built sharpening of DSLR?

Partly, and I usually turn it down because I don't like it. But 4000 dpi is
also higher resolution than any digicam, so you should expect softer images
when viewing at one screen pixel per image pixel. On top of that, some slide
scanners have trouble getting good focus from center to edge, because of the
curvature of the slide.
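
(The raw pixel counts make the point; a back-of-the-envelope calculation,
assuming a nominal 24x36 mm frame:)

# Rough pixel-count comparison: a 4000 dpi scan of a 35mm frame vs a 6 MP DSLR.
# Assumes a nominal 24 x 36 mm frame; real scanners crop slightly differently.
width_in, height_in = 36 / 25.4, 24 / 25.4          # frame size in inches
w, h = round(4000 * width_in), round(4000 * height_in)
print(w, "x", h, "=", round(w * h / 1e6, 1), "MP")  # ~5669 x 3780 = ~21.4 MP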

--

Ciao,   Paul D. DeRocco
Paul    mailto:[EMAIL PROTECTED]


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: Why DSLR output looks sharper?

2003-09-11 Thread Austin Franklin
Hi Ramesh,

A two pixel camera will give you a perfectly sharp image.  Sharpness is no
indication of image fidelity (ability to reproduce accurately).  It also
depends on your scanner and your film and a whole lot of other things...

Regards,

Austin


> Hi,
>
>  Take a picture using 6MP DSLR at full resolution. Also scan a slide
>  using 4000dpi scanner. Open both image files in Adobe and
> observe at 100%.
>
>  Image from DSLR looks to have sharper edges compared to scanner output.
>  What is the reason for this?
>  Is it because of in-built sharpening of DSLR?
>
> Thanks
> Ramesh


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Why DSLR output looks sharper?

2003-09-11 Thread Nagaraj, Ramesh
Hi,

 Take a picture using 6MP DSLR at full resolution. Also scan a slide 
 using 4000dpi scanner. Open both image files in Adobe and observe at 100%.

 Image from DSLR looks to have sharper edges compared to scanner output.
 What is the reason for this?
 Is it because of in-built sharpening of DSLR?
 
Thanks
Ramesh


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: 8 bit versus 16

2003-09-11 Thread Austin Franklin
> ...It doesn't apply to computer-generated
> images with gradients, tints, etc., either.
>
> Preston Earle
> [EMAIL PROTECTED]

Can you scan those with a film scanner?

;-)


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Re: 8 bit versus 16

2003-09-11 Thread Preston Earle
"Austin Franklin" <[EMAIL PROTECTED]> noted: "I MUST stress, that
Margulis is specifically talking about COLOR images, NOT B&W, and that
distinction is VERY important."


Yes, color *photographic* images. It doesn't apply to computer-generated
images with gradients, tints, etc., either.

Preston Earle
[EMAIL PROTECTED]



Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: 8 bit versus 16

2003-09-11 Thread Nagaraj, Ramesh
> Preston wrote:

>When Photoshop converts from 16-bit to 8-bit it applies very fine noise
>to try to control subsequent problems. Most scanners don't. I would have
>expected this to make a difference but not to the point that the scanner
>8-bit file would completely suck and the Photoshop 8-bit file would be
>just as good as the 16- bit version. I don't know whether this is all a
>function of Photoshop's superior algorithm or whether the scanner is
>doing something bad. Furthermore, I don't care. One way or another, the
>8-bit scanner file is bad and the 8-bit Photoshop file is good."

This is an interesting point. I too think Adobe may do a better job of
converting to 8-bit than VueScan does. I will follow up on it.

Thanks
Ramesh




Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: 8 bit versus 16

2003-09-11 Thread Austin Franklin
Hi Preston,

Great post, thanks...but again, I MUST stress, that Margulis is specifically
talking about COLOR images, NOT B&W, and that distinction is VERY important.

Regards,

Austin


> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] Behalf Of Preston Earle
> Sent: Thursday, September 11, 2003 1:58 PM
> To: [EMAIL PROTECTED]
> Subject: [filmscanners] Re: 8 bit versus 16
>
>
> Of interest in this discussion:
> http://www.ledet.com/margulis/ACT_postings/ColorCorrection/ACT-8-bit-16-bit.htm
> and
> http://www.ledet.com/margulis/ACT_postings/ColorCorrection/ACT-16-bit-2002.htm
>
> Money quote from Dan Margulis: "The bottom line of all my tests was,
> with one important caveat that I'll get to in a moment, there is no
> 16-bit advantage. I blasted these files with a series of corrections far
> beyond anything real-world; I worked at gammas ranging from 1.0 to 2.5
> and in all four of the standard RGBs, I worked with negs, positives,
> LAB, CMYK, RGB, Hue/Saturation, what have you. While the results weren't
> identical there were scarcely any cases where there would be detectable
> differences and in those one would be as likely to prefer the 8-bit
> version as the 16. So, I have no reservation in saying that there's no
> particular point in retaining files in 16-bit, although it doesn't hurt
> either.
>
> I'll show all these results later, but the surprise was in the files
> that Ric [Cohn] sent, which appeared to show just the sort of damage
> that 8-bit editing is supposed to cause, in an image with a dark rich
> blue gradient, a worst-case scenario in conjunction with the very dark
> original scan, which in itself was an attempt to give an advantage to
> 16-bit editing.
>
> Ric provided both original 8-bit and 16-bit versions of these files.
> Granted that the necessary corrections were very severe, they still
> showed that what he said was true: the 8-bit version banded rather badly
> and the 16-bit did not. I tried several different ways of trying to get
> around the disadvantage and could not do so without excessive effort.
>
> Ric's 8-bit original, however, was generated from the 16-bit scan not by
> Photoshop but rather within his own scanner software. Therefore, I tried
> further tests where I applied the same extreme corrections to the image,
> but this time not to Ric's 8-bit image but rather a direct Photoshop
> conversion of Ric's 16-bit image to 8-bit. Shockingly, this completely
> eliminated the problem. There was no reason to prefer the version
> corrected entirely in 16-bit.
>
> When Photoshop converts from 16-bit to 8-bit it applies very fine noise
> to try to control subsequent problems. Most scanners don't. I would have
> expected this to make a difference but not to the point that the scanner
> 8-bit file would completely suck and the Photoshop 8-bit file would be
> just as good as the 16-bit version. I don't know whether this is all a
> function of Photoshop's superior algorithm or whether the scanner is
> doing something bad. Furthermore, I don't care. One way or another, the
> 8-bit scanner file is bad and the 8-bit Photoshop file is good."
>
> Preston Earle
> [EMAIL PROTECTED]
>
> (Still in Group 3.)
>
>


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Re: 8 bit versus 16

2003-09-11 Thread Alan Eckert
I couldn't resist throwing in my two bits (or eight bits, as the case may
be).  I tried using my SS4000 at 14 bits, or maybe it was 12 bits (it's not
capable of true 16 bits) because I had read that you lose less information
when making color corrections on high-bit files.  However, I found that my
carefully developed scanner-specific ICC profile didn't work on the high-bit
file, and so I had to go through a lot of agonizing effort just to get it
back to what it would have been after an 8-bit scan.  And I didn't entirely
succeed.  So for me, at least, _profiled_ 8-bit scans beat 16-bit scans
every time.

Alan

- Original Message -
From: "Austin Franklin" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, September 10, 2003 10:30 PM
Subject: [filmscanners] RE: 8 bit versus 16


Hi Art,

> ...and that's even concluding that the scanner is really
> capturing the full 16 bit depth, which many do not.

I'm not sure ANY do.  Do you know of a scanner that really has a usable 16
bits of data for each color?  I know a few (and only a very few from what
I've seen) *claim* 16 bits, but that doesn't mean that they actually can
deliver 16 bits.  If they could, their dMax would be 4.8, and I've not heard
that claim.  I believe the best I've seen is 14 bits, or a dMax of 4.2...but
even at that, I'm skeptical that they actually meet that.

Even if they were capable of that, that doesn't mean the bits are always
used, especially for negative film.  Color negative film, say, with a
density range of 3.0, would only be able to use 10 bits out of whatever
range is available, anyway.
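
(The arithmetic behind those figures, as a small Python sketch -- each bit of
usable range is worth about 0.3 of density, since density is a base-10 log
and bits are base-2:)

import math

def dmax_for_bits(bits):
    # n usable bits span a brightness ratio of 2**n; density is the
    # base-10 log of that ratio, so dMax = n * log10(2) ~= n * 0.301.
    return bits * math.log10(2)

def bits_for_density(density):
    return density / math.log10(2)

print(round(dmax_for_bits(16), 2))      # ~4.82 -- what a true 16 bits would imply
print(round(dmax_for_bits(14), 2))      # ~4.21 -- 14 bits
print(round(bits_for_density(3.0), 1))  # ~10.0 bits used by a 3.0-density-range neg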

Regards,

Austin





Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Re: 8 bit versus 16

2003-09-11 Thread Preston Earle
Of interest in this discussion:
http://www.ledet.com/margulis/ACT_postings/ColorCorrection/ACT-8-bit-16-bit.htm
and
http://www.ledet.com/margulis/ACT_postings/ColorCorrection/ACT-16-bit-2002.htm

Money quote from Dan Margulis: "The bottom line of all my tests was,
with one important caveat that I'll get to in a moment, there is no
16-bit advantage. I blasted these files with a series of corrections far
beyond anything real-world; I worked at gammas ranging from 1.0 to 2.5
and in all four of the standard RGBs, I worked with negs, positives,
LAB, CMYK, RGB, Hue/Saturation, what have you. While the results weren't
identical there were scarcely any cases where there would be detectable
differences and in those one would be as likely to prefer the 8-bit
version as the 16. So, I have no reservation in saying that there's no
particular point in retaining files in 16-bit, although it doesn't hurt
either.

I'll show all these results later, but the surprise was in the files
that Ric [Cohn] sent, which appeared to show just the sort of damage
that 8-bit editing is supposed to cause, in an image with a dark rich
blue gradient, a worst-case scenario in conjunction with the very dark
original scan, which in itself was an attempt to give an advantage to
16-bit editing.

Ric provided both original 8-bit and 16-bit versions of these files.
Granted that the necessary corrections were very severe, they still
showed that what he said was true: the 8-bit version banded rather badly
and the 16-bit did not. I tried several different ways of trying to get
around the disadvantage and could not do so without excessive effort.

Ric's 8-bit original, however, was generated from the 16-bit scan not by
Photoshop but rather within his own scanner software. Therefore, I tried
further tests where I applied the same extreme corrections to the image,
but this time not to Ric's 8-bit image but rather a direct Photoshop
conversion of Ric's 16-bit image to 8-bit. Shockingly, this completely
eliminated the problem. There was no reason to prefer the version
corrected entirely in 16-bit.

When Photoshop converts from 16-bit to 8-bit it applies very fine noise
to try to control subsequent problems. Most scanners don't. I would have
expected this to make a difference but not to the point that the scanner
8-bit file would completely suck and the Photoshop 8-bit file would be
just as good as the 16-bit version. I don't know whether this is all a
function of Photoshop's superior algorithm or whether the scanner is
doing something bad. Furthermore, I don't care. One way or another, the
8-bit scanner file is bad and the 8-bit Photoshop file is good."
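
(A minimal sketch of the dithering idea described above, in Python/NumPy:
add fine, sub-level noise before rounding 16-bit values down to 8-bit. This
illustrates the general technique only; it is not a claim about Photoshop's
actual algorithm.)

import numpy as np

rng = np.random.default_rng(0)

def to_8bit_plain(img16):
    # Straight scaling and rounding: a smooth 16-bit gradient collapses
    # into a handful of wide, perfectly flat 8-bit steps (banding).
    return np.clip(np.round(img16 / 257.0), 0, 255).astype(np.uint8)

def to_8bit_dithered(img16):
    # Add sub-level random noise before rounding, so neighbouring pixels
    # land on adjacent 8-bit values in a fine, grain-like pattern
    # instead of hard steps.
    noise = rng.uniform(-0.5, 0.5, size=img16.shape)
    return np.clip(np.round(img16 / 257.0 + noise), 0, 255).astype(np.uint8)

# A dark, shallow gradient on the 16-bit (0-65535) scale -- roughly the
# worst case described above.
grad16 = np.tile(np.linspace(1800, 2800, 2048), (64, 1))
print(np.unique(to_8bit_plain(grad16)))                       # only a few hard levels
print(to_8bit_dithered(grad16).mean(axis=0)[::512].round(2))  # column averages still ramp smoothly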

Preston Earle
[EMAIL PROTECTED]

(Still in Group 3.)



Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] RE: 8 bit versus 16

2003-09-11 Thread LAURIE SOLOMON
Art,
I concur with everything you have said except the last paragraph, which
concerns something I have no knowledge about and no concern with, given that
I am not into gaming.  As I have noted in several of my posts, I see as a
potential positive for hi-bit scanning the fact that it furnishes more raw
data than a low-bit scan and therefore provides potential future flexibility
when creating archive master raw image files as one's final output, which
are to be used at a later date to generate working files for specific
purposes and output devices and products (which may support high-bit files
in the future but do not currently).  Otherwise, for general purposes,
high-bit scans as working files typically offer little added value over an
8-bit file - except in a few rare (if not extreme) cases.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] Behalf Of Arthur Entlich
Sent: Wednesday, September 10, 2003 9:01 PM
To: [EMAIL PROTECTED]
Subject: [filmscanners] Re: 8 bit versus 16


Hi Laurie,

This isn't about minutia, this is about belief systems and religion ;-)

The only real solution to deal with the zealotry would be a carefully
controlled double blind experiment.  Otherwise, we are indeed the blind
leading the blind, because simply, we shall see what we expect to see.

I am willing to accept that, just as with people's ability to hear musical
tones (AKA being "tone deaf" or not), some people are gifted with greater
color perception and better ability to distinguish between colors.  I happen
to have quite good color perception and color memory (I wish my event
memory was as good!); however, I "believe" tests would tend to show that
the vast majority within the bell curve cannot see the difference. I do
believe there is a small advantage to 16-bit files due to data getting
fractionated and pushed around slightly (although there really isn't a
very large step for it to go in either direction). But if a great deal
of multiple manipulation is going to be applied, a color could get
pushed a fraction of a 1/256th step in error. However, in terms of
printing, viewing on screen, etc, I think 16 bit files are of little to
no value, and that's even concluding that the scanner is really
capturing the full 16 bit depth, which many do not.

Regarding the matter of banding in 3D rendering, as was mentioned by
another poster, that problem is due to the use of unnaturally restricted
palettes and limited or no use of dithering, because random dithering in
a moving 3D object shows up as moving noise, and it also slows the
rendering process down.

Art


LAURIE SOLOMON wrote:

> On the face of it, this does seem to be another silly debate.  In his
> responses Austin covered his ass by making a point of saying 16-bit is not
> necessary in most color scans as contrasted to all, which means that any
> exception you bring up will be considered by him as an exception and not
> the rule.  You on the other hand seem to have focused in on the needs of
> your own personal workflow and not a general workflow or needs, and come
> off as having an agenda of convincing others that your workflow is the
> only good and acceptable one for everyone.
>
> If your workflow works for you and is something that requires you to
> employ 16-bit scans as you perceive it, then you have to satisfy yourself
> and should stick to 16-bit scans.  If others think that 8-bit suffices
> for their work then they should use that.  Neither has any need to
> convince the other that what they are doing is justified, much less the
> best and only proper way to accomplish the goals at hand for each.
>
> Sometimes we become fanatical over trivial minutia which is not
> significant to most and cannot be discerned by most even when they
> perform the empirical experiments suggested.  Thus, for them this
> discussion becomes of as much practical relevance to their needs and work
> as knowing the number of angels that can fit on the head of a pin.  If
> there is a key practical significance to doing 16-bit color scans, it is
> to generate as complete a quasi-raw data file as is currently possible
> from a scan for purposes of archiving as a master file off of which
> specific working files will be generated both now and in the future.  By
> doing a 16-bit scan of a color image, you capture as much data as is now
> possible, which may be of potential use in the future as the software and
> hardware changes and improves to accommodate the use of the additional
> data in a 16-bit file.  However, in many cases, the need and usefulness
> of a refined and tonally enhanced, extended 16-bit image file as a
> working file that one is going to produce finished products from is of
> little practical use given today's software and hardware.
>
> If you or anyone thinks they see a difference in the final product when
> producing their work by using 16-bit as opposed to 8-bit scans, then by
> all means they should use 16-bit scans and not worry about what anyone

[filmscanners] Re: 8 bit versus 16

2003-09-11 Thread
Art, I'm actually partially on your side -- I agree that 8 bits is mostly
the limit of what humans can discern.  There are rare cases of large,
"shallow" (not much tonal range) gradients that haven't been dithered,
either artificially or by film grain, that can show banding.  But again,
that's rare.  And in any event, there are NO output solutions I know of
that actually support 16-bit, so whether or not we can discern 16-bit is
pretty much a moot point.

The only time I'm saying 16-bit matters is when you make tonal adjustments,
either gamma/levels/curves or dodging and burning.  These throw away data.
To have 8 bits of significant data when you're done, you have to start with
more than 8 bits.

And if you're adjusting things in the scanner, the scanner's bit depth
matters.  One of the reasons 1990-and-earlier drum scanners do such a horrid
job on color neg is that they're only 8 bits internally, spread over a 4.0
dynamic range.  The much smaller range of color neg leaves you only 5-6 bits
of data -- generally an ugly mess.
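
(A quick way to see the "throw away data" point numerically -- a sketch in
Python/NumPy; the gamma value is arbitrary:)

import numpy as np

def gamma_adjust_to_8bit(values, in_max, gamma=1.8):
    # Normalise, apply a gamma curve, and round back onto 8-bit levels.
    return np.unique(np.round(255.0 * (values / in_max) ** (1.0 / gamma)).astype(np.uint8))

levels8 = np.arange(256, dtype=np.float64)      # an 8-bit working file
levels16 = np.arange(65536, dtype=np.float64)   # the same image kept in 16-bit

# Adjusting in 8-bit: rounding merges levels in the highlights and skips
# levels in the shadows, so fewer than 256 distinct values survive (the
# classic comb-shaped histogram).
print(len(gamma_adjust_to_8bit(levels8, 255.0)))
# Adjusting in 16-bit and only then converting to 8-bit keeps all 256.
print(len(gamma_adjust_to_8bit(levels16, 65535.0)))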


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body


[filmscanners] Re: 8 bit versus 16

2003-09-11 Thread Arthur Entlich
OK, I was covering my butt, in case "someone" (a-hmmm) knew of some I
didn't... I don't deal in the $150,000 scanner market so maybe there are
some scanners that can accurately capture a full 16 bit depth.

Several scanner companies will throw around dMax numbers based upon the
mathematical "possibilities" rather than anything approaching reality.
What is the new Minolta 5400 claiming?

Art


Austin Franklin wrote:
> Hi Art,
>
>
>>...and that's even concluding that the scanner is really
>>capturing the full 16 bit depth, which many do not.
>
>
> I'm not sure ANY do.  Do you know of a scanner that really has a usable 16
> bits of data for each color?  I know a few (and only a very few from what
> I've seen) *claim* 16 bits, but that doesn't mean that they actually can
> deliver 16 bits.  If they could, their dMax would be 4.8, and I've not heard
> that claim.  I believe the best I've seen is 14 bits, or a dMax of 4.2...but
> even at that, I'm skeptical that they actually meet that.
>
> Even if they were capable of that, that doesn't mean the bits are always
> used, especially for negative film.  Color negative film, say, with a
> density range of 3.0, would only be able to use 10 bits out of whatever
> range is available, anyway.
>
> Regards,
>
> Austin
>
>


Unsubscribe by mail to [EMAIL PROTECTED], with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or body