Bruce Rubenstein wrote:
Moore's Law depends on shrinking the geometries of the lines, traces, and devices in ICs, and there are limits to this. Digital camera sensors are approaching the point where it gets hard to shrink things further. It should also be noted that Moore's Law applies to CPUs; it's much harder to improve the performance of a whole system than of a single part.
I also think that the original comment was made tongue in cheek.


BR

[EMAIL PROTECTED] wrote:

Now there is a brave prediction. Absolutely can't happen! Wilbur and Orville's critics probably said the same thing. Give me one good reason why digital cameras won't technologically advance at the same rate as other electronics?





I made a post regarding this last month. I recall saying that we're not going to see orders-of-magnitude megapixel improvements, as we do with computer performance. I did the math on a 35mm sensor.


Basically, you stop getting useful information once the size of an imaging site approaches the wavelength of light (a lambda).
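The "math on a 35mm sensor" isn't shown in the thread, but the argument can be sketched with a back-of-the-envelope calculation. Assuming a full-frame 36 x 24 mm sensor and taking ~0.5 um (green light) as the floor on useful pixel pitch, you get a hard ceiling on pixel count; the specific numbers here are illustrative assumptions, not figures from the original post.

```python
# Rough upper bound on pixel count for a 35 mm (full-frame) sensor,
# assuming pixel pitch cannot usefully shrink below the wavelength
# of green light (~0.5 um). Illustrative numbers only.

SENSOR_W_MM = 36.0   # full-frame sensor width
SENSOR_H_MM = 24.0   # full-frame sensor height
LAMBDA_UM = 0.5      # approximate wavelength of green light

pitch_mm = LAMBDA_UM / 1000.0          # pixel pitch at the lambda limit
pixels_w = SENSOR_W_MM / pitch_mm      # pixels across the width
pixels_h = SENSOR_H_MM / pitch_mm      # pixels across the height
total_mp = pixels_w * pixels_h / 1e6   # total pixel count, in megapixels

print(f"{pixels_w:.0f} x {pixels_h:.0f} ~= {total_mp:.0f} MP")
# roughly 72000 x 48000 ~= 3456 MP, i.e. a ~3.5 gigapixel ceiling
```

A ceiling in the low gigapixels is finite but still far above today's sensors, so the stronger point is the one in the thread: improvement can continue for a while, but not at a Moore's-Law doubling rate indefinitely.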

-R

