In a message dated 11/8/2007 12:55:16 P.M. Central Standard Time,  
[EMAIL PROTECTED] writes:

tried to put the time for computer events into perspective.


A 100-MIPS processor can execute 100 million "average" instructions per second, so one "average" instruction takes one hundred-millionth of a second (ten to the minus 8 power seconds). A Direct Access Storage Device (DASD) read of a 4K block, if the data is not in the DASD subsystem's cache, would take at least one millisecond, which is ten to the minus 3 power seconds. The difference is a factor of ten to the fifth power. If you equate one instruction with one second, then one I/O is 100,000 seconds, or a little more than one day.
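The scaling argument above can be checked with a few lines of arithmetic. This is a minimal Python sketch (the 100-MIPS rate and 1 ms DASD latency are the figures from the post; the variable names are illustrative):

```python
# One "average" instruction on a 100-MIPS processor vs. an uncached
# 4K DASD read, rescaled so that one instruction time = one second.
mips = 100_000_000            # 100 million instructions per second
instr_time = 1 / mips         # 1e-8 s per instruction
io_time = 1e-3                # at least 1 ms for the uncached read

ratio = io_time / instr_time  # instruction times per I/O: 1e5
days = ratio / 86_400         # rescaled: seconds -> days

print(f"One I/O = {ratio:,.0f} instruction times")
print(f"Rescaled, that is {days:.2f} days")
```

Running it confirms the factor of ten to the fifth power, and 100,000 seconds is indeed just over one day (about 1.16 days).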
 
Bill  Fairchild
Franklin, TN




----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html
