I don't think there is a way the operating system can detect how long a particular process is going to last. Not even the compiler can do this.
This reminds me of Turing's proof that there is no general way to compute whether a program will terminate at some point or not. Just my two cents.

Luis.

On Sun, Apr 18, 2010 at 9:35 AM, Aaron Lewis <[email protected]> wrote:

> Hi,
> I'm reading Operating System Concepts (7th Edition), written by
> Abraham, Peter & Greg.
>
> In chapter 5.3, it talks about a scheduling algorithm: SJF.
> SJF means the shortest job is scheduled first.
>
> To compare different processes, they use the process running time.
>
> e.g.
> P1 takes 6 secs to run
> P2 takes 3 secs
> P3 takes 10 secs
>
> Then we should put those tasks in an array like this:
> P2 => P1 => P3
>
> That looks quite reasonable, but my question is: how does an OS know
> that a process will take a longer time to finish its life?
> I think it's impossible for the OS to know exactly how long a process
> will take to run.
>
> So far, in my experience, I think there are a few ways to compare
> process running times:
>
> Forgive me if I have poor experience with OSes ;-)
>
> I) Number of loops in a program, which can be detected by the compiler.
> As long as you have any loops, you are slower than any straight-ahead
> program.
>
> II) Length of the program; longer code sometimes takes longer to run,
> but this is not a good measure.
>
> Anyone want to share some experience with me?
>
> Be very glad to hear your voice ;-)
>
> --
> Best Regards,
> Aaron Lewis - PGP: 0x4A6D32A0
> FingerPrint EA63 26B2 6C52 72EA A4A5 EB6B BDFE 35B0 4A6D 32A0
> irc: A4r0n on freenode
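For what it's worth, the ordering in the quoted example is easy to sketch in code. This is only an illustration of how SJF ranks jobs and what it buys you in waiting time; the burst times are hard-coded, which is exactly the assumption the question challenges (a real OS can only estimate them, it cannot know them in advance):

```python
# Sketch of SJF (shortest-job-first) ordering for the quoted example.
# ASSUMPTION: burst times are known up front -- a real scheduler
# can only predict them from past behaviour.

def sjf_order(bursts):
    """Return process names sorted by burst time, shortest first."""
    return sorted(bursts, key=bursts.get)

def average_waiting_time(bursts, order):
    """Each process waits for the sum of all bursts scheduled before it."""
    total_waiting, elapsed = 0, 0
    for name in order:
        total_waiting += elapsed
        elapsed += bursts[name]
    return total_waiting / len(order)

bursts = {"P1": 6, "P2": 3, "P3": 10}
order = sjf_order(bursts)
print(order)                                # ['P2', 'P1', 'P3']
print(average_waiting_time(bursts, order))  # (0 + 3 + 9) / 3 = 4.0
```

Running the jobs in the book's P2 => P1 => P3 order gives an average wait of 4 seconds, versus about 7.3 seconds for P1 => P2 => P3, which is why the textbook calls SJF optimal for average waiting time.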

