On Tuesday 21 Jul 2009 16:54:01 Martin wrote:
> My thought is that we must have a semantic shift so that what is
> usefully utilised is rewarded, and not just *time spent* (perhaps
> busily, uselessly spinning wheels) on whatever hardware.
The GPU and CPU apps don't necessarily perform the same number of floating-point operations. If someone optimizes one of the two apps so that it does the same work with (slightly?) fewer calculations, and you grant credit per FLOP, then the GPU and CPU apps get different credit for completing the exact same task (same input, same output). If that happens, credit isn't really reflecting "work done", in my opinion.

By the way, credit is already defined to be proportional to FLOPs: "1/100th day of CPU time on a computer that does both 1000 double-precision MIPS and 1000 integer MIPS." In other words, "a 1 GigaFLOP machine, running full time, produces 100 units of credit in 1 day."
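To make the arithmetic concrete, here is a minimal sketch of per-FLOP crediting under that definition (this is not BOINC's actual code; the constant and function names, and the FLOP counts, are hypothetical). It shows how two apps that complete the same task with different operation counts would claim different credit:

#include <cstdio>

// Under the quoted definition, a 1 GFLOPS machine running for one
// day (86400 s) earns 100 credits, so:
//   credit = fpops / (1e9 * 86400) * 100
// Names below are illustrative, not BOINC's actual identifiers.
const double GFLOP = 1e9;
const double SECONDS_PER_DAY = 86400.0;
const double CREDIT_PER_GFLOP_DAY = 100.0;

double credit_for_fpops(double fpops) {
    return fpops / (GFLOP * SECONDS_PER_DAY) * CREDIT_PER_GFLOP_DAY;
}

int main() {
    // Hypothetical numbers: suppose the CPU app needs 1e14
    // floating-point operations for a task, and an optimized GPU app
    // finishes the same task (same input, same output) with 10%
    // fewer operations.
    double cpu_fpops = 1.0e14;
    double gpu_fpops = 0.9e14;

    // Per-FLOP crediting pays them differently for identical work:
    printf("CPU app claims %.1f credits\n", credit_for_fpops(cpu_fpops)); // 115.7
    printf("GPU app claims %.1f credits\n", credit_for_fpops(gpu_fpops)); // 104.2
    return 0;
}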
