Andrew Lentvorski wrote:
> Christopher Smith wrote:
>> The other way is to make the system measure exactly what you care
>> about. The only real problem with "gaming" of systems is when what
>> you are measuring is a proxy for what you actually want.
> Only if you can know and measure *exactly* what you want. If it is
> even a *little* off, the system can be gamed.
> And you can never measure exactly what you want in a social system,
> because external factors are always going to introduce some uncertainty.
> Humans are pattern-matching machines; to *not* expect them to game the
> system is folly.
Agreed. Sometimes all you care about is the entirely measurable things.
One of the really clever ideas I've seen is that whatever you provide an
incentive for, you should also provide an incentive for the opposing
value. So, if you have a reward for quantity, you also have one for
quality. It's a clever idea, but I think it just results in people
spending more time trying to figure out how to optimally game things. ;-)
--Chris
--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-lpsg