From: Stephen Holcomb <[EMAIL PROTECTED]>

: I already suggested this (having a practical component complement the
: written test) and it was squashed by the LPI psychometrician (forgot the
: name...sorry), who indicated that written tests (if implemented properly) can
: be better measurements of a candidate's competency than a practical
: implementation exam combined with a written one. I know that posture seems to
: fly in the face of field pragmatics/logic, not to mention the success of
: Cisco's CCIE and Red Hat's RHCE programs (I took the latter, and it was
: quite challenging), but nevertheless, that's where that debate ended. It
: seems that the more legitimate roadblock would be one of expense, logistics,
: and institutional inclination. These latter factors, especially the last,
: probably make a written format more likely, and thus the appropriate format,
: for the LPI and CompTIA Linux+ initiatives.
:
: Stephen Holcomb
: Sr. Managing Partner - Technology Education Integration and Consulting
: Next Generation Education Services (NEXES)
: [EMAIL PROTECTED]

I think that LPI's stance is that neither format inherently makes a test
better or worse, although you can do a better or worse job with either;
perhaps it's easier to make a bad paper/multiple-choice exam. But one bad
example doesn't condemn all instances: Microsoft's much-maligned suite of
OSes doesn't refute Linux or BSD.

I'm not sure what "field pragmatics/logic" are... I cannot find any study in
the testing literature that directly compares the two formats. This is a bit
surprising to me, but I suppose that if you have a paper format, you don't
have a "hands-on" format to compare it against, and if you've created a
hands-on format, you're loath to compare it to paper (or you're a trainer,
and quantified, scientific comparisons don't occur to you). If you have, or
know of, a paper I've missed, I'm eager to read it.

Your point about the economics of the formats is well taken. If there is no
evidence (beyond some people's common sense) that one format is better, one
wonders how to justify the much, much more expensive one.

But this is all tangential to the thread: what is "live" testing, and how
would one construct a measure of it? Merely administering the exam on
physical equipment is not "live", any more than laboratory psychological
experiments accurately reflect "real life" (they generally do not, in case
you haven't gone to grad school in psych).

I may not understand "live", but I fail to see how we could measure "live"
performance in any exam. Still, I await the return of the original poster
with an open mind.

The MAIN failing of any exam, in any format, is that it takes only a tiny
sample of behaviors. Imagine: you observe someone's behavior (answering
questions) for, say, 120 minutes, and then you try to predict whether they
can unfuck your DNS at 2 AM a couple of years from now! In that context --
given that they do robustly predict behavior at a low level -- simple
multiple-choice exams are amazingly prophetic devices. If anyone found a way
to predict stocks this well, I'm sure they would become the richest person in
the world within a year (thankfully, people's performance is far less
stochastic than stocks').
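To make the point concrete, here is a toy true-score simulation (my
illustration, not anything LPI has done; all numbers are made up): each
candidate has a latent competence, the exam averages 60 noisy item
responses, and the "2 AM DNS" criterion years later is one more noisy
readout of the same latent trait. Even that small behavior sample
correlates strongly with the later outcome.

```python
import random

random.seed(0)

def simulate(n_candidates=2000, n_items=60, noise=1.0):
    """Return (exam scores, later on-the-job criterion) for simulated candidates."""
    scores, criterion = [], []
    for _ in range(n_candidates):
        theta = random.gauss(0, 1)  # latent competence
        # exam score: mean of n_items noisy observations of theta
        exam = sum(random.gauss(theta, noise) for _ in range(n_items)) / n_items
        later = random.gauss(theta, noise)  # single noisy on-the-job outcome
        scores.append(exam)
        criterion.append(later)
    return scores, criterion

def corr(x, y):
    """Pearson correlation, computed from scratch to stay dependency-free."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

x, y = simulate()
print(f"exam/criterion correlation: {corr(x, y):.2f}")
```

Under this model the correlation lands around 0.7, and it is capped by the
noise in the criterion itself, not by the exam: the 60-item sample measures
the latent trait quite precisely.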

And once you adopt a sampling perspective, the larger number of items on a
"paper" test could easily outweigh their (presumably) lower individual
reliability/precision and yield a much better, more thorough test than a
"hands-on" exam, at a much lower cost. (But note that sampling seems pretty
"artificial" and not at all "live".)
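The classical way to quantify this trade-off is the Spearman-Brown prophecy
formula from test theory (the post doesn't cite it, and the item
reliabilities below are made-up numbers, but the formula itself is
standard): lengthening a test by a factor k raises its reliability even
when each individual item is noisy.

```python
def spearman_brown(r: float, k: float) -> float:
    """Predicted reliability of a test lengthened k-fold, given unit reliability r."""
    return k * r / (1 + (k - 1) * r)

# Hypothetical comparison: a hands-on exam of 5 fairly reliable tasks
# versus a paper exam of 60 cheap, individually much noisier items.
hands_on = spearman_brown(r=0.50, k=5)   # 5 tasks, per-task reliability 0.50
paper = spearman_brown(r=0.15, k=60)     # 60 items, per-item reliability 0.15

print(f"hands-on: {hands_on:.2f}")  # -> 0.83
print(f"paper:    {paper:.2f}")     # -> 0.91
```

With these (invented) numbers the 60-item paper test ends up more reliable
overall, which is exactly the "more items can outweigh lower per-item
precision" argument.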

I think the only way to do truly "live" testing would be to hire everyone
who wants to be a Linux sysadmin and see who succeeds and who fails.
Microsoft jokes aside, no company is going to do this. It is essentially
no testing at all.

--
Alan D. Mead, Ph.D. / [EMAIL PROTECTED]
+217-344-2698 / fax: +217-344-9066
Nobody made a greater mistake than he who
did nothing because he could only do a little.
--Edmund Burke / Linux + LPI = world domination



--
This message was sent from the lpi-examdev mailing list.
Send `unsubscribe lpi-examdev' in the subject to [EMAIL PROTECTED] 
to leave the list.

Reply via email to