On Wednesday 16 May 2007, Sander van Vugt wrote:
> Hi List,
>
> Just proctored the 101 and 102 last weekend for 13 guys in the
> Netherlands. Amongst them was a true Linux expert, who commented "I'm
> sure I've failed". His - IMHO correct - remark: why should I know the
> difference between blah -d and blah -D if in the real world I would
> do blah --help to find out how it works and use the command
> appropriately within 30 seconds? I couldn't agree with him more,
> especially after listening to some examples of the questions that
> these people have had. Please allow me to elaborate a little.
This topic keeps coming up, and I hear such comments quite regularly. I believe they are usually founded in ignorance. But first, some background: like you, I am an experienced Linux person, I passed LPIC-1 some years ago, and I am a professional trainer. I've held LPI exam development workshops and seen first-hand what it takes to come up with items for the exam. I also worked on the Ubuntu exam (more on this later), and my 9-5 job these days is delivering the Red Hat track. I'm not an RHCX (yet), but that is coming soon; meanwhile my colleague who is RHCX is grooming me in on it, telling me what he can without breaching confidentiality. So I feel I am in a qualified position to comment on what you say. Here goes.

Exactly how many items did that person get on their exam that followed that pattern? Perhaps 2 or 3, in my experience. Sometimes Linux admins see these 2 or 3 questions, latch onto them as being faulty, and exaggerate the impact in their own minds. The fact is (and this is answered in LPI's FAQ), such items are statistically valuable when used correctly. A 65-item exam with 35 items asking "what does such-and-such option do?" is of course stupid. An item asking for wildly obscure options to ls or mdadm is equally stupid, as no-one remembers those. But giving this screenshot:

  $ ls -la /usr/src/
  -rw-r--r-- 1 root root 56063 Mar 31 00:40 .config
  -rw-r--r-- 1 root root     0 Mar 11  2005 .keep
  lrwxrwxrwx 1 root root    33 May 13 19:23 linux -> linux-2.6.20/

and asking essentially what the difference is between:

  ls -al /usr/src/linux
  ls -al /usr/src/linux/
  ls -ald /usr/src/linux
  ls -ald /usr/src/linux/

is a valid question. Why? Not because it tests parrot ability to rattle off man pages, but because it tests whether the candidate knows about symlinks and the difference between dereferencing them or not. As an experienced trainer, you would expect a certified admin to be familiar with how to make symlinks work correctly, right?
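To see the distinction for yourself, here is a minimal sketch you can paste into a shell (the paths are made up for the demo, not taken from any exam item):

```shell
#!/bin/sh
# Demo of symlink dereferencing with ls. All paths here are
# hypothetical, created just for the demonstration.
demo=$(mktemp -d)
mkdir "$demo/linux-2.6.20"
ln -s linux-2.6.20 "$demo/linux"

ls -ald "$demo/linux"    # the symlink itself: mode field starts with 'l'
ls -ald "$demo/linux/"   # trailing slash dereferences: the target dir, 'd'
ls -al  "$demo/linux"    # with -l but no trailing slash: still the link line
ls -al  "$demo/linux/"   # the *contents* of the target directory

rm -rf "$demo"
```

A candidate who understands dereferencing can predict each result without opening a man page, which is exactly what the item is probing.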
If you are still not convinced, just write a simple script using them (symlinks are an LPIC-1 objective, so this is valid in context), get the options wrong, and observe the results.

> "hands on" certifications seem to make so much more sense these days.
> Take the Novell CLP/CLE or the Red Hat RHCE exams, these measure real
> working knowledge of real working systems. I know, this isn't
> possible because we (LPI) want to be able to take exams everywhere,
> even if no infrastructure is available, and do it for a reasonable
> price as well. So that's a dead end.

OK, I need to debunk this fallacy about the Red Hat exams. The Red Hat marketing department likes to assert that their exam tests "real working knowledge of real working systems". That doesn't mean it does, and IMNSHO it does not. The RH exam is as far removed from real life as the LPI exam is. A true real-life exam would take about two weeks and would involve me throwing the candidate into a data centre with real machines, which is what we do in real life. You get only 5.5 hours to test the entire RHEL track practically, so the exam items are necessarily far less complex than real life, using ideal configurations on virgin installs. Deep practical knowledge cannot be tested, as the results would not be statistically valid - you do know that the Red Hat marking scripts give a binary result, right? The full thing either works or it doesn't - so with a deep, realistic test the candidate could get almost everything right except one last little thing that holds the solution back. This happens all the time in real life; the solution there is a support call, or Google, or maybe something else you can't do in the exam. See now how the exam is already a significant departure from "real life"? I passed my RHCE first time after not using RH for 18 months, with one week of preparation.
A non-trivial amount of my exam was an excellent test of my ability to rapidly extract information from man pages and to get the result under time constraints. Which is a little different from what the marketing brochures claim :-)

None of this means I have a dim view of RH's or LPI's exams. Both are well put together, developed by techies just like us who work really hard to make sure the training and exams are worthwhile, and both succeed in what they are aiming for. But those aims are different. According to LPI's FAQ, LPI tests in the "cognitive domain". Red Hat (my observation here) tests "can this candidate make RHEL do what we designed it to do, in the sequence we designed it to be done, as laid out in our training materials?"

> Any alternative that would work? Well, maybe there is. Has anyone of
> you ever taken a Microsoft test? In their more advanced tests, they
> have scenarios and try to measure real-world knowledge. For example:
> let's say we want to make a question in which we want to measure
> knowledge of the tar command. You can go two directions:
>
> 1) A user wants to make a compressed backup of his home directory.
> What command would he use?
>
> a. tar -zfvx .
> b. tar -cz .
> c. tar -czf blah .

I would expect an LPIC-1 to know those options intimately well, because it is impossible to use tar/gzip correctly without knowing them. It's the same order of importance as knowing exactly what the middle pedal makes the car do on your driving test. If you don't know those facts and need to refer to documentation to answer the question, you cannot be considered knowledgeable enough to pass and get a certificate. By the same token, I would not consider that the same candidate *has* to know that "remove existing directories before extracting directories of the same name" requires 'tar --recursive-unlink', as that is patently ridiculous.
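To put the point on record, the options in question take only a few lines to exercise (the archive name and paths below are invented for the sketch):

```shell
#!/bin/sh
# Sketch of the core tar options an LPIC-1 candidate should know cold.
# All file names here are hypothetical.
src=$(mktemp -d)
echo "some data" > "$src/file.txt"

# -c create, -z compress with gzip, -f name the archive file;
# the trailing operand says what to pack (here: everything in $src).
tar -czf /tmp/demo-backup.tar.gz -C "$src" .

# -t lists the archive, so we can verify it without extracting.
tar -tzf /tmp/demo-backup.tar.gz

rm -rf "$src" /tmp/demo-backup.tar.gz
```

Get any one of -c, -z, or -f wrong and you have a different command entirely, which is exactly why they are fair game on a cognitive test.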
> Another way is by making it a tiny little scenario in which we don't
> measure knowledge of options, but the ability to use the command (I
> know, the question is lame, it's the idea that counts):
>
> 2) A user has problems making an archive of his home directory. Every
> time he tries to do so, he gets an error message: "Cowardly refusing
> to create an empty archive". Which is the most likely cause of this
> error?
>
> a. he tries to make a tar of a directory that is empty
> b. he has forgotten the -f option to specify the file he wants to
> make the backup to
> c. he has forgotten a specification like . at the end of the command
> to indicate what exactly should be backed up

Good item :-) Damn, I wish I'd thought of that when writing items a while back. I know you don't have a specific beef with tar questions on the exam, and that these examples are illustrating a point you want to make. I want to illustrate a point too: items asking for a specific datum are perfectly valid, as they are a cognitive test. It can be overdone, but I don't believe LPI overdoes it. In fact, the how-to I had to read before writing LPI exam items covered this very topic and made suggestions on how to write a worthwhile item. The general idea of what you say above was one of those suggestions (amongst others).

> I know, asking questions in this way makes LPI 1 easier compared to
> what it is now. This adds to the thought that Matt has had recently,
> that maybe the level should be somewhat lower. If we *really* want to
> make LPIC-1 a junior level admin certification, we shouldn't ask
> about options no one ever uses, we should ask about things that are
> used in real life. Small scenario questions are so much more real.

How much lower do you think LPIC-1 should go? If you want a lower cert, there's always Linux+ .....
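Incidentally, the error message in scenario (2) above takes seconds to reproduce with GNU tar, and doing so also shows why option (a) is only a distractor - tar archives an empty directory without complaint. A quick sketch (file names invented):

```shell
#!/bin/sh
# Reproducing the "Cowardly refusing" scenario with GNU tar.
# File names are hypothetical.
cd "$(mktemp -d)"

# No paths given at all: this is what triggers the quoted message.
tar -czf backup.tar.gz 2>&1 || true
# prints: tar: Cowardly refusing to create an empty archive

# An empty directory, by contrast, archives just fine:
mkdir emptydir
tar -czf backup.tar.gz emptydir && echo "empty directory archived OK"
```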
A developer colleague and I came to a shocking conclusion last week, after looking around at people in the industry who don't have a clue: "What we do IS rocket science, it IS hard, and monkey-see monkey-do just doesn't work for us." (There is an OS out there that tries it that way, but it ain't Linux.) Our exams (industry-wide) should reflect this fundamental.

> To finish this, a fact that shocked me. Just to do a self-test, I did
> some of those preparation exams that you can find on the internet.
> I'm not new to Linux, I've been using it since 1993, I hold all
> relevant Linux certifications and passed LPIC-1 a long time ago. Also
> I'm a trainer, preparing junior admins for their jobs as Linux admins
> at least a week a month, and have been doing that for more than seven
> years now. Guess what my results were? Yes, I failed. Either I must
> be a very stupid person / drank too much beer the evening before, or
> something really is wrong.

Nah, all that happened is that the test exams you took suck. I check those things myself every now and again, and almost every one I've found makes so many basic testing errors it isn't funny. The one at linux-praxis.de is pretty good though, but was never psychometrically validated like LPI's exams are. So all is not lost, and you don't need to give up on the beer.

--
Alan McKinnon
alan at gmail dot com

_______________________________________________
lpi-examdev mailing list
[email protected]
http://list.lpi.org/cgi-bin/mailman/listinfo/lpi-examdev
