You mean like verified experience and CE (Continuing Education) credits,
like some certification programs and licenses?

Again, there is a man-hours cost to review, even if only a subset is done at random.
Suggest a plan and it will be looked at. :)

- bjs


--
Sent from my Essential PH-1, please excuse any typos
Bryan J Smith - http://linkedin.com/in/bjsmith



On Fri, Apr 19, 2019, 14:38 BHL <[email protected]> wrote:

> What about the number of hours you worked in Linux [to be checked by an
> LPIC-2 or LPIC-3] (like IIBA/PMI certs)?
>
>
> Sent with ProtonMail <https://protonmail.com> Secure Email.
>
> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
> On Friday, April 19, 2019 5:01 AM, Bryan Smith <[email protected]> wrote:
>
> You mean like articles on how psychometrics and other things are used in
> LPI's approach, followed by blog articles and testimonies? :)
>
> Let's face it, LPI has all that information, and more, out there. But LPI
> does not have marketing dollars. LPI relies heavily on word-of-mouth.
>
> This too has been beaten to a pulp over decades. :)
>
> - bjs
>
>
> --
> Sent from my Essential PH-1, please excuse any typos
> Bryan J Smith - http://linkedin.com/in/bjsmith
>
>
>
> On Fri, Apr 19, 2019, 04:55 Stephan Wenderlich <[email protected]>
> wrote:
>
>> Instead of discussing this topic again and again, LPI should do its
>> homework and take care of producing a serious cert guide that is accurate
>> and well designed.
>>
>> On 19.04.19 11:33, Alan McKinnon wrote:
>> > On 2019/04/19 10:04, Simone Piccardi wrote:
>> >> On 16/04/19 14:45, Mark Clarke wrote:
>> >>> I would suggest that it's not an either-or approach. We could have a
>> >>> part that is multiple choice and a practical part. The practical
>> >>> part doesn't have to be under exam conditions. It could be a task
>> >>> like write a bash script that does x, or some other assignment. The
>> >>> student is given 2 days to do the task and submit the
>> >>> script/assignment, and the testing can be automated.
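For illustration only, a minimal sketch in Python of the kind of automated
check Mark describes. The script name, fixture file and expected output are
hypothetical, not anything LPI has specified.

    #!/usr/bin/env python3
    # Hypothetical auto-grader: run the candidate's submitted shell script
    # against a known input and compare its output to the expected result.
    import subprocess

    SUBMISSION = "submission.sh"       # hypothetical path to the candidate's script
    SAMPLE_INPUT = "sample-input.txt"  # hypothetical fixture file
    EXPECTED = "3\n"                   # expected stdout for that fixture

    def grade() -> bool:
        # Run the script with a fixed argument and a time limit.
        result = subprocess.run(
            ["bash", SUBMISSION, SAMPLE_INPUT],
            capture_output=True, text=True, timeout=30,
        )
        return result.returncode == 0 and result.stdout == EXPECTED

    if __name__ == "__main__":
        print("PASS" if grade() else "FAIL")

Even a harness this small shows where the man-hours go: someone still has to
write the fixtures, the expected outputs, and the sandboxing around them.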
>> >>>
>> >> And how do you avoid the student getting "help" from a friend?
>> >
>> >
>> > That's an excellent point.
>> >
>> > Another is how an automated tester will account for every variation
>> > that the candidate might produce. Perhaps a candidate needs to
>> > validate an IP address (sensible) and naturally uses Python with
>> > netaddr. Automated testing is likely to fail, and the assignment,
>> > while correct, is marked wrong. Now manual intervention is needed, and
>> > that means salaries. The cost of an exam just multiplies many times.
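As a concrete illustration of Alan's point (a sketch only; the netaddr calls
are an assumption about that third-party package's API, not part of any LPI
material): both functions below validate an IP address correctly, yet a
grader that string-matches on one particular tool or error message would
reject the other.

    # Two equally correct ways a candidate might validate an IP address.
    import ipaddress   # standard library

    import netaddr     # third-party package the candidate happens to prefer

    def is_valid_ip_netaddr(value: str) -> bool:
        try:
            netaddr.IPAddress(value)
            return True
        except (netaddr.AddrFormatError, ValueError):
            return False

    def is_valid_ip_stdlib(value: str) -> bool:
        try:
            ipaddress.ip_address(value)
            return True
        except ValueError:
            return False

    # Same answers for the same inputs, different tools and exceptions.
    print(is_valid_ip_netaddr("192.0.2.1"), is_valid_ip_stdlib("192.0.2.1"))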
>> >
>> > I've stayed out of this current discussion as it rears its head every
>> > few years and never goes anywhere. Such discussions are tiring.
>> >
>> > Someone earlier mentioned the perception that hands-on testing is
>> > better. I very much agree that it is a perception. It might not be true.
>> >
>> > So what is hands-on testing good for? It's great for testing if a
>> > candidate can perform a series of predetermined steps in response to a
>> > given situation to produce a determined result. Hence why we test
>> > student pilots with it. And electricians, scuba divers and almost
>> > every action a sailor will do on the job (when sailors can't pass
>> > these tests, other sailors die).
>> >
>> > It's why RedHat, Cisco and SuSE use practical tests - those distros
>> > provide specific tools to do specific functions and the candidate can
>> > rely on the tools to be present and work correctly. To do task X on
>> > RHEL regarding selinux, RHEL provides a tool, and it will be present
>> > on the test machine. The candidate is required to show they can drive
>> > the tool to produce the result RedHat demonstrated in the course.
>> >
>> > In truth, this has very little to do with results; it has everything
>> > to do with the tool and how it is used, and the result is a
>> > side-effect. RedHat never puts anything in their low and mid level
>> > exams that is not covered in sufficient detail in their course
>> > materials; to do so would be very unfair. You can't expect someone to
>> > perform a task they were not taught how to do.
>> >
>> > If we look at LPI's mission, we see that it is to a large degree
>> > exactly opposite to the above. LPI is not about RHEL tools, it is
>> > about the candidate proving they understand Linux systems within the
>> > scope of the level tested. Because the scope is not bound to a
>> > specific distro or release, testing has to be done on a somewhat
>> > abstract, conceptual level. There is nothing wrong with measuring the
>> > extent of conceptual knowledge and this is what LPI does.
>> >
>> > Testing conceptual knowledge is not inherently better or worse than
>> > practical testing; they are simply different. Both have their place,
>> > they answer different questions about candidates, and they should not
>> > be conflated.
>> >
>> >
_______________________________________________
lpi-examdev mailing list
[email protected]
https://list.lpi.org/mailman/listinfo/lpi-examdev
