I like the approach my university took for its CS degree: learning a
language was up to you.

You'd get no credit for taking a course in any particular computer
language (it wouldn't count toward your degree); the little bit of
actual language instruction happened in labs and was never a formal
part of a class.

The default language at the time was Ada, which even back then was not
that practical (job-wise) but which I thought was a good language for
teaching. Some courses required you to know C, since it was key for
OS-level courses and lower-level programming tasks, and other classes
used other languages based on the subject matter.

Our professors would always tell us that languages come and go, but
that the core principles are far more stable. At the time, I didn't
fully appreciate that piece of advice; it seemed "academic" and
impractical.

Then, in one of my first interviews at a large telecom, it became
clear why this was important. They would take a room full of
developers and teach us a language in 30-45 minutes: all the basic
stuff, such as how to declare and use variables, loop constructs, and
functions. Then they would give us a test that required writing
programs in this made-up language, all on paper; nothing was compiled
or run (the language didn't exist).

To me, the test was a complete piece of cake, but I could see the vast
majority of the people in the room struggling.
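The interview language itself is lost to time, but the constructs it covered are the ones that transfer between almost any imperative language. A minimal sketch of that kind of exercise, written here in Java purely for illustration (the names and the task are my own invention, not from the actual test):

```java
// Illustrative only: a tiny program using the constructs that interview
// test covered -- declaring and using variables, a loop, and a function.
public class Basics {
    // Sum the integers from 1 to n with a plain counting loop.
    static int sumTo(int n) {
        int total = 0;                  // variable declaration and use
        for (int i = 1; i <= n; i++) {  // loop construct
            total += i;
        }
        return total;                   // function returning a value
    }

    public static void main(String[] args) {
        System.out.println(sumTo(10));  // prints 55
    }
}
```

The point of such a test isn't the syntax; anyone grounded in the fundamentals can map "variable, loop, function" onto a brand-new notation in under an hour.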

It's fine to pick some default general-purpose languages for teaching,
but I think a lot of programs obsess over this choice a bit too much
when they should be covering the fundamentals.

Having said that, I really do think a compiled language is something
everyone should be exposed to, and very early on, too.

-- 
You received this message because you are subscribed to the Google Groups "The 
Java Posse" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/javaposse?hl=en.
