Jerry Stuckle wrote:
On 10/15/2013 1:21 PM, Miles Fidelman wrote:
Jerry Stuckle wrote:
Programmers nowadays do not have to manage the computer's memory either,
but it seems that when they know how the low level works, they write
more robust programs.
Not necessarily. I've seen great programmers who don't know or worry
about the internals. And I've seen poor programmers who grew up
building their own hardware. There is little relationship between
knowledge of the underlying hardware and ability to program.
Unless you care about things like performance or resiliency. Gaming, big
data analysis, real-time control, anything that does physical I/O, etc.,
etc., etc.
Application programmers do not do real-time control, physical I/O,
etc. Those are system programmers. Gaming is a very specialized
area, which most programmers never get into. Same with big data
analysis - although that is normally done on large supercomputers - or
at least mainframes (some people think 1M rows of data in a database
is "big").
None of which has anything to do with the OSI layers or programming.
They are all sysadmin functions.
Again, unless you need to write network code. Or write a distributed
application.
Again, system programmers. And even with a distributed application
you don't need to know about how the network works.
Ok, we're back into semantics. You're talking about "coders" as opposed
to "software engineers" - a very limited and low-level skill set, one
level above spreadsheet jockeys. Certainly not any kind of engineering
discipline. (Also, as a definitional aside, last time I checked,
"systems programming" referred to writing operating systems and such - a
very focused, albeit complicated, activity.)
In any case....
The programmers where I'm currently working - application systems for
buses (vehicle location, engine monitoring and diagnostics, scheduling,
passenger information) -- yeah, they have to worry about things like how
often vehicles send updates over-the-air, the vagaries of data
transmission over cell networks (what failure modes to account for),
etc., etc., etc.
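To make the point concrete: a receiver for those over-the-air vehicle updates has to assume messages arrive duplicated, delayed, or out of order. Here is a minimal sketch of one standard defense (per-vehicle sequence numbers); the field names and message shape are illustrative, not the actual system's:

```python
def apply_update(state, update):
    """Keep only the newest report per vehicle.

    Cell networks reorder and duplicate messages, so each update
    carries a sequence number and the receiver discards anything
    older than what it has already seen.
    """
    vid, seq = update["vehicle_id"], update["seq"]
    last = state.get(vid)
    if last is not None and seq <= last["seq"]:
        return False  # duplicate or out-of-order: ignore it
    state[vid] = update
    return True

state = {}
apply_update(state, {"vehicle_id": "bus42", "seq": 7, "pos": (41.5, -71.3)})
apply_update(state, {"vehicle_id": "bus42", "seq": 5, "pos": (41.4, -71.2)})
# The seq-5 report is stale; state still holds the seq-7 position.
```

A programmer who doesn't know the transport can drop, duplicate, and reorder won't think to write even this much.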
When I worked on military systems - trainers, weapons control, command &
control, intelligence, ..... - you couldn't turn your head without
having to deal with "real world" issues - both of the hardware and
networks one was running on, and the external world you had to interact
with.
If you think anybody can code a halfway decent distributed application,
without worrying about latency, transmission errors and recovery,
network topology, and other aspects of the underlying "stuff" - I'd sure
like some of what you've been smoking.
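"Transmission errors and recovery" isn't abstract, either. The bare minimum a distributed call needs is retry with exponential backoff and jitter; here's a sketch, with a fake remote call standing in for whatever RPC or socket layer you actually use:

```python
import random
import time

class TransientNetworkError(Exception):
    """Stand-in for a timeout or dropped connection."""

def call_remote(attempt):
    # Hypothetical remote call: fails on the first two tries
    # to simulate an unreliable link.
    if attempt < 2:
        raise TransientNetworkError("link dropped")
    return "ok"

def call_with_retries(max_attempts=5, base_delay=0.01):
    """Retry with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return call_remote(attempt)
        except TransientNetworkError:
            if attempt == max_attempts - 1:
                raise
            # Back off exponentially; the random jitter keeps many
            # clients from retrying in lockstep after an outage.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))

print(call_with_retries())  # "ok" after two simulated failures
```

And that's before you get to idempotency, partial failure, and deciding what "timed out" even means for your protocol.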
Oh, and by the way, an awful lot of big data applications are run on
standard x86 hardware - in clusters and distributed across networks.
Things like network topology and file system organization (particularly
vis-a-vis how data is laid out on disk) REALLY impact performance.
I might also mention all the folks who have been developing algorithms
that take advantage of the unique characteristics of graphics processors
(or is algorithm design outside your definition of "programming" as well?).
Miles Fidelman
--
In theory, there is no difference between theory and practice.
In practice, there is. .... Yogi Berra
--
To UNSUBSCRIBE, email to debian-user-requ...@lists.debian.org
with a subject of "unsubscribe". Trouble? Contact listmas...@lists.debian.org
Archive: http://lists.debian.org/525d88db.3090...@meetinghouse.net