Ken Foskey was once rumoured to have said:

[snip]

> If we work on numbers: my code generally runs at a 2% error rate (all
> right, I am making 2% up, but the exact number is not important...). I
> rewrote an application of 2,300 lines of C code (not a comment in sight)
> into 230 lines of Perl code (with lots of comments). With my 2% error
> rate I had 5 errors in my code (about right). The 2,300 lines would have
> had about 50 errors at the same rate. Don't use C, because you will
> write more code, and more code equals more bugs.
>
> The moral is: use the correct language for the job. Performance is just
> a few thousand dollars in hardware away. Take $40K as your base salary
> (for example) and a $5K server: you need to save about 7 weeks of work
> to pay for it. That is not a lot of time with 10 times the number of
> errors in your code. Bugs cost time in development; correcting bugs in
> production code is far, far worse.
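The back-of-envelope arithmetic in the quote checks out; here is a quick sketch of it (the 2% error rate and the $40K/$5K figures are the original poster's own assumptions, not real measurements):

```python
# Error-count and break-even arithmetic from the quoted post.
error_rate = 0.02              # the poster's assumed errors-per-line rate
perl_lines, c_lines = 230, 2_300

print(error_rate * perl_lines)  # 4.6  -> "about 5 errors" in the Perl version
print(error_rate * c_lines)     # 46.0 -> the post rounds this to 50 for C

salary, server = 40_000, 5_000  # annual base salary vs. one-off hardware cost
weeks = server / (salary / 52)  # weeks of work the server costs
print(round(weeks, 1))          # 6.5  -> "about 7 weeks" to pay for it
```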
argh! no! *spew*

Yes, C is rarely the right language. However, Perl, Python and Ruby are
also often the wrong language. [Try finding variable-name spellos or type
mismatches quickly in those languages -- in a nice polite language like C,
the compiler would find them for you, nice and quickly.]

The reason we see C being used so much is that we lack a statically typed
high-level language that doesn't completely suck ass. For some time now
I've been tempted to learn Ada95 for this purpose. Shame it's so
unpopular.

C.

-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
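The complaint about variable-name spellos can be made concrete: in a dynamically typed language the mistake only surfaces when the offending line actually executes, whereas a C compiler rejects an undeclared identifier at build time. A minimal Python sketch (the function and names are purely illustrative):

```python
# A misspelled variable name in Python is only detected at runtime,
# and only if the buggy line is actually reached by a test or a user.
# The equivalent typo in C would be a compile-time error.
def total_cost(prices):
    total = 0
    for price in prices:
        total += price
    return totl  # typo: should be 'total' -- raises NameError when called

try:
    total_cost([10, 20, 30])
except NameError as exc:
    print("only caught at runtime:", exc)
```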
