Re: Experiences/guidance on teaching Python as a first programming language
In article 20131216213225.2006b30246e3a08ee241a...@gmx.net, Wolfgang Keller felip...@gmx.net wrote: And ever after that experience, I avoided all languages that were even remotely similar to C, such as C++, Java, C#, Javascript, PHP etc. I think that's disappointing, for two reasons. Firstly, C syntax isn't that terrible. It's not just the abysmally appalling, hideously horrifying syntax. Just about everything about C is just *not* made for human beings imho. It's just an un-language that gets just about everything wrong. Sort of like Microsoft's products. Sincerely, Wolfgang I don't see how you could create a better high-level LOW-LEVEL language. And that pointer * syntax is really ingenious. (After all, the guys who created it and those who first used it (at Bell Labs) WERE all geniuses!) David -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article mailman.4286.1387291924.18130.python-l...@python.org, Neil Cerutti ne...@norwich.edu wrote: On 2013-12-17, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than the equivalent written in C. I can't think of a reference, but I seem to recall that bugs-per-line-of-code is nearly constant; it is not language dependent. So, unscientifically, the more work you can get done in a line of code, the fewer bugs you'll have per amount of work done. -- Neil Cerutti Makes no sense to me. I can't imagine that errors per 100 lines is anywhere near as high with a language that has garbage collection and type checking as with one that has neither. David -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thu, 19 Dec 2013 19:38:51 -0500, Roy Smith wrote: Does anybody ever use D? I looked at it a few years ago. It seemed like a very good concept. Sort of C++, with the worst of the crap torn out. If nothing else, with the preprocessor torn out :-) Did it ever go anywhere? Apparently Facebook are now working with it: http://www.fastcolabs.com/3019948/more-about-d-language-and-why-facebook-is-experimenting-with-it -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 12/20/13 6:58 PM, Dennis Lee Bieber wrote: On 20 Dec 2013 02:16:05 GMT, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info declaimed the following: 2) Even for kernel developers, I believe that systems languages should be safe by default. You ought to have to explicitly disable (say) bounds checking in critical sections of code, rather than explicitly enable it. Or worse, have to program your own bounds checking -- especially if the compiler is permitted to silently disregard it if you make one tiny mistake. I wonder how BLISS falls into that... Have to read the rest of http://en.wikipedia.org/wiki/BLISS (while I had 22 years on VMS, it was mostly F77, a touch of F90, C, Pascal, and some DCL; but never used BLISS) Bliss is even lower-level than C. It made the too-consistent choice of having names mean the same thing on the left-hand side of an assignment as on the right-hand side. A name meant the address of a variable, so to access the value of a variable, you had to dereference it with the dot operator, much like the unary asterisk in C.

    C:  a = b          Bliss:  a = .b
    C:  a = a + 1      Bliss:  a = .a + 1
    C:  a = *b         Bliss:  a = ..b
    C:  a = &b         Bliss:  a = b

It was far too common to forget the dots... -- Ned Batchelder, http://nedbatchelder.com -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thu, 19 Dec 2013 19:41:00 +1300, Gregory Ewing greg.ew...@canterbury.ac.nz wrote: But it's not above inferring a dereferencing operation when you call a function via a pointer. If f is a pointer to a function, then f(a) is equivalent to (*f)(a) If the compiler can do that for function calls, there's no reason it couldn't do it for member access as well. Quite right. And I recall being confounded by the function pointer syntax; it never fit in my mental model of how the rest of C worked. Anyway I was not intending to defend C choices, merely to point out an advantage this choice gave me. On a language without garbage collection, the indirection was very important to keep in mind. -- DaveA -- https://mail.python.org/mailman/listinfo/python-list
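For contrast with the C function-pointer syntax discussed above, here is a minimal Python sketch (the names are invented for illustration): in Python a function is an ordinary object, so calling it through another name involves no dereference syntax at all, and there is no f-versus-(*f) distinction for the reader to keep track of.

```python
def add(a, b):
    """An ordinary function; also an ordinary object."""
    return a + b

# Binding the function to another name needs no & or * operators...
f = add

# ...and calling through that name needs no (*f)(...) dereference.
assert f(2, 3) == 5
assert f is add  # both names refer to the very same function object
```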
Re: Experiences/guidance on teaching Python as a first programming language
I find it frustrating that Pythonistas shy away from regex as much as they do. I find regular expression syntax frustrating. ;-) As long as I have the choice, I still prefer syntax like e.g. VerbalExpressions. That's made for actual humans like me. Sincerely, Wolfgang -- https://mail.python.org/mailman/listinfo/python-list
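For what it's worth, the standard library already offers a middle ground between terse regexes and a full VerbalExpressions-style DSL: the re.VERBOSE flag lets whitespace and comments live inside the pattern. A minimal sketch (the pattern and group names are my own invention, not from the thread):

```python
import re

# re.VERBOSE ignores unescaped whitespace in the pattern and allows
# inline comments, which goes some way toward regexes "made for humans".
PHONE = re.compile(r"""
    (?P<area>\d{3})     # three-digit prefix
    -
    (?P<number>\d{4})   # four-digit line number
""", re.VERBOSE)

m = PHONE.match("555-0123")
assert m is not None
assert m.group("area") == "555"
assert m.group("number") == "0123"
```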
Re: Experiences/guidance on teaching Python as a first programming language
I've never heard C syntax reviled quite so intensely. What syntax do you like, out of curiosity? Pascal, Python, if written by someone who uses semantic identifiers and avoids using C(++)/Java-isms. I've seen Eiffel as well (without understanding it) and it didn't look ridiculous to me. Nor did a recent dialect of Cobol (since someone else mentioned it) horrify me at first sight to the point that all those C-derivatives do. I also get to use SQL a bit (instead of those query builders that I consider as garbage), although that's just for databases of course. Verbosity is definitely A Good Thing. In fact, thinking of it, a really good language should imho *require* verbosity (how about a *minimum* length - or maybe even a dictionary-based sanity check - for identifiers?), since that already keeps all those lazy morons away who think that shortcuts are cool. Sincerely, Wolfgang -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 07:23:54 -0800, Ethan Furman wrote: On 12/18/2013 12:18 AM, Steven D'Aprano wrote: And yes, I'm being pedantic. No, you're being an ass. My my, it doesn't take much of a challenge to the Holy Church Of C to bring out the personal attacks. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 19:51:26 +1100, Chris Angelico wrote: On Wed, Dec 18, 2013 at 7:18 PM, Steven D'Aprano st...@pearwood.info wrote: You want to know why programs written in C are so often full of security holes? One reason is undefined behaviour. The C language doesn't give a damn about writing *correct* code, it only cares about writing *efficient* code. Consequently, one little error, and does the compiler tell you that you've done something undefined? No, it turns your validator into a no-op -- silently: I disagree about undefined behaviour causing a large proportion of security holes. I didn't actually specify "large proportion", that's your words. But since you mention crashes: Maybe it produces some, but it's more likely to produce crashes or inoperative code. *Every* crash is a potential security hole. Not only is it a denial of service, but a fatal exception[1] is a sign that arbitrary memory has been executed as if it were code, or an illegal instruction executed. Every such crash is a potential opportunity for an attacker to run arbitrary code. There are only two sorts of bugs: bugs with exploits, and bugs that haven't been exploited *yet*. I think you are severely under-estimating the role of undefined behaviour in C in security vulnerabilities. I quote from "Silent Elimination of Bounds Checks": "Most of the security vulnerabilities described in my book, Secure Coding in C and C++, Second Edition, are the result of exploiting undefined behavior in code." http://www.informit.com/articles/article.aspx?p=2086870 Undefined behaviour interferes with the ability of the programmer to understand causality with respect to his source code. That makes bugs of all sorts more likely, including buffer overflows. Earlier this year, four researchers at MIT analysed how undefined behaviour is affecting software, and they found that C compilers are becoming increasingly aggressive at optimizing such code, resulting in more bugs and vulnerabilities. 
They found 32 previously unknown bugs in the Linux kernel, 9 in Postgres and 5 in Python. http://www.itworld.com/security/380406/how-your-compiler-may-be-compromising-application-security I believe that the sheer number of buffer overflows in C is more due to the language semantics than the (lack of) skill of the programmers. C the language pushes responsibility for safety onto the developer. Even expert C programmers cannot always tell what their own code will do. Why else do you think there are so many applications for checking C code for buffer overflows, memory leaks, buggy code, and so forth? Because even expert C programmers cannot detect these things without help, and they don't get that help from the language or the compiler. [...] Apart from the last one (file system atomicity, not a C issue at all), every single issue on that page comes back to one thing: fixed-size buffers and functions that treat a char pointer as if it were a string. In fact, that one fundamental issue - the buffer overrun - comes up directly when I search Google for 'most common security holes in c code' I think that you have missed the point that buffer overflows are often a direct consequence of the language. For example: http://www.kb.cert.org/vuls/id/162289 Quote: "Some C compilers optimize away pointer arithmetic overflow tests that depend on undefined behavior without providing a diagnostic (a warning). Applications containing these tests may be vulnerable to buffer overflows if compiled with these compilers." The truly frightening thing about this is that even if the programmer tries to write safe code that checks the buffer length, the C compiler is *allowed to silently optimize that check away*. Python is actually *worse* than C in this respect. You've got to be joking. I know this particular one is reasonably well known now, but how likely is it that you'll still see code like this:

    def create_file():
        f = open(..., 'w')
        f.write(...)
        f.write(...)
        f.write(...)
Looks fine, is nice and simple, does exactly what it should. And in (current versions of) CPython, this will close the file before the function returns, so it'd be perfectly safe to then immediately read from that file. But that's undefined behaviour. No it isn't. I got chastised for (allegedly) conflating undefined and implementation-specific behaviour. In this case, whether the file is closed or not is clearly implementation-specific behaviour, not undefined. An implementation is permitted to delay closing the file. It's not permitted to erase your hard drive. Python doesn't have an ISO standard like C, so where the documentation doesn't define the semantics of something, CPython behaves as the reference implementation. CPython allows you to simultaneously open the same file for reading and writing, in which case subsequent reads and writes will deterministically depend on the precise timing of when
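However that dispute is resolved, the portable way to make the example safe on any implementation is to close the file deterministically with a with block; a minimal sketch (the temporary path and file contents are my own, the original's elided filename and data are left unfilled):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt")

def create_file():
    # The context manager closes the file when the block exits -- on
    # CPython, PyPy, Jython and IronPython alike -- with no reliance on
    # reference-counting or garbage-collector timing.
    with open(path, "w") as f:
        f.write("line one\n")

create_file()
# Safe to re-read immediately: the file is guaranteed to be closed.
with open(path) as f:
    assert f.read() == "line one\n"
```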
Re: Experiences/guidance on teaching Python as a first programming language
On Fri, Dec 20, 2013 at 3:20 AM, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: On Wed, 18 Dec 2013 19:51:26 +1100, Chris Angelico wrote: On Wed, Dec 18, 2013 at 7:18 PM, Steven D'Aprano st...@pearwood.info wrote: You want to know why programs written in C are so often full of security holes? One reason is undefined behaviour. The C language doesn't give a damn about writing *correct* code, it only cares about writing *efficient* code. Consequently, one little error, and does the compiler tell you that you've done something undefined? No, it turns your validator into a no-op -- silently: I disagree about undefined behaviour causing a large proportion of security holes. I didn't actually specify "large proportion", that's your words. But since you mention crashes: You implied that it's a significant cause of security holes. I counter by saying that most security holes come from well-defined behaviour. I think you are severely under-estimating the role of undefined behaviour in C in security vulnerabilities. I quote from "Silent Elimination of Bounds Checks": "Most of the security vulnerabilities described in my book, Secure Coding in C and C++, Second Edition, are the result of exploiting undefined behavior in code." http://www.informit.com/articles/article.aspx?p=2086870 I don't intend to buy the book to find out what he's talking about. All I know is that the one single most common cause of problems in C, the buffer overrun, is NOT exploiting undefined behavior, and nor are several other common problems (as described in my previous message). Earlier this year, four researchers at MIT analysed how undefined behaviour is affecting software, and they found that C compilers are becoming increasingly aggressive at optimizing such code, resulting in more bugs and vulnerabilities. They found 32 previously unknown bugs in the Linux kernel, 9 in Postgres and 5 in Python. 
http://www.itworld.com/security/380406/how-your-compiler-may-be-compromising-application-security Yes, those are issues. Not nearly as large as the ones that _don't_ involve your compiler hurting you, except that CPython had proper memory-usage discipline and didn't have the more glaring bugs. I believe that the sheer number of buffer overflows in C is more due to the language semantics than the (lack of) skill of the programmers. C the language pushes responsibility for safety onto the developer. Even expert C programmers cannot always tell what their own code will do. Why else do you think there are so many applications for checking C code for buffer overflows, memory leaks, buggy code, and so forth? Because even expert C programmers cannot detect these things without help, and they don't get that help from the language or the compiler. I agree. The lack of a native string type is fundamental to probably 99% of C program bugs. (Maybe I'm exaggerating, but I reckon it'll be ball-park.) But at no point do these programs or programmers *exploit* undefined behaviour. They might run into it when things go wrong, but by that time, things have already gone wrong. Example:

    int foo() {
        char buffer[80];
        gets(buffer);
        return buffer[0] == 'A';
    }

So long as the user enters no more than 79 characters, this function's perfectly well defined. It's vulnerable because user input can trigger a problem, but if anyone consciously exploits compiler-specific memory layouts, it's the attacker, and *NOT* the original code. On the flip side, this code actually does depend on undefined behaviour:

    int bar() {
        char buffer[5];
        char tmp;
        memset(buffer, 0, 6);
        return tmp;
    }

This code is always going to go past its buffer, and if 'tmp' happens to be the next thing in memory, it'll be happily zeroed. I'm pretty sure I saw code like this on thedailywtf.com a while back. Python is actually *worse* than C in this respect. You've got to be joking. 
Trolling, more than joking, but as usual, there is a grain of truth in what I say. I know this particular one is reasonably well known now, but how likely is it that you'll still see code like this:

    def create_file():
        f = open(..., 'w')
        f.write(...)
        f.write(...)
        f.write(...)

Looks fine, is nice and simple, does exactly what it should. And in (current versions of) CPython, this will close the file before the function returns, so it'd be perfectly safe to then immediately read from that file. But that's undefined behaviour. No it isn't. I got chastised for (allegedly) conflating undefined and implementation-specific behaviour. In this case, whether the file is closed or not is clearly implementation-specific behaviour, not undefined. An implementation is permitted to delay closing the file. It's not permitted to erase your hard drive. The problem is that delaying closing the file is a potentially major issue, if the file is about to be reopened. And it _is_ undefined behaviour that one particular Python
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 17:33:49 -0500, Terry Reedy wrote: On 12/18/2013 3:18 AM, Steven D'Aprano wrote: We don't know what locals()['spam'] = 42 will do inside a function, I am mystified that you would write this. Context is everything. locals() doesn't just return any old dictionary. It returns a dictionary of variables. The docs say locals() will "Update and return a dictionary representing the current local symbol table." The only thing unspecified is the relation between the 'current local symbol table' and the *dict* that 'represents' it. Precisely. Given that a dict is returned, the rest is unambiguous. unlike the C case, we can reason about it: - it may bind 42 to the name spam; somedict['spam'] = 42 will do exactly that. We're not talking about setting items in an arbitrary dict. We're talking about setting variables using locals(), and in that case, writing to locals() does not guarantee to bind the value to the *name*.

    def test():
        spam = 23
        locals()['spam'] = 42
        assert spam == 42

test() passes the assertion in IronPython 2.6, but fails in CPython 2.7 and 3.4, and Jython 2.5. - it may raise a runtime exception; Absolutely not. I don't know of any Python implementation which does so, but the documentation says "The contents of this dictionary should not be modified", so it is hardly beyond the realm of possibility that some implementation may choose to treat it as an error and raise an exception. - it may even be a no-op; Absolutely not. In the example I show above, it is a no-op. The dict returned by locals() is modified and then immediately garbage-collected. There are no side-effects. Should some implementation decide to compile that away as dead code, it would be perfectly allowed to. (Well, assuming that it determined first that locals() actually was the built-in and not some substitute, either by static analysis or runtime testing.) It wouldn't surprise me if PyPy was capable of doing that today. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
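The CPython behaviour described in that exchange is easy to demonstrate for yourself; a minimal sketch (CPython-specific, since as noted above IronPython behaves differently):

```python
def demo():
    spam = 23
    # In CPython, locals() returns a dict that is merely a snapshot of
    # the function's local symbol table; writing to it does not rebind
    # the local name.
    locals()['spam'] = 42
    return spam

# The write to locals() was silently discarded.
assert demo() == 23
```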
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 17:15:30 +, Mark Lawrence wrote: On 18/12/2013 08:18, Steven D'Aprano wrote: The C99 standard lists 191 different kinds of undefined behavior, including what happens when there is an unmatched ' or " on a line of source code. No compile-time error, no run-time error, just blindingly fast and correct (according to the standard) code that does the wrong thing. Plenty of compile-time warnings depending on the compiler, which the CPython core devs take a great deal of trouble to eliminate on every buildbot. Correct. The *great deal of trouble* part is important. Things which are the responsibility of the language and compiler in (say) Java, D, Rust, Go, etc. are the responsibility of the programmer with C. I mention these languages as they are all intended to be safer languages than C while still being efficient. Whether they succeed or not is another question. Now, I wish to be absolutely clear. There are certain programming areas where squeezing out every last iota of performance is important, and to do so may require making some compromises on correctness or safety. I find the C standard's position on undefined behaviour to be irresponsible, but, hey, maybe it is justified on the basis that C is a systems language intended for use in writing performance-critical operating system kernels, device drivers and similar. It's fine for Python to promise that nothing you do will ever cause a segfault, but for a language used to write kernels and device drivers, you probably want something more powerful and less constrained. But why is so much non-performance critical code written in C? Why so many user-space applications? History has shown us that the decision to prefer efficiency-by-default rather than correctness-by-default has been a disaster for software safety and security. 
The C language is practically the embodiment of premature optimization: the language allows compilers to silently throw your code away in order to generate efficient code by default, whether you need it or not. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Fri, Dec 20, 2013 at 4:06 AM, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: Should some implementation decide to compile that away as dead code, it would be perfectly allowed to. (Well, assuming that it determined first that locals() actually was the built-in and not some substitute, either by static analysis or runtime testing.) Hmm. I'm not sure how safe it is to optimize that sort of thing away in Python. Is there any way to be truly sure that locals is still the built-in, and if there isn't, is there any advantage to optimizing it out with some sort of check to see if it should be de-optimized now? But yes, in theory you're right. Mutating locals() could be optimized out. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thursday, December 19, 2013 9:46:26 AM UTC+5:30, Roy Smith wrote: rusi wrote: Soon the foo has to split into foo1.c and foo2.c. And suddenly you need to understand: 1. Separate compilation 2. Make (which is separate from 'separate compilation') 3. Header files and libraries and the connection and difference It's pretty common here to have people ask questions about how import works. How altering sys.path affects import. Why is import not finding my module? You quickly get into things like virtualenv, and now you've got modules coming from your source tree, from your virtualenv, from your system library. You need to understand all of that to make it all work. Yes agreed. Python is far from stellar in this regard. Just as distutils got into the core at 2.3(??) now at 3.3 virtualenv(+pip+wheel) is getting in. Belated but better late than never. None of that is specific to C. Virtually any language (including Python) allows a program to be split up into multiple source files. If you're running all but the most trivial example, you need to know how to manage these multiple files and how the pieces interact. That's a strange thing to say. In the abstract every language that allows for significant programs supports separate units/modules. Somewhere those units will map onto system entities -- usually though not always files (think of PL-SQL inside Oracle). Even assuming files, the lines drawn between interior (to the language) and exterior (OS-facing) are vastly different. C, Pascal, Python, Java, SML, APL -- all very different in this regard. Just adding this: Different languages do their modularizing and packaging differently (what I earlier said) in order to achieve different tradeoffs. Here's a thread by a competent programmer who switched from Lisp to C++. https://groups.google.com/forum/#!topic/ledger-cli/Mjky9AvrRKU He clearly says that while he loves Lisp the language, its packaging facilities lost out to C++ and so he rewrote his whole app in C++. 
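The sys.path mechanics people keep asking about can be shown in a few lines; a minimal sketch (the module name and contents are invented for the demonstration):

```python
import os
import sys
import tempfile

# Create a throwaway directory and put a one-line module in it.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "greeting.py"), "w") as f:
    f.write("MESSAGE = 'hello from greeting'\n")

# import searches the directories listed in sys.path, in order;
# prepending our directory makes the new module findable.
sys.path.insert(0, tmpdir)
import greeting

assert greeting.MESSAGE == 'hello from greeting'
```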
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Fri, Dec 20, 2013 at 4:12 AM, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: But why is so much non-performance critical code written in C? Why so many user-space applications? Very good question! I don't have an answer. There are a few maybe-answers, but they mostly come down to "the programmer didn't know of a viable alternative". When I wrote RosMud, I wrote it in C++, because I was making tweaks to an existing C++ program (Gmud) and because I thought that that was the way to make it run fast enough for what I needed. Seven years on (or will be, come January), I've learned how much can be done in Python, Pike, and other high level languages, and RosMud's successor is not written in C. Maybe part of the answer comes from people who've learned based on old hardware. Growing up in the 80s on an Epson XT-clone, I wrote code in BASIC, C, and assembly. Now, most of my performance problems in BASIC were because of flawed algorithms (it's amazing how slowly an O(n*n) algorithm will run, isn't it!), but I could imagine someone growing up learning "C is the only way to make code run fast" and then going on to teach the next generation of programmers to use C, without necessarily even explaining why. But that's just speculation. All I know is, even if you do need to write in C for some reason (your preferred language doesn't have bindings for some library, maybe), chances are you can write the tiniest bit of code that way, and do the rest in a high level language. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
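One concrete way to "write the tiniest bit in C and do the rest in a high level language" is ctypes from the standard library; a sketch calling sqrt from the C math library (the library lookup assumes a Unix-like system, and the libm.so.6 fallback assumes Linux):

```python
import ctypes
import ctypes.util

# Locate and load the C math library (e.g. libm.so.6 on Linux).
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature: double sqrt(double).  Without this, ctypes
# would default to treating arguments and the result as C ints.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

assert libm.sqrt(9.0) == 3.0
```

The same pattern scales up: keep the hot loop or the library binding in C, and drive it from Python.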
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-19, Chris Angelico ros...@gmail.com wrote: On Fri, Dec 20, 2013 at 4:12 AM, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: But why is so much non-performance critical code written in C? Why so many user-space applications? Very good question! I don't have an answer. There are a few maybe-answers, but they mostly come down to programmer didn't know of a viable alternative. I believe it was Andrew Plotkin (glk, Glulxe, lots of other stuff) who said that writing good C requires something like brain-damage. Once you have acquired the brain-damage, writing C code is no problem; in fact, it feels darn good. And another thing: How many other languages have their very own calling convention? -- Neil Cerutti -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Fri, Dec 20, 2013 at 5:40 AM, Neil Cerutti ne...@norwich.edu wrote: And another thing: How many other languages have their very own calling convention? Pascal does (sometimes called the Win32 convention). ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
Wolfgang Keller wrote: In fact, thinking of it, a really good language should imho *require* verbosity (how about a *minimum* length - or maybe even a dictionary-based sanity check - for identifiers?), since that already keeps all those lazy morons away who think that shortcuts are cool. No, that wouldn't be a really good language, that would be a language designed by someone with a very shallow understanding of what makes programs understandable. A piece of code such as

    for (i = 0; i < numThings; i++)
        total[i] += things[i];

is NOT improved by rewriting it as

    for (theLoopIndex = 0; theLoopIndex < numThings; theLoopIndex++)
        total[theLoopIndex] += things[theLoopIndex];

Quite the reverse, IMO. -- Greg -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Fri, Dec 20, 2013 at 7:42 AM, Gregory Ewing greg.ew...@canterbury.ac.nz wrote: A piece of code such as

    for (i = 0; i < numThings; i++)
        total[i] += things[i];

is NOT improved by rewriting it as

    for (theLoopIndex = 0; theLoopIndex < numThings; theLoopIndex++)
        total[theLoopIndex] += things[theLoopIndex];

Quite the reverse, IMO. Wholeheartedly agreed. The only improvement I would make would be to declare i in the for loop (valid in C++ and some of the more recent C standards), to emphasize the locality of the variable. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
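In Python the naming debate largely dissolves, because the index variable disappears altogether; a minimal sketch of the same accumulation (list contents are invented, the names total and things come from the C example):

```python
things = [1, 2, 3]
total = [10, 20, 30]

# No loop index to name, tersely or verbosely: iterate the pairs
# directly and build the updated totals.
total = [t + x for t, x in zip(total, things)]

assert total == [11, 22, 33]
```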
Re: Experiences/guidance on teaching Python as a first programming language
On Thu, 19 Dec 2013 04:50:54 +, Mark Lawrence wrote: If C is such a crap language, what does it says for the thousands of languages that never got anywhere? Or did C simply have a far larger sales and marketing budget? :) The sociology of computer languages is a fascinating topic. Like any technology, it's a mix of factors. Why did VHS defeat Betamax when all the experts agreed Betamax was the better system? How did Windows take over IT? The advantages of C in the 1970s and 80s included:

- although portable C code is a sad joke, compared to most of the languages that came before it, C *is* portable;
- C compilers can be small, efficient and fast, although they weren't as small, efficient and fast as (say) TurboPascal;
- the machine code they generated was acceptably lightweight and fast, although not as lightweight and fast as (say) Forth;
- C was an open standard at a time when computing was big enough that open standards were becoming important;
- C did (and still does) have some areas where it is quite advantageous, like systems programming;
- C benefited from its close association with Unix, where Unix went, so did C;
- Unix made some universities a lot of money, hence they had a motive to support C with both money and attention;
- C was associated with universities, so people learned C and then taught C to the next generation of students, who went on to introduce C to industry; and
- C (like Perl) falls into the hacker-machismo sweet-spot, where it is just challenging enough to still be fun without being either too easy or too hard. It is low-level enough to allow premature optimization (without being as low as assembly language, which is too low-level to be fun) and gives the freedom to play code golf and write amazingly obfuscated code. So C is a language that allows hackers to show off.

Some of those reasons also applied to Lisp, and remember that in the 1970s and even 80s Lisp compilers were at least as efficient as C compilers. 
I believe there are two factors that lead to C becoming more popular than Lisp. The first is Worse Is Better: http://www.jwz.org/doc/worse-is-better.html The second is that, despite all the weird punctuation and digraphs and even trigraphs, C fits the mental space of English-speakers better than Lisp. To the average programmer, C is a more natural syntax and programming model than Lisp. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article 52b365b6$0$6512$c3e8da3$54964...@news.astraweb.com, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: [some stuff] where Unix went, so did C; [some more stuff] What he said. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article 52b328f7$0$6512$c3e8da3$54964...@news.astraweb.com, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: Correct. The *great deal of trouble* part is important. Things which are the responsibility of the language and compiler in (say) Java, D, Rust, Go, etc. are the responsibility of the programmer with C. Does anybody ever use D? I looked at it a few years ago. It seemed like a very good concept. Sort of C++, with the worst of the crap torn out. If nothing else, with the preprocessor torn out :-) Did it ever go anywhere? Now, I wish to be absolutely clear. There are certain programming areas where squeezing out every last iota of performance is important, and to do so may require making some compromises on correctness or safety. I find the C standard's position on undefined behaviour to be irresponsible, but, hey, maybe it is justified on the basis that C is a systems language intended for use in writing performance-critical operating system kernels, device drivers and similar. It's fine for Python to promise that nothing you do will ever cause a segfault, but for a language used to write kernels and device drivers, you probably want something more powerful and less constrained. I disagree entirely (but respectfully). If you want to get down to the hardware where you can fiddle bits, you want as little getting between you and the silicon as possible. Every time you add a safety feature, you put another layer of *stuff* between you and the machine. That's not to say it's the right language to be writing applications. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Fri, 20 Dec 2013 00:38:51 -, Roy Smith r...@panix.com wrote: I disagree entirely (but respectfully). If you want to get down to the hardware where you can fiddle bits, you want as little getting between you and the silicon as possible. Every time you add a safety feature, you put another layer of *stuff* between you and the machine. That's not to say it's the right language to be writing applications. +1 -- Rhodri James *-* Wildebeest Herder to the Masses -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thu, 19 Dec 2013 19:38:51 -0500, Roy Smith wrote: In article 52b328f7$0$6512$c3e8da3$54964...@news.astraweb.com, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: Correct. The *great deal of trouble* part is important. Things which are the responsibility of the language and compiler in (say) Java, D, Rust, Go, etc. are the responsibility of the programmer with C. Does anybody ever use D? I looked at it a few years ago. It seemed like a very good concept. Sort of C++, with the worst of the crap torn out. If nothing else, with the preprocessor torn out :-) Did it ever go anywhere? There are still people using D. Like most niche languages, it's in a niche :-) Now, I wish to be absolutely clear. There are certain programming areas where squeezing out every last iota of performance is important, and to do so MAY REQUIRE MAKING SOME COMPROMISES ON CORRECTNESS OR SAFETY. I find the C standard's position on undefined behaviour to be irresponsible, but, hey, MAYBE IT IS JUSTIFIED on the basis that C is a systems language intended for use in writing performance-critical operating system kernels, device drivers and similar. It's fine for Python to promise that nothing you do will ever cause a segfault, but for a language used to write kernels and device drivers, you probably want something more powerful and less constrained. Emphasis added. I disagree entirely (but respectfully). If you want to get down to the hardware where you can fiddle bits, you want as little getting between you and the silicon as possible. Every time you add a safety feature, you put another layer of *stuff* between you and the machine. I think that if you re-read what I wrote, you actually agree with me. With the following two provisos: 1) There is a tendency among some programmers to premature optimization and coding machismo where correctness is a distant fourth place behind speed, memory use, and code size -- and security doesn't even place. 
For those programmers, "I want to get down to the hardware" often has nothing to do with *needing* to get down to the hardware. Screw 'em. 2) Even for kernel developers, I believe that systems languages should be safe by default. You ought to have to explicitly disable (say) bounds checking in critical sections of code, rather than explicitly enable it. Or worse, have to program your own bounds checking -- especially if the compiler is permitted to silently disregard it if you make one tiny mistake. That's not to say it's the right language to be writing applications. I find it interesting to note that the utter failure of C programmers to deal with the consequences of buffer overflows has led Intel to put pointer safety into hardware. http://software.intel.com/en-us/articles/introduction-to-intel-memory-protection-extensions There is little reason to believe that a safer language would necessarily be slower in practice. C has had 40 years of development to get to where it is now. With a fraction of the development of C, Scala code gets to within a factor of 2-3 of the equivalent C++ code, Go to within a factor of 5-7, and even Java to within a factor of 3-4. Okay, so Java has had oodles of optimization development too, so that's probably about as good as it will get. Imagine if newer languages like Go and Rust had even a quarter of the development effort that C and C++ have had. http://readwrite.com/2011/06/06/cpp-go-java-scala-performance-benchmark -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, 17 Dec 2013 22:49:43 -0500, Paul Smith wrote: On Wed, 2013-12-18 at 01:33 +, Steven D'Aprano wrote: And "What does 'implementation-specific undefined behaviour' actually mean in practice?", another common question when dealing with C. Only asked by people who haven't had it explained. There's undefined behavior, and there's implementation-specific behavior, but it is impossible to have implementation-specific undefined behavior. Of course it is possible. An implementation has to do *something*, even if it's not defined anywhere. Even if that something is to crash, or emit no code, or display a compile-time error. I think you're making a distinction that doesn't apply to the plain-English meaning of the words I was using: the behaviour is undefined by the standard and specific to that implementation. And, the definitions are simple to understand: undefined behavior means that if your program invokes it, there is no definition of what will happen. This is buggy code. Yes, it is buggy code, but nevertheless it often works the way people expect it to work, and so through carelessness or ignorance programmers rely on it. If you've ever written i+1 without a guard for the case that i is INT_MAX, you're guilty of that too. The C99 standard lists 191 different kinds of undefined behavior, including what happens when there is an unmatched ' or " on a line of source code. You want to know why programs written in C are so often full of security holes? One reason is undefined behaviour. The C language doesn't give a damn about writing *correct* code, it only cares about writing *efficient* code. Consequently, one little error, and does the compiler tell you that you've done something undefined? No, it turns your validator into a no-op -- silently: http://code.google.com/p/nativeclient/issues/detail?id=245 No compile-time error, no run-time error, just blindingly fast and correct (according to the standard) code that does the wrong thing.
Implementation-specific behavior means that the standard requires the implementation to do some well-defined thing, but the standard does not define exactly what it must be. You can go look up what your implementation will do in its documentation (the standard requires that it be documented), but you can't assume the same thing will happen in another implementation. This is non-portable code. So much for the promise of C to be portable :-) It's a very rare language indeed that has no undefined or implementation-specific behaviors. Java? Ada? But indeed, most languages do have odd corners where odd things happen. Including Python, as you point out. But C has so many of them, and they affect *nearly everything*. The aim of C is to write fast code, and if it happens to be correct, that's a bonus. C compilers will compromise on safety and correctness in order to be fast. The end result is usually one of two outcomes: - the programmer spends a lot of time and effort to manually guard against the undefined behaviour, thus slowing down the code; - or he doesn't, and has bugs and security vulnerabilities in the code. Python gets to cheat by having one reference implementation. Every time you've had to go try something out in the Python interpreter because the documentation didn't provide the details you needed, that WAS implementation-specific behavior. The situation is quite different though. Python makes at least one implicit promise: nothing you write in pure Python can possibly cause a segfault. No buffer overflows for you! We don't know what locals()['spam'] = 42 will do inside a function, but unlike the C case, we can reason about it: - it may bind 42 to the name spam; - it may raise a runtime exception; - it may even be a no-op; But even if it is a no-op, the Python compiler doesn't have carte blanche to do anything it likes with the entire function, as a C compiler has. C has more indeterminacy than Python.
You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. "The lifetime of objects in some language with no objects is ONE MILLION YEARS!!!" is as good as any other vacuously true statement. The implication that only an object oriented language could have a concept of object lifetimes is false. Only object-oriented languages have *objects*. C does not have objects, it has values. And yes, I'm being pedantic. [...] or be mystified by which of the 7 signatures of Foo.foo() are going to get called, Is that even possible in C? If Foo is a struct, and Foo.foo a member, I don't think C has first-class functions and so Foo.foo can't be callable. Of course that's valid C. It's true that C doesn't have first-class functions, but it supports invoking functions through pointers and you can store functions in data members, pass functions as arguments, and return functions from other functions.
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 13:11:58 +1100, Chris Angelico wrote: On Wed, Dec 18, 2013 at 12:33 PM, Steven D'Aprano st...@pearwood.info wrote: On Tue, 17 Dec 2013 19:32:20 -0500, Roy Smith wrote: There's very few mysteries in C. Apart from "What the hell does this piece of code actually do?". It's no coincidence that C, and Perl which borrows a lot of syntax from C, are the two champion languages for writing obfuscated code. I thought APL would beat both of them, though you're right that the International Obfuscated Python Code Contest would be a quite different beast. But maybe it'd be just as viable... a competent programmer can write unreadable code in any language. And "What does 'implementation-specific undefined behaviour' actually mean in practice?", another common question when dealing with C. You mean like mutating locals()? The only difference is that there are a lot more implementations of C than there are of Python (especially popular and well-used implementations). There are plenty of things you shouldn't do in Python, but instead of calling them implementation-specific undefined behaviour, we call them "consenting adults" and "shooting yourself in the foot". And most importantly, how many asterisks do I need, and where do I put them? (only half joking). The one differentiation that I don't like is between the . and -> operators. The distinction feels like syntactic salt. There's no context when both are valid, save in C++ where you can create a pointer-like object that implements the -> operator (and has the . operator for its own members). You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. "The lifetime of objects in some language with no objects is ONE MILLION YEARS!!!" is as good as any other vacuously true statement. Lifetime still matters.
The difference between automatic and static variables is lifetime - you come back into this function and the same value is there waiting for you. Call it values or things instead of objects if it makes you feel better, but the consideration is identical. (And in C++, it becomes critical, with object destructors being used to release resources. So you need to know.) or be mystified by which of the 7 signatures of Foo.foo() are going to get called, Is that even possible in C? If Foo is a struct, and Foo.foo a member, I don't think C has first-class functions and so Foo.foo can't be callable. But if I'm wrong, and it is callable, then surely with no arguments there can only be one signature that Foo.foo() might call, even if C supported generic functions, which I don't believe it does. Well, okay. In C you can't have Foo.foo(). Hah, well according to Paul Smith's example code you can. So either: - it's possible to be an experienced C programmer and still have fundamental gaps in your knowledge about basic concepts like dotted function calls; - or Paul's sample code was not what he claimed it to be; - or maybe the whole thing is undefined and we're all right! C both does and doesn't allow Foo.foo() function calls, *sometimes at the same time*. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 7:22 PM, Steven D'Aprano st...@pearwood.info wrote: Well, okay. In C you can't have Foo.foo(). Hah, well according to Paul Smith's example code you can. So either: - it's possible to be an experienced C programmer and still have fundamental gaps in your knowledge about basic concepts like dotted function calls; - or Paul's sample code was not what he claimed it to be; - or maybe the whole thing is undefined and we're all right! C both does and doesn't allow Foo.foo() function calls, *sometimes at the same time*. - or it's possible to be an experienced C and C++ programmer and have your mind just blank out about which things you can do in C and which were added in C++, especially when you've been up all night and are wired on energy drinks. Mea culpa. My brain looked at that and thought it was a member function, which is a C++ feature, and forgot that it could be a straight-up function pointer, which is a C feature. Paul's correct, I'm wrong. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 7:18 PM, Steven D'Aprano st...@pearwood.info wrote: You want to know why programs written in C are so often full of security holes? One reason is undefined behaviour. The C language doesn't give a damn about writing *correct* code, it only cares about writing *efficient* code. Consequently, one little error, and does the compiler tell you that you've done something undefined? No, it turns your validator into a no-op -- silently: I disagree about undefined behaviour causing a large proportion of security holes. Maybe it produces some, but it's more likely to produce crashes or inoperative code. The example you cite is a rare one where it's actually security code; that's why it's a security vulnerability. If the same operation were flagging, say, that this value needed to be saved to disk, then the same bug would result in stuff not getting saved on that platform, which isn't a security risk. Here's something from CERN about C and security: https://security.web.cern.ch/security/recommendations/en/codetools/c.shtml Apart from the last one (file system atomicity, not a C issue at all), every single issue on that page comes back to one thing: fixed-size buffers and functions that treat a char pointer as if it were a string. In fact, that one fundamental issue - the buffer overrun - comes up directly when I search Google for 'most common security holes in c code' (second hit, Wikipedia Buffer overflow page). Here's another page listing security concerns: http://www.makelinux.net/alp/085 First entry: Buffer overruns. Second: File system races. Third: Improper quoting of shell commands. Not one of the above pages, nor any other that I came across as I was skimming, mentioned anything involving undefined behaviour. Every one of them is an issue with properly-defined behaviour - unless you count specific details of memory layout. If you know that automatic variables are stored on the stack, then you can blow some buffer and overwrite the return value.
But that's the *attacker* depending on undefined behaviour, not the *programmer*, who simply has a bug in his code (something that's able to write more to the buffer than there's room for). Python is actually *worse* than C in this respect. I know this particular one is reasonably well known now, but how likely is it that you'll still see code like this:

def create_file():
    f = open(..., "w")
    f.write(...)
    f.write(...)
    f.write(...)

Looks fine, is nice and simple, does exactly what it should. And in (current versions of) CPython, this will close the file before the function returns, so it'd be perfectly safe to then immediately read from that file. But that's undefined behaviour. Python does not guarantee that this will work, so this might work for years and then break when it's run under Jython, or it might even work in Jython too, but there's just one specific situation where the file's read immediately after being written, and that one case fails. I think that's used at least as often as any of C's pieces of undefined behaviour. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tuesday, December 17, 2013 4:42:07 PM UTC+5:30, Oscar Benjamin wrote: On 17 December 2013 00:39, rusi wrote: I had a paper some years ago on why C is a horrible language *to teach with* http://www.the-magus.in/Publications/chor.pdf Thanks for this Rusi, I just read it and it describes very well what I think about our own C course. My choice quote from the beginning would be When the irrelevant becomes significant, the essentials become obscured and incomprehensible. (BTW is there any reason that the document is repeated twice in the same pdf?) Thanks for the heads-up -- some pdf generation issues I guess. Is it ok now? Yeah I could clean up the formatting some more but it's 25 years now and I've forgotten my troff!! More importantly, the tone is not what I would use today. The point I was trying to make then was: C is an unsuitable language to TEACH PROGRAMMING WITH because it fills students' brains with irrelevantia. Once one knows the stuff, C is a NEAT programming language. IOW it's a question of learning-curve not the content. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 18 December 2013 09:18, rusi rustompm...@gmail.com wrote: (BTW is there any reason that the document is repeated twice in the same pdf?) Thanks for the heads-up -- some pdf generation issues I guess Is it ok now? Yes. Also it definitely reads better without the twocolumn format. Oscar -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 18 Dec 2013 08:22:58 GMT, Steven D'Aprano st...@pearwood.info wrote: On Wed, 18 Dec 2013 13:11:58 +1100, Chris Angelico wrote: The one differentiation that I don't like is between the . and -> operators. The distinction feels like syntactic salt. There's no context when both are valid, save in C++ where you can create a pointer-like object that implements the -> operator (and has the . operator for its own members). Funny you should say that in the middle of a discussion about lifetime. In C, when you do the -> thing, you're now in a different struct with a potentially different lifetime. If p is a local, with auto lifetime, then so is p.x So, although the two are mutually exclusive, there's valuable information hidden in the required choice. -- DaveA -- https://mail.python.org/mailman/listinfo/python-list
Re: Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 11:53 PM, Dave Angel da...@davea.name wrote: Funny you should say that in the middle of a discussion about lifetime. In C, when you do the -> thing, you're now in a different struct with a potentially different lifetime. If p is a local, with auto lifetime, then so is p.x So, although the two are mutually exclusive, there's valuable information hidden in the required choice. Sure, but you can figure out whether p is a local struct or a local pointer to some other struct by looking at its declaration. Do you also need to look at every usage of it? We don't adorn every / with a marker saying whether we're dividing ints or floats, and that's something that could be potentially useful (float division of two ints being what Py3 does). Why adorn pointer usage? ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-18, Chris Angelico ros...@gmail.com wrote: On Wed, Dec 18, 2013 at 11:53 PM, Dave Angel da...@davea.name wrote: Funny you should say that in the middle of a discussion about lifetime. In C, when you do the -> thing, you're now in a different struct with a potentially different lifetime. If p is a local, with auto lifetime, then so is p.x So, although the two are mutually exclusive, there's valuable information hidden in the required choice. Sure, but you can figure out whether p is a local struct or a local pointer to some other struct by looking at its declaration. Do you also need to look at every usage of it? We don't adorn every / with a marker saying whether we're dividing ints or floats, and that's something that could be potentially useful (float division of two ints being what Py3 does). Why adorn pointer usage? Indeed. Golang allows . to do member lookup for both structs and pointers to structs. The -> syntax perhaps was needful in the days before function prototypes. -- Neil Cerutti -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 12/18/2013 12:18 AM, Steven D'Aprano wrote: On Tue, 17 Dec 2013 22:49:43 -0500, Paul Smith wrote: On Wed, 2013-12-18 at 01:33 +, Steven D'Aprano wrote: On 12/17/2013 04:32 PM, Roy Smith wrote: You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. The lifetime of objects in some language with no objects is ONE MILLION YEARS!!! is as good as any other vacuously true statement. The implication that only an object oriented language could have a concept of object lifetimes is false. Only object-oriented languages have *objects*. C does not have objects, it has values. The word 'object' has many more meanings than the one implied by Object Oriented Programming, as you well know. And yes, I'm being pedantic. No, you're being an ass. -- ~Ethan~ -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wednesday, December 18, 2013 8:53:54 PM UTC+5:30, Ethan Furman wrote: On 12/18/2013 12:18 AM, Steven D'Aprano wrote: On Tue, 17 Dec 2013 22:49:43 -0500, Paul Smith wrote: On Wed, 2013-12-18 at 01:33 +, Steven D'Aprano wrote: On 12/17/2013 04:32 PM, Roy Smith wrote: You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. "The lifetime of objects in some language with no objects is ONE MILLION YEARS!!!" is as good as any other vacuously true statement. The implication that only an object oriented language could have a concept of object lifetimes is false. Only object-oriented languages have *objects*. C does not have objects, it has values. The word 'object' has many more meanings than the one implied by Object Oriented Programming, as you well know. And yes, I'm being pedantic. No, you're being an ass. Is this discussion REALLY happening...??? In a non-programmer/layman forum it would be completely normal. However, given that we are supposedly a programmer list, I am incredulous. Here is some innocuous looking python code:

A:

def draw_helper(canvas, level, p1, p2, p3):
    if level == 1:
        canvas.create_polygon(p1, p2, p3)
    else:
        p4 = midpoint(p1, p2)
        p5 = midpoint(p2, p3)
        p6 = midpoint(p1, p3)
        draw_helper(canvas, level - 1, p1, p4, p6)
        draw_helper(canvas, level - 1, p4, p2, p5)
        draw_helper(canvas, level - 1, p6, p5, p3)

And here is what happens when you run it:

B: http://homes.cs.washington.edu/~reges/python/sierpinski8.png

(More here: http://homes.cs.washington.edu/~reges/python/) Can you really say that what you see in B you can infer from A WITHOUT RUNNING IT?? The above is the subject that is technically called 'complexity' in math terms.
If we allow the term 'complex' to be more general (like the argument about 'object') then this becomes the pain and beauty, the mystery and horror of programming -- seemingly trivial code when seen as a PROGRAM can endlessly evolve into unimaginable complexity when elaborated into a PROCESS. So when Chris/Roy are talking of the simplicity of C's lifetime rules they are talking of the primitive building blocks to make and understand program-texts. And when Steven/Devin are talking of the complexity of the same they are talking of the arcane results that emerge when those programs run. And from here it's a small step to understand why python's slightly more complicated semantics result in so much less complexity than C's seemingly simple rules: C has a double complexity generator -- stack + heap vs python only having a 'managed' heap. Analogously if the Sierpinski triangle above were flattened into 1-d there would be nothing to note about it. Like python: Boring 'weenie' language... Never segfaults -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 18/12/2013 08:18, Steven D'Aprano wrote: The C99 standard lists 191 different kinds of undefined behavior, including what happens when there is an unmatched ' or on a line of source code. No compile-time error, no run-time error, just blindingly fast and correct (according to the standard) code that does the wrong thing. Plenty of compile-time warnings depending on the compiler, which the CPython core devs take a great deal of trouble to eliminate on every buildbot. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-18, Chris Angelico ros...@gmail.com wrote: Well, okay. In C you can't have Foo.foo(). If Foo is a structure with a field named foo that is a pointer to a function, then you can indeed have Foo.foo(). -- Grant Edwards grant.b.edwards at gmail.com Yow! It's OKAY -- I'm an INTELLECTUAL, too. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-18, Roy Smith r...@panix.com wrote: In article l8pvsl$60h$1...@reader1.panix.com, Grant Edwards invalid@invalid.invalid wrote: Ideally, you should also have written at least one functioning compiler before learning C as well. Why? I've never written a compiler. I've written plenty of C. I don't see how my lack of compiler writing experience has hindered my ability to write C. I've always felt that there are features in C that don't make a lot of sense until you've actually implemented a compiler -- at which point it becomes a lot more obvious why some things are done certain ways. Maybe that's just me. I had written a compiler before I learned C, and there were things that made perfect sense to me that seemed to confuse others I worked with who were learning C at the same time. -- Grant Edwards grant.b.edwards at gmail.com Yow! It's a hole all the way to downtown Burbank! -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 18/12/2013 18:00, Grant Edwards wrote: On 2013-12-18, Chris Angelico ros...@gmail.com wrote: Well, okay. In C you can't have Foo.foo(). If Foo is a structure with a field named foo that is a pointer to a function, then you can indeed have Foo.foo(). Complete fooey :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 18/12/2013 18:05, Grant Edwards wrote: On 2013-12-18, Roy Smith r...@panix.com wrote: In article l8pvsl$60h$1...@reader1.panix.com, Grant Edwards invalid@invalid.invalid wrote: Ideally, you should also have written at least one functioning compiler before learning C as well. Why? I've never written a compiler. I've written plenty of C. I don't see how my lack of compiler writing experience has hindered my ability to write C. I've always felt that there are features in C that don't make a lot of sense until you've actually implemented a compiler -- at which point it becomes a lot more obvious why some thing are done certain ways. Maybe that's just me. I had written a compiler before I learned C, and there were things that made perfect sense to me that seemed to confuse others I worked with who were learning C at the same time. I've never contemplated writing a compiler, let alone actually written one. It's like the comments along the lines of you can't call yourself a programmer until you've mastered regular expressions. Some of my mates who work on air traffic management systems have maybe never heard of a regex but who cares, I certainly don't. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Re: Experiences/guidance on teaching Python as a first programming language
On Thu, 19 Dec 2013 01:55:10 +1100, Chris Angelico ros...@gmail.com wrote: Sure, but you can figure out whether p is a local struct or a local pointer to some other struct by looking at its declaration. Do you also need to look at every usage of it? C is a glorified macro assembler. So the -> operator is not analogous to the dot operator; it's syntactic sugar: p->a is really (*p).a -- DaveA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 12/18/2013 3:18 AM, Steven D'Aprano wrote: We don't know what locals()['spam'] = 42 will do inside a function, I am mystified that you would write this. Locals() will Update and return a dictionary representing the current local symbol table. The only thing unspecified is the relation between the 'current local symbol table' and the *dict* that 'represents' it. Given that a dict is returned, the rest is unambiguous. unlike the C case, we can reason about it: - it may bind 42 to the name spam; somedict['spam'] = 42 will do exactly that. - it may raise a runtime exception; Absolutely not. - it may even be a no-op; Absolutely not. -- Terry Jan Reedy -- https://mail.python.org/mailman/listinfo/python-list
Re: Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 14:55:10 -, Chris Angelico ros...@gmail.com wrote: On Wed, Dec 18, 2013 at 11:53 PM, Dave Angel da...@davea.name wrote: Funny you should say that in the middle of a discussion about lifetime. In C, when you do the -> thing, you're now in a different struct with a potentially different lifetime. If p is a local, with auto lifetime, then so is p.x So, although the two are mutually exclusive, there's valuable information hidden in the required choice. Sure, but you can figure out whether p is a local struct or a local pointer to some other struct by looking at its declaration. Do you also need to look at every usage of it? We don't adorn every / with a marker saying whether we're dividing ints or floats, and that's something that could be potentially useful (float division of two ints being what Py3 does). Why adorn pointer usage? Because explicit is better than implicit? -- Rhodri James *-* Wildebeest Herder to the Masses -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 18 December 2013 22:33, Terry Reedy tjre...@udel.edu wrote: On 12/18/2013 3:18 AM, Steven D'Aprano wrote: We don't know what locals()['spam'] = 42 will do inside a function, I am mystified that you would write this. Locals() will Update and return a dictionary representing the current local symbol table. The only thing unspecified is the relation between the 'current local symbol table' and the *dict* that 'represents' it. Given that a dict is returned, the rest is unambiguous. It's not unambiguous. The full wording is: ''' locals() Update and return a dictionary representing the current local symbol table. Free variables are returned by locals() when it is called in function blocks, but not in class blocks. Note: The contents of this dictionary should not be modified; changes may not affect the values of local and free variables used by the interpreter. ''' The part that says changes may ... is deliberately ambiguous; the author didn't want to impose too strong a constraint on any particular implementation. unlike the C case, we can reason about it: - it may bind 42 to the name spam; somedict['spam'] = 42 will do exactly that. That's not what is usually meant by name-binding. - it may raise a runtime exception; Absolutely not. Agreed. - it may even be a no-op; Absolutely not. Incorrect. The code in question is: locals()['spam'] = 42 and it is semantically a no-op. The index assignment on a temporary dict may actually be performed by e.g. the CPython interpreter but it is really just dead code. Oscar -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, 17 Dec 2013 15:51:44 -, Wolfgang Keller felip...@gmx.net wrote: The only issue for me was to figure out how to do in C what I already knew in Pascal. And I had to waste a *lot* more time and mental effort to mess with that language than it took for me to learn *both* the basics of programming per se *and* Pascal in the first class at my home university. It sounds like you made, and are carrying on making, one of the classic mistakes of software engineering: you're trying to write one language in the style of another. It is possible to write C code as if it were Pascal, but it's a painful process and it won't be pretty. It's far better to use a language as it is rather than as you want it to be. -- Rhodri James *-* Wildebeest Herder to the Masses -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thu, Dec 19, 2013 at 11:49 AM, Rhodri James rho...@wildebst.org.uk wrote: It's sounds like you made, and are carrying on making, one of the classic mistakes of software engineering Never get into a flame war in Asia, and never go up against a C programmer when segfaults are on the line! ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article l8so4d$snu$2...@reader1.panix.com, Grant Edwards invalid@invalid.invalid wrote: On 2013-12-18, Roy Smith r...@panix.com wrote: In article l8pvsl$60h$1...@reader1.panix.com, Grant Edwards invalid@invalid.invalid wrote: Ideally, you should also have written at least one functioning compiler before learning C as well. Why? I've never written a compiler. I've written plenty of C. I don't see how my lack of compiler writing experience has hindered my ability to write C. I've always felt that there are features in C that don't make a lot of sense until you've actually implemented a compiler -- at which point it becomes a lot more obvious why some things are done certain ways. Example? I suspect what you mean is, There are some things that don't make sense until you understand computer architecture. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article mailman.4372.1387390692.18130.python-l...@python.org, Mark Lawrence breamore...@yahoo.co.uk wrote: I've never contemplated writing a compiler, let alone actually written one. It's like the comments along the lines of you can't call yourself a programmer until you've mastered regular expressions. Who makes comments like that? As far as I can tell, I'm the resident regexphile on this newsgroup, and I certainly don't say that. I find it frustrating that Pythonistas shy away from regex as much as they do. Yes, Python strings have a rich set of built-in operations which provide easy ways to do a lot of things traditionally done with regexes in other languages. Regex is a powerful tool, and programmers will improve their skill set by knowing how to use them. But that's not the same as saying you can't be a programmer if you don't know regex. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article l8snr8$snu$1...@reader1.panix.com, Grant Edwards invalid@invalid.invalid wrote: On 2013-12-18, Chris Angelico ros...@gmail.com wrote: Well, okay. In C you can't have Foo.foo(). If Foo is a structure with a field named foo that is a pointer to a function, then you can indeed have Foo.foo(). Sigh. This has gone off in a direction I never intended. What I meant was that in C++, when you call a method by name, it can sometimes be difficult to know exactly what method is being called. Between inheritance, optional parameters, automatic type promotion, default constructors, and maybe a few other things I've forgotten, even if you've got all the signatures of foo() in front of you, it can sometimes be hard to figure out which one the compiler will pick. And that sort of confusion never happens in C. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 19/12/2013 01:49, Roy Smith wrote: In article mailman.4372.1387390692.18130.python-l...@python.org, Mark Lawrence breamore...@yahoo.co.uk wrote: I've never contemplated writing a compiler, let alone actually written one. It's like the comments along the lines of you can't call yourself a programmer until you've mastered regular expressions. Who makes comments like that? As far as I can tell, I'm the resident regexphile on this newsgroup, and I certainly don't say that. I find it frustrating that Pythonistas shy away from regex as much as they do. Yes, Python strings have a rich set of built-in operations which provide easy ways to do a lot of things traditionally done with regexes in other languages. Regex is a powerful tool, and programmers will improve their skill set by knowing how to use them. But that's not the same as saying you can't be a programmer if you don't know regex. Idiots make comments like that, I've seen them in the past, and no, I can't remember where :) As for me I'm not a regexphobe, more a stringmethodphile. But I'm not going to use a dozen string methods when one fairly simple regex will do the same job in one hit. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
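Mark's point about one fairly simple regex doing in one hit what takes several string methods can be made concrete (the log line here is invented for illustration):

```python
import re

line = "2013-12-19 01:49:12 ERROR disk full on /var"

# String-method route: several separate steps
date = line.split()[0]
level = line.split()[2]
message = line.split(None, 3)[3]

# Regex route: one pattern captures all the fields in one hit
m = re.match(r"(\S+) (\S+) (\S+) (.*)", line)
assert (m.group(1), m.group(3), m.group(4)) == (date, level, message)
```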
Re: Experiences/guidance on teaching Python as a first programming language
On Thursday, December 19, 2013 7:10:53 AM UTC+5:30, Roy Smith wrote: Grant Edwards wrote: I've always felt that there are features in C that don't make a lot of sense until you've actually implemented a compiler -- at which point it becomes a lot more obvious why some thing are done certain ways. Example? I suspect what you mean is, There are some things that don't make sense until you understand computer architecture. One way of rephrasing what Grant is saying is: You cannot be a C programmer without being a system programmer. This certainly includes machine (hardware) architecture. But it includes much else besides, which can generally be subsumed under the rubric 'toolchain'. A Python programmer can write foo.py and run: $ python foo.py A C programmer writes foo.c and has to run the sequence: $ gcc foo.c $ ./a.out So far the difference looks minimal. However it does not stop here. Soon the foo has to split into foo1.c and foo2.c. And suddenly you need to understand: 1. Separate compilation 2. Make (which is separate from 'separate compilation') 3. Header files and libraries and the connection and difference. Now if you've taught a few classes you will know what a hell each of these is. In particular, every C teacher struggles with the misconception that stdio.h *is* the standard library. And all this has not yet touched the labyrinths of linker errors with the corresponding magic spells called ranlib, nm etc. Got past all this kid-stuff? Now for the Great Initiation into Manhood -- autoconf. So... Is all this core computer science? Or is it the curiosities of 40-year-old technology? -- https://mail.python.org/mailman/listinfo/python-list
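rusi's contrast can be seen by doing the same split in Python: a second source file needs no separate compilation, headers, or linker. A small sketch (the module names foo1/foo2 mirror his example):

```python
import os
import sys
import tempfile

# Split a program into two files; Python needs no make, headers, or linker.
d = tempfile.mkdtemp()
with open(os.path.join(d, "foo2.py"), "w") as fh:
    fh.write("def helper():\n    return 42\n")
with open(os.path.join(d, "foo1.py"), "w") as fh:
    fh.write("from foo2 import helper\nresult = helper()\n")

sys.path.insert(0, d)  # make the new directory importable
import foo1            # the whole 'toolchain' step is just this import

print(foo1.result)  # 42
```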
Re: Experiences/guidance on teaching Python as a first programming language
In article 07c6e6a3-c5f4-4846-9551-434bdaba8...@googlegroups.com, rusi rustompm...@gmail.com wrote: Soon the foo has to split into foo1.c and foo2.c. And suddenly you need to understand: 1. Separate compilation 2. Make (which is separate from 'separate compilation') 3. Header files and libraries and the connection and difference None of that is specific to C. Virtually any language (including Python) allows a program to be split up into multiple source files. If you're writing anything but the most trivial example, you need to know how to manage these multiple files and how the pieces interact. It's pretty common here to have people ask questions about how import works. How altering sys.path affects import. Why is import not finding my module? You quickly get into things like virtualenv, and now you've got modules coming from your source tree, from your virtualenv, from your system library. You need to understand all of that to make it all work. -- https://mail.python.org/mailman/listinfo/python-list
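For the "why is import not finding my module?" questions Roy mentions, the standard library can answer without side effects; a short sketch:

```python
import importlib.util
import sys

# sys.path plays the role of the C compiler's include/library search
# path: the ordered list of places 'import' looks.
print(sys.path[0])

# find_spec reports where a module WOULD come from, without importing it.
spec = importlib.util.find_spec("json")
print(spec.origin)  # the file that 'import json' would load

# A missing top-level module simply yields None.
assert importlib.util.find_spec("no_such_module_xyz") is None
```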
Re: Experiences/guidance on teaching Python as a first programming language
On Thu, Dec 19, 2013 at 3:16 PM, Roy Smith r...@panix.com wrote: It's pretty common here to have people ask questions about how import works. How altering sys.path effects import. Why is import not finding my module? You quickly get into things like virtualenv, and now you've got modules coming from your source tree, from your vitualenv, from your system library. You need to understand all of that to make it all work. Python might one day want to separate system paths from local paths, to give the same effect as: #include <stdio.h> #include "local_config.h" where the current directory won't be searched for stdio.h. But other than that, it's exactly the same consideration in either Python or C. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thursday, December 19, 2013 6:19:04 AM UTC+5:30, Rhodri James wrote: On Tue, 17 Dec 2013 15:51:44 -, Wolfgang Keller wrote: The only issue for me was to figure out how to do in C what I already knew in Pascal. And I had to waste a *lot* more time and mental effort to mess with that language than it took for me to learn *both* the basics of programming per se *and* Pascal in the first class at my home university. It's sounds like you made, and are carrying on making, one of the classic mistakes of software engineering: you're trying to write one language in the style of another. It is possible to write C code as if it were Pascal, but it's a painful process and it won't be pretty. It's far better to use a language as it is rather than as you want it to be. Yes but the reverse is also true: Sometimes the best code in language L is first conceptualized in design-language D and then 'coded' into L. When we were students D was called 'flow-charts'. Gone out of fashion today and replaced by UML. Now I expect the majority on this list to not care for UML. However the idea of a separate design language is not negated by the fact that UML is overkill and silly. E.g. I saw this (on the Erlang mailing list): In some Australian university (in the 90s) 2 sems of Cobol were replaced by 1 sem Scheme + 1 sem Cobol. Students learnt more Cobol in the second arrangement than in the first. [Note: 'More Cobol' not 'More Programming'] Now if you were to ask those *students* I would expect similar emotions towards Cobol as Wolfgang is expressing towards C. That is however a separate issue :D -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thursday, December 19, 2013 9:46:26 AM UTC+5:30, Roy Smith wrote: rusi wrote: Soon the foo has to split into foo1.c and foo2.c. And suddenly you need to understand: 1. Separate compilation 2. Make (which is separate from 'separate compilation') 3. Header files and libraries and the connection and difference It's pretty common here to have people ask questions about how import works. How altering sys.path effects import. Why is import not finding my module? You quickly get into things like virtualenv, and now you've got modules coming from your source tree, from your vitualenv, from your system library. You need to understand all of that to make it all work. Yes agreed. Python is far from stellar in this regard. Just as distutils got into the core at 2.3(??) now at 3.3 virtualenv(+pip+wheel) is getting in. Belated but better late than never. None of that is specific to C. Virtually any language (including Python) allows a program to be split up into multiple source files. If you're running all but the most trivial example, you need to know how to manage these multiple files and how the pieces interact. That's a strange thing to say. In the abstract every language that allows for significant programs supports separate units/modules. Somewhere those units will map onto system entities -- usually though not always files (think of PL/SQL inside Oracle). Even assuming files, the lines drawn between interior (to the language) and exterior (OS-facing) are vastly different. C, Pascal, Python, Java, SML, APL -- all very different in this regard. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 19/12/2013 04:29, rusi wrote: On Thursday, December 19, 2013 6:19:04 AM UTC+5:30, Rhodri James wrote: On Tue, 17 Dec 2013 15:51:44 -, Wolfgang Keller wrote: The only issue for me was to figure out how to do in C what I already knew in Pascal. And I had to waste a *lot* more time and mental effort to mess with that language than it took for me to learn *both* the basics of programming per se *and* Pascal in the first class at my home university. It's sounds like you made, and are carrying on making, one of the classic mistakes of software engineering: you're trying to write one language in the style of another. It is possible to write C code as if it were Pascal, but it's a painful process and it won't be pretty. It's far better to use a language as it is rather than as you want it to be. Yes but the reverse is also true: Sometimes the best code in language L is first conceptualized in design-language D and then 'coded' into L. When we were students D was called 'flow-charts' Gone out of fashion today and replaced by UML. Now I expect the majority on this list to not care for UML. However the idea of a separate design language is not negated by the fact that UML is overkill and silly. eg Saw this (on the Erlang mailing list) In some Australian university (in the 90s) 2 sems of Cobol was replaced by 1 sem Scheme + 1 sem Cobol. Students learnt more Cobol in the second arrangement than in the first. [Note: 'More Cobol' not 'More Programming'] Now if you were to ask those *students* I would expect similar emotions towards Cobol as Wolfgang is expressing towards C. That is however a separate issue :D If C is such a crap language, what does it say for the thousands of languages that never got anywhere? Or did C simply have a far larger sales and marketing budget? :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Thursday, December 19, 2013 10:20:54 AM UTC+5:30, Mark Lawrence wrote: On 19/12/2013 04:29, rusi wrote: On Thursday, December 19, 2013 6:19:04 AM UTC+5:30, Rhodri James wrote: On Tue, 17 Dec 2013 15:51:44 -, Wolfgang Keller wrote: The only issue for me was to figure out how to do in C what I already knew in Pascal. And I had to waste a *lot* more time and mental effort to mess with that language than it took for me to learn *both* the basics of programming per se *and* Pascal in the first class at my home university. It's sounds like you made, and are carrying on making, one of the classic mistakes of software engineering: you're trying to write one language in the style of another. It is possible to write C code as if it were Pascal, but it's a painful process and it won't be pretty. It's far better to use a language as it is rather than as you want it to be. Yes but the reverse is also true: Sometimes the best code in language L is first conceptualized in design-language D and then 'coded' into L. When we were students D was called 'flow-charts' Gone out of fashion today and replaced by UML. Now I expect the majority on this list to not care for UML. However the idea of a separate design language is not negated by the fact that UML is overkill and silly. eg Saw this (on the Erlang mailing list) In some Australian university (in the 90s) 2 sems of Cobol was replaced by 1 sem Scheme + 1 sem Cobol. Students learnt more Cobol in the second arrangement than in the first. [Note: 'More Cobol' not 'More Programming'] Now if you were to ask those *students* I would expect similar emotions towards Cobol as Wolfgang is expressing towards C. That is however a separate issue :D If C is such a crap language, what does it says for the thousands of languages that never got anywhere? Or did C simply have a far larger sales and marketing budget? :) Are you addressing that to me? 
[Assuming you are a good boy who does not use GG-crap and knows the laws of snipping and attributing I am taking it so :D ] No, I am not in the 'C-is-crap' camp. Very far into the opposite actually. What would you say to someone who says: - food is crap to eat - air is crap to breathe Saying "C is crap" is analogous. If you are using Python it's likely CPython. What's the C there? If you are connected to the net the modem likely runs a Linux. Coded in? "I am a Luddite -- I don't touch computers." Right. The car I drive probably has embedded chips... Embedded Linux. No, Amish/Luddite is not enough to say No! to C. You'd have to be completely isolated from every connection with modern civilization. So Python programmers employ the 'black-cat' squad of GvR and gang to shield us from C. Because they are good at it we can afford to ignore it. No, no: C is not optional. The only option is at how many removes we keep away from it. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 19/12/2013 05:09, rusi wrote: On Thursday, December 19, 2013 10:20:54 AM UTC+5:30, Mark Lawrence wrote: On 19/12/2013 04:29, rusi wrote: On Thursday, December 19, 2013 6:19:04 AM UTC+5:30, Rhodri James wrote: On Tue, 17 Dec 2013 15:51:44 -, Wolfgang Keller wrote: The only issue for me was to figure out how to do in C what I already knew in Pascal. And I had to waste a *lot* more time and mental effort to mess with that language than it took for me to learn *both* the basics of programming per se *and* Pascal in the first class at my home university. It's sounds like you made, and are carrying on making, one of the classic mistakes of software engineering: you're trying to write one language in the style of another. It is possible to write C code as if it were Pascal, but it's a painful process and it won't be pretty. It's far better to use a language as it is rather than as you want it to be. Yes but the reverse is also true: Sometimes the best code in language L is first conceptualized in design-language D and then 'coded' into L. When we were students D was called 'flow-charts' Gone out of fashion today and replaced by UML. Now I expect the majority on this list to not care for UML. However the idea of a separate design language is not negated by the fact that UML is overkill and silly. eg Saw this (on the Erlang mailing list) In some Australian university (in the 90s) 2 sems of Cobol was replaced by 1 sem Scheme + 1 sem Cobol. Students learnt more Cobol in the second arrangement than in the first. [Note: 'More Cobol' not 'More Programming'] Now if you were to ask those *students* I would expect similar emotions towards Cobol as Wolfgang is expressing towards C. That is however a separate issue :D If C is such a crap language, what does it says for the thousands of languages that never got anywhere? Or did C simply have a far larger sales and marketing budget? :) Are you addressing that to me? No, I never address individuals. 
As far as I'm concerned I'm sending to an entire newsgroup/mailing list. [Assuming you are a good boy who does not use GG-crap and knows the laws of snipping and attributing I am taking it so :D ] Please cut the sarcastic crap. No I am not in the 'C-is-crap' camp. Very far into the opposite actually. What would you say to someone who says: - food is crap to eat - air is crap to breathe C is crap technology is analogous. If you are using python its likely CPython. Whats the C there? If you are connected to the net the modem likely runs a linux. Coded in? I am an Luddite -- dont touch computers. Right. The car I drive probably has embedded chips... Embeded linux. No Amish/Luddite is not enough to say No! to C You'd have to be completely isolated from every connection with modern civilization. So python programmers employ the 'black-cat' squad of GvR and gang to shield us from C. Because they are good at it we can afford to ignore it. No, No C is no option. The only option is at how many removes we keep away from it. I've no idea what most of the above is meant to mean. Have you been reading too much RR or Joseph McCarthy? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
Roy Smith wrote: even if you've got all the signatures of foo() in front of you, it can sometimes be hard to figure out which one the compiler will pick. And conversely, sometimes the compiler will have a hard time figuring out which one you want it to pick! I had an experience in Java recently where a library author had provided two overloads of a function, that at first sight could be disambiguated by argument types. But for a certain combination of types it was ambiguous, and I was unlucky enough to want to use that particular combination, and the compiler insisted on picking the wrong one. As far as I could see, it was *impossible* to call the other overload with those parameter types. I ended up resorting to copying the whole function and giving it another name, just so I could get it called. -- Function overloading: Just say no. Greg -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article bhff4qf21f...@mid.individual.net, Gregory Ewing greg.ew...@canterbury.ac.nz wrote: Roy Smith wrote: even if you've got all the signatures of foo() in front of you, it can sometimes be hard to figure out which one the compiler will pick. And conversely, sometimes the compiler will have a hard time figuring out which one you want it to pick! I had an experience in Java recently where a library author had provided two overloads of a function, that at first sight could be disambiguated by argument types. But for a certain combination of types it was ambiguous, and I was unlucky enough to want to use that particular combination, and the compiler insisted on picking the wrong one. As far as I could see, it was *impossible* to call the other overload with those parameter types. I ended up resorting to copying the whole function and giving it another name, just so I could get it called. BTDT. We were doing a huge network management application. There was an SNMP_Manager class, which had three or four different constructors, each one taking a dozen or more arguments, many of them optional. I finally got fed up with eternally trying to figure out which constructor was being called and replaced them with a series of factory functions: construct_for_traps(), construct_for_polling(), etc. -- https://mail.python.org/mailman/listinfo/python-list
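Roy's factory-function refactor translates naturally to Python classmethods; a sketch with hypothetical names and parameters reconstructed from his description:

```python
class SNMPManager:
    """Sketch of the refactor Roy describes: named factory functions
    instead of several ambiguous overloaded constructors.
    All names and parameters here are hypothetical."""

    def __init__(self, host, community="public", port=162, interval=None):
        self.host = host
        self.community = community
        self.port = port
        self.interval = interval

    @classmethod
    def for_traps(cls, host, port=162):
        # Configuration for receiving traps
        return cls(host, port=port)

    @classmethod
    def for_polling(cls, host, interval=60):
        # Configuration for periodic polling
        return cls(host, interval=interval)

# The call site now says which construction path it wants by name
mgr = SNMPManager.for_polling("10.0.0.1", interval=30)
assert mgr.interval == 30
```

The named factories trade one overloaded entry point for several self-describing ones, which is exactly the readability win Roy reports.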
Re: Experiences/guidance on teaching Python as a first programming language
On Thursday, December 19, 2013 at 12:16:26 PM UTC+8, Roy Smith wrote: In article 07c6e6a3-c5f4-4846-9551-434bdaba8...@googlegroups.com, rusi rustompm...@gmail.com wrote: Soon the foo has to split into foo1.c and foo2.c. And suddenly you need to understand: 1. Separate compilation 2. Make (which is separate from 'separate compilation') 3. Header files and libraries and the connection and difference None of that is specific to C. Virtually any language (including Python) allows a program to be split up into multiple source files. If you're running all but the most trivial example, you need to know how to manage these multiple files and how the pieces interact. It's pretty common here to have people ask questions about how import works. How altering sys.path effects import. Why is import not finding my module? You quickly get into things like virtualenv, and now you've got modules coming from your source tree, from your vitualenv, from your system library. You need to understand all of that to make it all work. OK, any novice can take the Boa and wxPython packages and implement an editor in 1 to 3 hours, but that was trivial in Delphi and Object Pascal a long time ago. A GUI-to-Python-script generation engine is the smarter way to get the masses interested in programming. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
Roy Smith wrote: I suspect what you mean is, There are some things that don't make sense until you understand computer architecture. An example of that kind of thing from a different perspective: I learned Z80 assembly language by first learning Z80 *machine* language (my homebrew computer didn't have an assembler, so I had to write my own and hand assemble it (after writing my own DOS (after building my own disk controller... etc!))). If you just look at the Z80 architecture at the assembly language level, the rules for which instructions go with which registers and which addressing modes seem very haphazard. But because I knew how the instructions were encoded, I never had any trouble remembering the allowable combinations. If it didn't have an encoding, you couldn't do it! -- Greg -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
Dave Angel wrote: C is a glorified macro assembler. So the -> operator is not analogous to the dot operator, it's syntactic sugar: p->a is really (*p).a. But it's not above inferring a dereferencing operation when you call a function via a pointer. If f is a pointer to a function, then f(a) is equivalent to (*f)(a). If the compiler can do that for function calls, there's no reason it couldn't do it for member access as well. If I remember rightly, Ada not only does implicit dereferencing like this, it doesn't even have an explicit dereferencing operator! If you want to refer to the whole record pointed to by p, you have to say 'p.all'. BTW, the whole notion of a pointer to a function is redundant in C, since you can't do anything with what it points to other than call it. The equivalent concept in Modula, for example, is just called a function type, not a pointer-to-function type. Similarly in most languages that have functions as values. -- Greg -- https://mail.python.org/mailman/listinfo/python-list
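Greg's closing remark -- "languages that have functions as values" -- describes Python exactly: a function is a plain value, with no pointer-to-function type and no dereference operator:

```python
def square(x):
    return x * x

f = square         # bind the function itself; no & or * needed
assert f(3) == 9   # calling through the name needs no (*f)(...)

# Function values compose like any other value
def apply_twice(g, x):
    return g(g(x))

assert apply_twice(square, 2) == 16
```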
Re: Experiences/guidance on teaching Python as a first programming language
On Monday 16 December 2013 20:30:47 Mark Lawrence did opine: On 17/12/2013 01:06, Roy Smith wrote: In article b64dd8cc-a7b5-47d2-9fe6-d2bd6e432...@googlegroups.com, Rick Johnson rantingrickjohn...@gmail.com wrote: Dovetails are nothing more than sadistic nostalgia -- they give old men a chubby and young men a nightmare. There is nothing more satisfying than cutting a set of dovetails by hand and having them glide together like silk, the first time you test-fit them, with no daylight visible anywhere. Someday, mine will be like that :-) I suspect that your manual skills are rather better than mine. One of my favourite expressions, perhaps because I only ever heard my dad use it, is like watching a cow handle a shotgun. I'll plead to using a jig, and figure I have a good fit when I have to drive it together with a deadblow hammer. Cheers, Gene -- There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order. -Ed Howdershelt (Author) Genes Web page http://geneslinuxbox.net:6309/gene IBM's original motto: Cogito ergo vendo; vendo ergo sum. A pen in the hand of this president is far more dangerous than 200 million guns in the hands of law-abiding citizens. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 17 December 2013 00:39, rusi rustompm...@gmail.com wrote: On Tuesday, December 17, 2013 5:58:12 AM UTC+5:30, Ned Batchelder wrote: On 12/16/13 3:32 PM, Wolfgang Keller wrote: And ever after that experience, I avoided all languages that were even remotely similar to C, such as C++, Java, C#, Javascript, PHP etc. Thanks for sharing your experiences Wolfgang. I think many of my students have a similar experience after learning C and it is interesting to hear it from your perspective many years later. I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one student offering to fix/finish anyone's assignment in exchange for a bottle of cider!). I think that's disappointing, for two reasons. Firstly, C syntax isn't that terrible. It's not just the abysmally appalling, hideously horrifying syntax. At about everything about C is just *not* made for human beings imho. I've never heard C syntax reviled quite so intensely. What syntax do you like, out of curiosity? I had a paper some years ago on why C is a horrible language *to teach with* http://www.the-magus.in/Publications/chor.pdf Thanks for this Rusi, I just read it and it describes very well what I think about our own C course. My choice quote from the beginning would be When the irrelevant becomes significant, the essentials become obscured and incomprehensible. (BTW is there any reason that the document is repeated twice in the same pdf?) As a case in point one of my tutees asked for help with his C assignment last week. I looked at his code and it was a complete mess.
I explained roughly what it should look like and he explained that he had had so much trouble figuring out how to get the compiler to pass a pair of strings into a function that he had given up and used global variables instead. He's just not ready yet to get an intuitive understanding of where to put the asterisks in order to make it work - and as you point out in that paper the rules for where the asterisks go are hopelessly inconsistent. A couple of weeks before, another of my tutees brought their assignment which was about dynamic memory allocation (~7 weeks into her first programming course). She had just written something like char *x = (char*)malloc(31*sizeof(char)); for a global x at the top of the file. So the message about dynamic memory allocation was entirely lost in the details of C: dynamic memory allocation means using malloc. These types of problems are compounded by the fact that the current C course uses automated marking so a program that produces the correct output gets full marks even if it is terribly written and the student entirely misses the point - another thing about this course that definitely needs to change. I believe people did not get then (and still don't) that what is needed for - beginner education (CS101) - intermediate -- compilers, OS, DBMS etc - professional software engineering are all almost completely unrelated Agreed. Oscar -- https://mail.python.org/mailman/listinfo/python-list
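For contrast with the pair-of-strings struggle Oscar describes, the same task in Python involves no asterisks or manual allocation at all:

```python
def describe(first, second):
    # Two strings arrive as ordinary values: no pointers, no malloc
    return "%s and %s" % (first, second)

print(describe("spam", "eggs"))  # spam and eggs
```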
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, 17 Dec 2013 11:12:07 +, Oscar Benjamin wrote: These types of problems are compounded by the fact that the current C course uses automated marking so a program that produces the correct output gets full marks even if it is terribly written and the student entirely misses the point This suggests that even the lecturers can't read C, and so have got one of their post-grad students to write an automated tester so they don't have to. Only-half-joking-ly y'rs, -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, Dec 17, 2013 at 10:12 PM, Oscar Benjamin oscar.j.benja...@gmail.com wrote: I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one student offering to fix/finish anyone's assignment in exchange for a bottle of cider!).

Student cheats on assignment and gets, in effect, a fraudulent certification. (Piece of paper claims competence, competence doesn't exist.)
Graduating student shows certification to employer.
Employer hires ex-student, because employer doesn't know good code from bad (hence hiring someone).
Ex-student writes a pile of junk, then leaves for a better opportunity.
Real programmer is hired, or seconded from another project, to fix a few small issues in ex-student's code.
Lunatic asylum gains another patient.

It's all too common. I'd like to tell people that they're only cheating themselves, but the trouble is, they're cheating other people a lot more.

ChrisA
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
Rick Johnson rantingrickjohn...@gmail.com wrote: Dovetails are nothing more than sadistic nostalgia -- they give old men a chubby and young men a nightmare. There is nothing more satisfying than cutting a set of dovetails by hand and having them glide together like silk, the first time you test-fit them, with no daylight visible anywhere. This dove-tailer understands Rapid Application Development http://woodwork.ars-informatica.ca/tool.php?art=dovetail_video Frank Klausz's three-minute dovetails using a bow saw -- Stanley C. Kitching Human Being Phoenix, Arizona --- This email is free from viruses and malware because avast! Antivirus protection is active. http://www.avast.com -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-17, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than the equivalent written in C. I can't think of a reference, but I seem to recall that bugs-per-line-of-code is nearly constant; it is not language dependent. So, unscientifically, the more work you can get done in a line of code, the fewer bugs you'll have per amount of work done. -- Neil Cerutti -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article mailman.4286.1387291924.18130.python-l...@python.org, Neil Cerutti ne...@norwich.edu wrote: On 2013-12-17, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than the equivalent written in C. Well, there was that little Y2K thing... -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 17/12/2013 14:54, Roy Smith wrote: In article mailman.4286.1387291924.18130.python-l...@python.org, Neil Cerutti ne...@norwich.edu wrote: On 2013-12-17, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than the equivalent written in C. Well, there was that little Y2K thing... A design assumption made 30 to 40 years earlier: "It'll only have to be in the field for six months" :) -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, 17 Dec 2013 09:54:41 -0500, Roy Smith wrote: In article mailman.4286.1387291924.18130.python-l...@python.org, Neil Cerutti ne...@norwich.edu wrote: On 2013-12-17, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has few bugs per unit-of-useful-work-done than the equivalent written in C. Well, there was that little Y2K thing... Oh come on, how were people in the 1990s supposed to predict that they would be followed by the year 2000??? That's a good point, but that wasn't a language issue, it was a program design issue. Back in the 70s and 80s, when saving two digits per date field seemed to be a sensible thing to do, people simply didn't imagine that their programs would still be used in the year 1999[1]. That's not the same sort of bug as (say) C buffer overflows, or SQL code injection attacks. It's not like the COBOL language defined dates as having only two digits. [1] What gets me is that even in the year 1999, there were still programmers writing code that assumed two-digit years. I have it on good authority from somebody working as an external consultant for a bank in 1999 that he spent most of 1998 and 1999 fixing *brand new code* written by the bank's own staff. You'd think that having lived through that experience would have shaken his belief that private enterprise does everything better, and the bigger the corporation the better they do it, but apparently not. Go figure. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
It's not just the abysmally appalling, hideously horrifying syntax. Just about everything about C is just *not* made for human beings imho.

I've never heard C syntax reviled quite so intensely. What syntax do you like, out of curiosity?

Pascal, and Python if written by someone who uses semantic identifiers and avoids C(++)/Java-isms. I've seen Eiffel as well (without understanding it) and it didn't look ridiculous to me. In short: syntax that contains the strict minimum of special characters (delimiting lists etc. with brackets is ok to me) and almost exclusively human-readable words. Although, if you push that to the extreme, AppleScript is nice to read but much less nice to write imho... :-/

C, C++, Java, Javascript, PHP, Perl etc., however, are just unspeakable expletives.

rant BTW: yes, I do *hate* those C(++)-isms (or Java-isms) that have started to sneak into Python in the past ~10 years. Using e.g. == for comparisons is just braindead. Use := for assignments instead, because that's mathematical syntax. And that @ for decorators is, well, who proposed it? I'd like to cut off all his fingers with a bolt cutter. The same for people who use augmented assignments, syntax shortcuts or abbrvtd idtfrs. Ship them all to Fukushima, one way, no return ticket. Learn to touch-type, get an editor with decent syntax completion, or just stop wreaking havoc on the world economy with your laziness. Code is read a hundred times more often than it is typed. /rant

Sincerely, Wolfgang
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 17/12/2013 15:24, Steven D'Aprano wrote: On Tue, 17 Dec 2013 09:54:41 -0500, Roy Smith wrote: In article mailman.4286.1387291924.18130.python-l...@python.org, Neil Cerutti ne...@norwich.edu wrote: On 2013-12-17, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has few bugs per unit-of-useful-work-done than the equivalent written in C. Well, there was that little Y2K thing... Oh come on, how were people in the 1990s supposed to predict that they would be followed by the year 2000??? That's a good point, but that wasn't a language issue, it was a program design issue. Back in the 70s and 80s, when saving two digits per date field seemed to be a sensible thing to do, people simply didn't imagine that their programs would still be used in the year 1999[1]. That's not the same sort of bug as (say) C buffer overflows, or SQL code injection attacks. It's not like the COBOL language defined dates as having only two digits. [1] What gets me is that even in the year 1999, there were still programmers writing code that assumed two-digit years. I have it on good authority from somebody working as an external consultant for a bank in 1999 that he spent most of 1998 and 1999 fixing *brand new code* written by the bank's own staff. You'd think that having lived through that experience would have shaken his belief that private enterprise does everything better, and the bigger the corporation the better they do it, but apparently not. Go figure. I was in charge of the team at work that had to make all code Y2K compliant. I discovered the one bug that to my knowledge slipped through the net. Four years later back at the same place on contract I fixed the fix!!! -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. 
Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one student offering to fix/finish anyone's assignment in exchange for a bottle of cider!).

The problem with the C class wasn't that it was hard. I had passed my Pascal class, which taught nearly exactly the same issues, with straight As before (without ever having written any source code before). And by standard cognitive testing standards, I'm not exactly considered to be an idiot. The only issue for me was to figure out how to do in C what I already knew in Pascal. And I had to waste a *lot* more time and mental effort to mess with that language than it took for me to learn *both* the basics of programming per se *and* Pascal in the first class at my home university.

C is just a Kafkaesque mess invented by a sadistic pervert who must have regularly consumed illegal substances for breakfast. Its only reason to exist seems to be that apparently it's ridiculously easy to implement a compiler for it. Although, as a professional developer once told me, most C compilers are garbage.

One student in the C class (who had been doing software development for years before he came to university) jokingly passed around samples of valid C code by email. Most of them looked like uuencoded binaries (this was in the early-to-mid 90s), but they all compiled and produced an output. Except that *no one* (including professors) was able to predict the output without actually running the compiled code.

In the classroom lectures parallel to the C exercises, the professor spent most of his time explaining what *not* to do because... Heck, why does a language provide features, or allow their use, in ways that are known to be bottomless cans of worms?
These types of problems are compounded by the fact that the current C course uses automated marking so a program that produces the correct output gets full marks even if it is terribly written and the student entirely misses the point - another thing about this course that definitely needs to change.

In our classes, when a program was correct but you used even *one* single non-semantic identifier (such as an i for a loop index), you *automatically* got zero points for that exercise. Other absolutely mandatory requirements were about minimum commenting etc. Fewer comment lines than code lines was very likely to yield zero points for that exercise as well.

Sincerely, Wolfgang
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, Dec 17, 2013 at 10:35 AM, Mark Lawrence breamore...@yahoo.co.uk wrote: I was in charge of the team at work that had to make all code Y2K compliant. I discovered the one bug that to my knowledge slipped through the net. Four years later back at the same place on contract I fixed the fix!!! From around 1997 till 2000 all I did was fix Y2K bugs. I'm pretty sure I got them all. For one client I fixed well over 200. After the new year came and nothing broke, the owner of the company said You made such a big deal about this Y2K stuff, and it turned out not to be a problem at all. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tuesday, December 17, 2013 9:51:07 PM UTC+5:30, larry@gmail.com wrote: On Tue, Dec 17, 2013 at 10:35 AM, Mark Lawrence wrote: I was in charge of the team at work that had to make all code Y2K compliant. I discovered the one bug that to my knowledge slipped through the net. Four years later back at the same place on contract I fixed the fix!!! From around 1997 till 2000 all I did was fix Y2K bugs. I'm pretty sure I got them all. For one client I fixed well over 200. After the new year came and nothing broke, the owner of the company said "You made such a big deal about this Y2K stuff, and it turned out not to be a problem at all."

Hahaha -- very funny, and serious. I've actually experienced being kicked out of a job for writing decent working code and not making a big deal of it. Which brings us back to the start of the thread -- what do we teach students? Should we teach how to write the best possible code, as effortlessly as possible? Or should we also teach how to make a fuss, how to pretend to (over)work while actually (under)delivering? In a Utopia this would not be a question at all. But we don't live in Utopia... [And there are languages WAY better than C... C++ for example]
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-17, Wolfgang Keller felip...@gmx.net wrote: I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one student offering to fix/finish anyone's assignment in exchange for a bottle of cider!). The problem with the C class wasn't that it was hard. I had passed my Pascal class, which taught nearly exactly the same issues with straight As before (without ever having written any source code ever before). And by standard cognitive testing standards, I'm not exactly considered to be an idiot.

I agree that C is an awful pedagogical language. When I was in university, the first language for Computer Science or Computer Engineering students was Pascal. After that, there were classes that surveyed Prolog, SNOBOL, LISP, Modula, APL, FORTRAN, COBOL, etc. If you were any other engineering/science major, you learned FORTRAN first (and last). I think there may also have been some business types who were taught BASIC. C wasn't taught at all.

When I graduated and started doing real-time embedded firmware, the choices were generally C or Pascal. The first projects I did were in Pascal, but I learned C because the development host was a PDP-11 running Unix and I needed to write some small (non-embedded) utilities. Today, all my embedded work is in C. Pascal fell out of style for some reason, but (with a few extensions) it was a fine language for embedded work as well.

I've always thought C was a great language for low-level, bare-metal, embedded stuff -- but teaching it to first or second year computer science students is just insane. C has a certain minimalist orthogonality that I have always found pleasing. [People who smile wistfully when they think about the PDP-11 instruction word layouts probably know what I mean.]
But, exposure to C should wait until you have a firm grasp of basic algorithms and data structures and are proficient in assembly language for a couple different architectures. Ideally, you should also have written at least one functioning compiler before learning C as well. -- Grant Edwards   grant.b.edwards at gmail.com   Yow! Maybe we could paint GOLDIE HAWN a rich PRUSSIAN BLUE -- -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 17 December 2013 15:51, Wolfgang Keller felip...@gmx.net wrote: I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one student offering to fix/finish anyone's assignment in exchange for a bottle of cider!). The problem with the C class wasn't that it was hard. I had passed my Pascal class, which taught nearly exactly the same issues with straight As before (without ever having written any source code ever before). And by standard cognitive testing standards, I'm not exactly considered to be an idiot.

Please don't misunderstand me: I'm certainly not saying that you're an idiot. Also, I'm sure many of the students on my course would have fared better on a course that was using e.g. Python instead of C.

Well, actually, come to think of it, some of the other students were pretty stupid. The lecturer had explained that they were using a plagiarism detector, so if you copy-pasted code from someone else they could catch you out for cheating. A few people took that literally and thought that it could detect copy-pasting (in plain text files!). The rumour went round that it would be okay if you printed out the code and then typed it back in. For some reason they didn't bother running the plagiarism detector until about 6 weeks into the course, by which time ~20% of submissions were exact duplicates of at least one other (according to the lecturer, who announced that all such students would get zero marks for those assignments).

Oscar
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, Dec 17, 2013 at 11:59 AM, Grant Edwards invalid@invalid.invalid wrote: On 2013-12-17, Wolfgang Keller felip...@gmx.net wrote: I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one student offering to fix/finish anyone's assignment in exchange for a bottle of cider!).

I did that, but my fee was a case of beer.

The problem with the C class wasn't that it was hard. I had passed my Pascal class, which taught nearly exactly the same issues with straight As before (without ever having written any source code ever before). And by standard cognitive testing standards, I'm not exactly considered to be an idiot. I agree that C is an awful pedagogical language. When I was in university, the first language for Computer Science or Computer Engineering students was Pascal. After that, there were classes that surveyed Prolog, SNOBOL, LISP, Modula, APL, FORTRAN, COBOL, etc. If you were any other engineering/science major, you learned FORTRAN first (and last). I think there may also have been some business types who were taught BASIC. C wasn't taught at all.

It wasn't for me either when I went to college in the late 1970s. Pascal first, then FORTRAN, then IBM 360 assembler. That was all the formal language training I had. (I had taught myself BASIC in high school.)
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 17/12/2013 17:18, Larry Martell wrote: On Tue, Dec 17, 2013 at 11:59 AM, Grant Edwards invalid@invalid.invalid wrote: On 2013-12-17, Wolfgang Keller felip...@gmx.net wrote: I was also taught C as an undergrad but having already learned Java, C and C++ before arriving at University I found the C course very easy so my own experience is not representative. Many of the other students at that time found the course too hard and just cheated on all the assignments (I remember one students offering to fix/finish anyone's assignment in exchange for a bottle of cider!). I did that, but my fee was a case of beer. Pay bottle get monkey? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 17/12/2013 16:59, Grant Edwards wrote: I've always thought C was a great language for low-level, bare-metal, embedded stuff -- but teaching it to first or second year computer science students is just insane. C has a certain minimalist orthogonality that I have always found pleasing. [People who smile wistfully when they think about the PDP-11 instruction word layouts probably know what I mean.] I agree with you here, but wasn't there a tie-in between C and the rise of Unix via universities, or am I barking in the wrong forest? But, exposure to C should wait until you have a firm grasp of basic algorithms and data structures and are proficient in assembly language for a couple different architectures. Ideally, you should also have written at least one functioning compiler before learning C as well. I never had a problem with C as I'd written assembler for RCA 1802, Ferranti F110L and DEC/VAX, plus CORAL 66. Hum, a bit of a fib there, I recall vainly struggling with a C for loop before I finally realised I'd effectively written a CORAL 66 one, page 50 here http://www.xgc.com/manuals/pdf/xgc-c66-rm.pdf for (ouch!!!) anyone who's interested. Using a Whitesmith's pre-ANSI C compiler didn't exactly help me either. IIRC printf was spelt format and all the formatting codes were different to what became standard C. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tuesday, December 17, 2013 8:21:39 PM UTC+5:30, Neil Cerutti wrote: On 2013-12-17, Steven D'Aprano wrote: I would really like to see good quality statistics about bugs per program written in different languages. I expect that, for all we like to make fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than the equivalent written in C. I can't think of a reference, but I seem to recall that bugs-per-line-of-code is nearly constant; it is not language dependent. So, unscientifically, the more work you can get done in a line of code, the fewer bugs you'll have per amount of work done. Enter the (One-Liner) Dragon! http://www.youtube.com/watch?v=a9xAKttWgP4 -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 5:03 AM, rusi rustompm...@gmail.com wrote: On Tuesday, December 17, 2013 8:21:39 PM UTC+5:30, Neil Cerutti wrote: I can't think of a reference, but I to recall that bugs-per-line-of-code is nearly constant; it is not language dependent. So, unscientifically, the more work you can get done in a line of code, then the fewer bugs you'll have per amount of work done. Enter the (One-Liner) Dragon! http://www.youtube.com/watch?v=a9xAKttWgP4 Some languages work differently with lines, cramming more onto a single line while still having more code. What's nearly constant is bugs per amount of code, except that it's practically impossible to measure how much code you've produced. So there are a few exceptions to the lines of code metric, a few languages that jump around a bit on the scale. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tuesday 17 December 2013 12:23:28 Cousin Stanley did opine: Rick Johnson rantingrickjohn...@gmail.com wrote: Dovetails are nothing more than sadistic nostalgia -- they give old men a chubby and young men a nightmare. There is nothing more satisfying than cutting a set of dovetails by hand and having them glide together like silk, the first time you test-fit them, with no daylight visible anywhere. This dove-tailer understands Rapid Application Development http://woodwork.ars-informatica.ca/tool.php?art=dovetail_video Frank Klausz's three-minute dovetails using a bow saw Frank is a Master, and too many people never really learn to use a bow saw. However I'd expect that joint would take more glue to fill, I've had a single drop of TB-III extrude from every edge of one of my jig made joints. But, I really think we are just a tad off topic. Cheers, Gene -- There are four boxes to be used in defense of liberty: soap, ballot, jury, and ammo. Please use in that order. -Ed Howdershelt (Author) Genes Web page http://geneslinuxbox.net:6309/gene UNIX is hot. It's more than hot. It's steaming. It's quicksilver lightning with a laserbeam kicker. -- Michael Jay Tucker A pen in the hand of this president is far more dangerous than 200 million guns in the hands of law-abiding citizens. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 2013-12-17, Mark Lawrence breamore...@yahoo.co.uk wrote: On 17/12/2013 16:59, Grant Edwards wrote: I've always thought C was a great language for low-level, bare-metal, embedded stuff -- but teaching it to first or second year computer science students is just insane. C has a certain minimalist orthogonality that I have always found pleasing. [People who smile wistfully when they think about the PDP-11 instruction word layouts probably know what I mean.] I agree with you here, but wasn't there a tie-in between C and the rise of Unix via universities, or am I barking in the wrong forest? Yes, I think the popularity of Unix on university campuses is what caused the migration from Pascal to C for freshman programming classes. IIRC, there were decent Pascal compilers for Unix back then, so I still think it was a big mistake. Later on when studying low level OS stuff would have been a fine time to introduce C if required for logistical reasons. -- Grant Edwards   grant.b.edwards at gmail.com   Yow! Vote for ME -- I'm well-tapered, half-cocked, ill-conceived and TAX-DEFERRED! -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, Dec 17, 2013 at 1:20 PM, Chris Angelico ros...@gmail.com wrote: On Wed, Dec 18, 2013 at 5:03 AM, rusi rustompm...@gmail.com wrote: On Tuesday, December 17, 2013 8:21:39 PM UTC+5:30, Neil Cerutti wrote: I can't think of a reference, but I seem to recall that bugs-per-line-of-code is nearly constant; it is not language dependent. So, unscientifically, the more work you can get done in a line of code, the fewer bugs you'll have per amount of work done.

If it's true that bugs per line of code is more or less a constant, I think the key is that some languages are more expressive than others. So, in assembler, you are moving data around registers, doing basic math, etc. It takes a lot of code to get something done, so maybe more bugs. Moving up the ladder to C, which is in a way high-level assembly language, you get more done in fewer lines. Python and other languages maybe do more per line than C (e.g. the for loop in Python does a lot with very little code because of Python having iterable stuff built in).

So, if you have a language that is expressive and fits your programming needs, you will have less to debug -- not because you don't make as many errors, but because the good code just does more for you.
-- Joel Goldstick http://joelgoldstick.com
-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article 20131217165144.39bf9ba1cd4e4f27a9689...@gmx.net, Wolfgang Keller felip...@gmx.net wrote: C is just a kafkaesque mess invented by a sadistic pervert who must have regularly consumed illegal substances for breakfast. Don't be absurd. C is a perfectly good language for the kinds of things it's meant for. It lets you get down close to the hardware while still using rational flow control and program structure, and being reasonably portable. There's very few mysteries in C. You never have to wonder what the lifetime of an object is, or be mystified by which of the 7 signatures of Foo.foo() are going to get called, or just what operation x + y is actually going to perform. If you maim yourself with a razor-sharp chisel, do you blame the chisel for being a bad tool? -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
In article l8pvsl$60h$1...@reader1.panix.com, Grant Edwards invalid@invalid.invalid wrote: Ideally, you should also have written at least one functioning compiler before learning C as well. Why? I've never written a compiler. I've written plenty of C. I don't see how my lack of compiler writing experience has hindered my ability to write C. -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, 17 Dec 2013 19:32:20 -0500, Roy Smith wrote: There's very few mysteries in C. Apart from "What the hell does this piece of code actually do?". It's no coincidence that C, and Perl which borrows a lot of syntax from C, are the two champion languages for writing obfuscated code. And "What does 'implementation-specific undefined behaviour' actually mean in practice?", another common question when dealing with C. And most importantly, "How many asterisks do I need, and where do I put them?" (only half joking). You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. "The lifetime of objects in some language with no objects is ONE MILLION YEARS!!!" is as good as any other vacuously true statement. or be mystified by which of the 7 signatures of Foo.foo() are going to get called, Is that even possible in C? If Foo is a struct, and Foo.foo a member, I don't think C has first-class functions and so Foo.foo can't be callable. But if I'm wrong, and it is callable, then surely with no arguments there can only be one signature that Foo.foo() might call, even if C supported generic functions, which I don't believe it does. (You can simulate something rather like generic functions using pointers, but that's it.) or just what operation x + y is actually going to perform. With no operator overloading, that one at least is correct. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 12:33 PM, Steven D'Aprano st...@pearwood.info wrote: On Tue, 17 Dec 2013 19:32:20 -0500, Roy Smith wrote: There's very few mysteries in C. Apart from "What the hell does this piece of code actually do?". It's no coincidence that C, and Perl which borrows a lot of syntax from C, are the two champion languages for writing obfuscated code. I thought APL would beat both of them, though you're right that the International Obfuscated Python Code Contest would be a quite different beast. But maybe it'd be just as viable... a competent programmer can write unreadable code in any language. And "What does 'implementation-specific undefined behaviour' actually mean in practice?", another common question when dealing with C. You mean like mutating locals()? The only difference is that there are a lot more implementations of C than there are of Python (especially popular and well-used implementations). There are plenty of things you shouldn't do in Python, but instead of calling them implementation-specific undefined behaviour, we call them consenting adults and shooting yourself in the foot. And most importantly, "How many asterisks do I need, and where do I put them?" (only half joking). The one differentiation that I don't like is between the . and -> operators. The distinction feels like syntactic salt. There's no context when both are valid, save in C++ where you can create a pointer-like object that implements the -> operator (and has the . operator for its own members). You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. The lifetime of objects in some language with no objects is ONE MILLION YEARS!!! is as good as any other vacuously true statement. Lifetime still matters. The difference between automatic and static variables is lifetime - you come back into this function and the same value is there waiting for you. 
Call it values or things instead of objects if it makes you feel better, but the consideration is identical. (And in C++, it becomes critical, with object destructors being used to release resources. So you need to know.) or be mystified by which of the 7 signatures of Foo.foo() are going to get called, Is that even possible in C? If Foo is a struct, and Foo.foo a member, I don't think C has first-class functions and so Foo.foo can't be callable. But if I'm wrong, and it is callable, then surely with no arguments there can only be one signature that Foo.foo() might call, even if C supported generic functions, which I don't believe it does. Well, okay. In C you can't have Foo.foo(). But if that were Foo_foo(), then the point would be better made, because C will have just one function of that name (barring preprocessor shenanigans, of course). In C++, the types of its arguments may affect which function is called (polymorphism), and the dot notation works, too; but C++ does this without having first-class functions, so that part of your response is immaterial. In C++, Foo.foo() will always call a function foo defined in the class of which Foo is an instance. Very simple, and static type analysis will tell you exactly which function that is. Things do get a bit messier with pointers, because a function might be virtual or not virtual; C++ gives us the simple option (non-virtual functions) that most high level languages don't (C++'s virtual functions behave the same way as Python member functions do). ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, Dec 17, 2013 at 4:32 PM, Roy Smith r...@panix.com wrote: There's very few mysteries in C. You never have to wonder what the lifetime of an object is Yes you do. Lifetimes are hard, because you need to malloc a lot, and there is no defined lifetime for pointers -- they could last for just the lifetime of a stack frame, or until the end of the program, or anywhere in-between, and it's impossible to know for sure, and if you get it wrong your program crashes. So there's all these conventions you have to come up with like borrowing and owning, but they aren't compiler-enforced, so you still have to figure it out, and you will get it wrong. Successors like C++ mitigate these issues with destructors (allowing heap-allocated stuff to be tied to the lifetime of a stack), and smart pointers and so on. , or be mystified by which of the 7 signatures of Foo.foo() are going to get called C still has overloaded functions, just fewer of them. It'll still mystify you when you encounter it, though. http://www.robertgamble.net/2012/01/c11-generic-selections.html , or just what operation x + y is actually going to perform. I don't know. Will it do float addition? int addition? size_t addition? How does coercion work? + can do many different things, it's not just a straight translation to an obvious machine instruction. If you maim yourself with a razor-sharp chisel, do you blame the chisel for being a bad tool? If I didn't need it to be that sharp, then yes. -- Devin -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 1:33 PM, Devin Jeanpierre jeanpierr...@gmail.com wrote: On Tue, Dec 17, 2013 at 4:32 PM, Roy Smith r...@panix.com wrote: There's very few mysteries in C. You never have to wonder what the lifetime of an object is Yes you do. Lifetimes are hard, because you need to malloc a lot, and there is no defined lifetime for pointers -- they could last for just the lifetime of a stack frame, or until the end of the program, or anywhere in-between, and it's impossible to know for sure, and if you get it wrong your program crashes. So there's all these conventions you have to come up with like borrowing and owning, but they aren't compiler-enforced, so you still have to figure it out, and you will get it wrong. Successors like C++ mitigate these issues with destructors (allowing heap-allocated stuff to be tied to the lifetime of a stack), and smart pointers and so on. Wrong. A pointer is a scalar value, usually some kind of integer, and its lifetime is the same as any other scalar. Heap memory's lifetime is also very simple: it lasts until freed. (Though technically that's not even a part of the language - malloc/free are just functions. Not that it matters. Anyway, C++ has the new and delete operators, which are part of the language.) There are conventions to prevent memory leaks, but those are mere conventions. It's simple in the same way that a toy electric motor is simple: you apply current to it, and it spins. There's so little that it can do that it HAS to be simple. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Tue, Dec 17, 2013 at 7:01 PM, Chris Angelico ros...@gmail.com wrote: On Wed, Dec 18, 2013 at 1:33 PM, Devin Jeanpierre jeanpierr...@gmail.com wrote: Yes you do. Lifetimes are hard, because you need to malloc a lot, and there is no defined lifetime for pointers -- they could last for just the lifetime of a stack frame, or until the end of the program, or anywhere in-between, and it's impossible to know for sure, and if you get it wrong your program crashes. So there's all these conventions you have to come up with like borrowing and owning, but they aren't compiler-enforced, so you still have to figure it out, and you will get it wrong. Successors like C++ mitigate these issues with destructors (allowing heap-allocated stuff to be tied to the lifetime of a stack), and smart pointers and so on. Wrong. A pointer is a scalar value, usually some kind of integer, and its lifetime is the same as any other scalar. The duration of a pointer's validity is far more interesting, and that is why it is the primary meaning of the term pointer lifetime. Also, it's obviously what I meant. Heap memory's lifetime is also very simple: it lasts until freed. Sometimes simple things are hard to use correctly. I only said it was hard, not complicated. -- Devin -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, Dec 18, 2013 at 2:12 PM, Devin Jeanpierre jeanpierr...@gmail.com wrote: Wrong. A pointer is a scalar value, usually some kind of integer, and its lifetime is the same as any other scalar. The duration of a pointer's validity is far more interesting, and that is why it is the primary meaning of the term pointer lifetime. Also, it's obviously what I meant. Heap memory's lifetime is also very simple: it lasts until freed. Sometimes simple things are hard to use correctly. I only said it was hard, not complicated. Sure, which is why I went on to discuss the block of memory pointed to. But the rules are a lot simpler than in Python, where something exists until... uhh... the system feels like disposing of it. At which point __del__ will probably be called, but we can't be sure of that. All we know about an object's lifetime in Python is that it will continue to live so long as we're using it. And then multiprocessing and fork make it messier, but that's true in any language. The original point was that C has no mysteries. I posit that this is true because C's rules are so simple. It might well be harder to work in this system (taking it to an extreme, Brainf* is about the simplest Turing-complete language possible, and it's virtually impossible to write good code in it), but it has no mysteries. ChrisA -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 2013-12-18 at 01:33 +, Steven D'Aprano wrote: And What does 'implementation-specific undefined behaviour' actually mean in practice?, another common question when dealing with C. Only asked by people who haven't had it explained. There's undefined behavior, and there's implementation-specific behavior, but it is impossible to have implementation-specific undefined behavior. And, the definitions are simple to understand: undefined behavior means that if your program invokes it, there is no definition of what will happen. This is buggy code. Implementation-specific behavior means that the standard requires the implementation to do some well-defined thing, but the standard does not define exactly what it must be. You can go look up what your implementation will do in its documentation (the standard requires that it be documented), but you can't assume the same thing will happen in another implementation. This is non-portable code. It's a very rare language indeed that has no undefined or implementation-specific behaviors. Python gets to cheat by having one reference implementation. Every time you've had to go try something out in the Python interpreter because the documentation didn't provide the details you needed, that WAS implementation-specific behavior. You never have to wonder what the lifetime of an object is, Since C isn't object oriented, the lifetime of objects in C is, um, any number you like. The lifetime of objects in some language with no objects is ONE MILLION YEARS!!! is as good as any other vacuously true statement. The implication that only an object oriented language could have a concept of object lifetimes is false. Another, less hyperbolic way of saying this is that in C, the lifetime of objects is _exactly as long as you specify_. Heap objects come into existence when you explicitly create them, and they go out of existence when you explicitly destroy them. If you don't destroy them, they never go away. 
If you destroy them more than once, that's undefined behavior. Stack objects are even simpler. or be mystified by which of the 7 signatures of Foo.foo() are going to get called, Is that even possible in C? If Foo is a struct, and Foo.foo a member, I don't think C has first-class functions and so Foo.foo can't be callable. Of course that's valid C. It's true that C doesn't have first-class functions, but it supports invoking functions through pointers and you can store functions in data members, pass functions as arguments, and return functions from other functions, so Foo.foo can certainly be callable.

~$ cat /tmp/foo.c
#include <stdio.h>

struct Foo {
    void (*foo)();
};

void foobar(void)
{
    printf("foobar\n");
}

int main()
{
    struct Foo Foo = { foobar };
    Foo.foo();
    return 0;
}

$ gcc -Wall -o /tmp/foo /tmp/foo.c
$ /tmp/foo
foobar

-- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Wed, 18 Dec 2013 01:33:03 +, Steven D'Aprano wrote: or just what operation x + y is actually going to perform. With no operator overloading, that one at least is correct. Actually, I stand corrected. I was completely mistaken about that. The C operation x + y is undefined if the signed addition overflows. A valid C compiler can produce whatever code it damn well feels like in the case of overflow. Oh, and in case you think that integer overflow in C will always follow two's complement semantics, such that INT_MAX+1 = INT_MIN, you are wrong. That's not guaranteed either. Clang and gcc have a flag, -fwrapv, to force defined behaviour on integer overflow, but that's not part of the C standard and not all C compilers will do the same. -- Steven -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
And ever after that experience, I avoided all languages that were even remotely similar to C, such as C++, Java, C#, Javascript, PHP etc. I think that's disappointing, for two reasons. Firstly, C syntax isn't that terrible. It's not just the abysmally appalling, hideously horrifying syntax. Just about everything about C is just *not* made for human beings imho. It's just an un-language that gets just about everything wrong. Sort of like Microsoft's products. Sincerely, Wolfgang -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On Mon, 16 Dec 2013 20:32:25 -, Wolfgang Keller felip...@gmx.net wrote: And ever after that experience, I avoided all languages that were even remotely similar to C, such as C++, Java, C#, Javascript, PHP etc. I think that's disappointing, for two reasons. Firstly, C syntax isn't that terrible. It's not just the abysmally appalling, hideously horrifying syntax. Just about everything about C is just *not* made for human beings imho. It's an excellent macro assembler, so of course it's not made for human beings. Unfortunately the software development world took one look at it and exclaimed, It's a hammer, all our problems must be nails! And the rest is C++. -- Rhodri James *-* Wildebeest Herder to the Masses -- https://mail.python.org/mailman/listinfo/python-list
Re: Experiences/guidance on teaching Python as a first programming language
On 12/16/13 3:32 PM, Wolfgang Keller wrote: And ever after that experience, I avoided all languages that were even remotely similar to C, such as C++, Java, C#, Javascript, PHP etc. I think that's disappointing, for two reasons. Firstly, C syntax isn't that terrible. It's not just the abysmally appalling, hideously horrifying syntax. Just about everything about C is just *not* made for human beings imho. I've never heard C syntax reviled quite so intensely. What syntax do you like, out of curiosity? It's just an un-language that gets just about everything wrong. Sort of like Microsoft's products. Sincerely, Wolfgang -- Ned Batchelder, http://nedbatchelder.com -- https://mail.python.org/mailman/listinfo/python-list