On Friday, March 11, 2016 at 3:28:40 AM UTC-6, Steven D'Aprano wrote:

> That's one theory. Another theory is: no they shouldn't, all attributes
> should be public. That most accurately models actual physical objects and
> maximizes the usefulness of the attribute.
>
> People over-use private, and if you google, you will find many people
> asking "how can I access private variables and methods?".
If "people" are trying to access private variables, then one, or both, of these has occurred: (1) The interface was poorly written, and/or (2) The user of the interface is hell bent on shooting himself in the foot! > Python offers a middle ground: where encapsulation is important for safety, > use it, otherwise rely on trust and a gentleman's agreement. We're all > adults here: if you use my internals, that's your choice, and you can live > with the consequences. > > In practice this means that internal details of classes written in C are > completely hidden from Python code unless explicitly made public. Why? > Because if the caller messes with the internal data of a C class, they can > cause a segmentation fault, which can execute arbitrary code. This is a > serious problem, and Python has a philosophy of Absolutely No Seg Faults. First you go off blabbing about how all attributes should be public, then you do an about face, and admit that very bad things can happen when internals are exposed. Does anyone else find this argument to be absurd? > It should be impossible to cause the Python interpreter to seg fault or > dump core (except via ctypes, which is special). SPECIAL CASES ARE *NOT* SPECIAL ENOUGH TO BREAK THE RULES! > But what about Python classes? You can't cause a seg fault in Python code. Are you sure about that? Heck, i posted code quite a few years back that "seg faulted like a mutha". Do you want to retract your statement, or will i need to search the archives, and then stuff the link down your big fat mouth? > So Python's philosophy is: > > - trying to *enforce* private data is a waste of time, people will find a > way around it, even if it is a horrible nasty hack; > > - so instead, rely on trust: we mark "private" attributes with a leading > underscore, and trust that users won't abuse them; > > - if they do, the consequences are on their head, not ours: you have no > responsibility to users who misuse your private attributes. I theory, that is a great philosophy, but in practice, it encourages people to be lazy, and they end up writing horrible interfaces. Why spend the time to write a proper interface, when, in the back of your mind, you know you can reach into that public cookie jar *ANYTIME* you want to. It's like when an overweight person goes on a diet, but still has cookies, cakes and ice-cream in the kitchen -- the temptation to be a slob is just too great! Imagine if indention were optional in Python, or if Python allowed braces... Even with all the benefits of indention over braces, quite a few idiots would still use the damn braces! Heck, i knew this one Ruby coder, who was a longtime C coder, and at the end of every line, he religiously placed a semicolon there. And i told him, "dude, you're *NOT* writing C code anymore, so stop putting those damn trailing semicolons in your source!". But the arrogant bassturd refused to stop. So i thought: "Hmm, maybe it's just a bad habit", but no, he freely admitted that he conscientiously placed them in hos source because: "he liked them". There are two main methods that a language can ruin you as a programmer. One method is to encourage you to be a "Lazy Larry", and the other is to mold you into a "habitual Harry". Python has mostly eradicated the "habitual Harry" disease, but it still actively promotes the "lazy Larry" disease. Yes, pure OOP is onerous to the programmer, but it is onerous for a reason. And that *REASON* is to create rigid interfaces! Anytime a standard is handed down, people are going to whine. 
"WAH, standards are hard!" "WAH, it takes too long!" "WAH, my fingers hurt!" But what these whiners don't realize, is that, either they put in the effort of writing solid interfaces *NOW*, or they will be forced to debug, explain, and repair the awful interfaces *LATER*. In the end, there is no shortcut to excellence. Which brings me back to my motto: "Great interfaces *NEVER* happen by accident --shite happens, interfaces don't!" Yes, creating proper interfaces requires a lot of thought, a lot of testing, and yes, a whoooole lot of typing. But if thinking, and testing, and typing are not your thing, then perhaps you'd be happier smoking marijuana and working as sales associate at your local GAP store. -- https://mail.python.org/mailman/listinfo/python-list