> On Dec 22, 2015, at 1:31 PM, Kevin Ballard via swift-evolution 
> <swift-evolution@swift.org> wrote:
> 
> UIKit classes aren't subclassable because of a lack of sealed-by-default, 
> they're subclassable because until very recently there was no way to mark 
> classes as not being subclassable. Many classes in UIKit are designed to 
> handle subclassing, but some classes are not designed to handle that, and are 
> explicitly documented as saying that you shouldn't subclass them because they 
> won't work right. Assuming a future in which UIKit is reimplemented in Swift, 
> every class today that is not explicitly documented as saying "don't subclass 
> this" would presumably be marked as inheritable. And the classes that say 
> "don't subclass this" may very well be final/sealed, and that's perfectly OK 
> because subclassing them doesn't work properly anyway.
> 
> Whether the language has final-by-default/sealed-by-default doesn't really 
> affect this in any way. I guarantee you that once Apple starts putting Swift 
> code in their frameworks, every single framework class Apple releases is 
> going to make an explicit decision about being final/sealed vs inheritable, 
> and the language defaults won't affect that one bit.
> 
> The only thing that final-by-default/sealed-by-default is going to affect is 
> third-party code that is released without carefully thinking about this 
> issue. And really, any code that is released that doesn't explicitly think 
> about this issue is likely to have issues with subclassing anyway (because 
> the developer didn't give any thought to whether subclasses would be allowed 
> and likely made assumptions in various places that it wouldn't, even if they 
> didn't consciously think about it). That said, I do think it's a really good 
> idea for anyone releasing libraries to make an explicit decision about 
> whether the class should be final/sealed or inheritable.
> 
> One benefit I haven't seen mentioned about this proposal is it makes it 
> obvious when the author has made an explicit decision. Today there's no 
> keyword for inheritable, so if a class doesn't say `final` you can't know if 
> that was explicit or implicit. But with this proposal, we'll gain an 
> `inheritable` keyword (and presumably a `sealed` keyword if we get sealed 
> behavior), which means every single class can be annotated with a keyword 
> indicating an explicit decision was made. Sure, people could still leave off 
> the keyword when they explicitly choose the default behavior, but we could 
> encourage a habit of always adding the keyword even if it's for the default 
> behavior just to indicate that this decision was intentional.
> 
> -Kevin Ballard

Thanks for this, Kevin.  I have been trying to make these arguments throughout 
the thread, but maybe haven't been as eloquent as you are here.  The value of 
documenting the author's “statement of intent”, and of knowing there cannot be 
any subclasses when no annotation exists, should not be underestimated.

I’ve been following the thread very closely and have noticed six primary 
arguments against final by default:

1) Workarounds for framework “bugs”.  I put “bugs” in quotes because sometimes 
it is not exactly a bug but misunderstood or disliked behavior that is being 
“worked around”.  No need to rehash this; it has been beaten to death.  It is 
also largely irrelevant: the default is almost certainly going to be at least 
`sealed`, and Apple's frameworks are currently written in Objective-C, not 
Swift.  When Apple begins writing frameworks in Swift they are very likely to 
be thinking carefully about subclassing as part of the API contract.

2) Flexibility.  If I don't need inheritance when I first write a type and 
later realize I do need it, I have to revisit the type and add an `inheritable` 
annotation.  This is closely related to argument #3 and mostly relevant during 
prototyping (argument #5).  IMO when this scenario arises you should have to 
revisit the original type; if you don't, you are asking for trouble, as 
inheritance was not considered when the type was written.  Adding an 
`inheritable` annotation is trivial compared to the analysis that should be 
performed when a class becomes a superclass.

3) Annoyance.  Some consider it annoying to have to annotate a class 
declaration in order to inherit from it.  People making this argument are 
either writing a lot of superclasses or are so bothered by the need to annotate 
their type declarations that they avoid `final` and its related benefits even 
when it really is the right thing for their class.  For me personally, `final` 
is the right thing for most classes I write.  I also think adding a `final` 
annotation is the right thing to do if you're not sure whether a class will be 
a superclass or not; the need to change the annotation later will remind you 
that you haven't fully considered inheritance in your design yet.

4) Testing.  This is solvable with behavior similar to @testable (a sketch of 
today's @testable behavior follows this list).  It should not influence the 
decision about what the default is for production code.

5) Prototyping.  This should also not influence the decision about what the 
default is for production code.  I would not have a problem with a prototyping 
environment allowing `inheritable` by default (maybe a Playground mode?).  
There could even be a tool that migrates the prototype to a real project and 
adds the `inheritable` annotation where necessary.  Regardless of what happens 
here, the prototyping problem can and should be solved independently, and it 
should not influence the default that is used in production code.

6) Education.  There may be some value in allowing inheritance by default in 
educational settings, especially early on.  I view this as quite similar to the 
prototyping case, and again it should not influence the default that 
professionals use in production code.
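
Regarding #4, today's `@testable` already relaxes access control for a test 
target; the thought is that the same mechanism, or something like it, could 
also relax final-by-default so that tests can substitute doubles.  A minimal 
sketch of the existing behavior, with made-up module and type names 
(`NetworkingKit`, `SessionManager`):

    // In the test target. The @testable import exposes NetworkingKit's
    // internal declarations to the tests (the module must be built with
    // testability enabled).
    @testable import NetworkingKit
    import XCTest

    final class SessionManagerTests: XCTestCase {
        func testDefaultTimeout() {
            // SessionManager is internal to NetworkingKit; it is visible
            // here only because of the @testable import.
            let manager = SessionManager()
            XCTAssertEqual(manager.timeout, 30)
        }
    }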

If I have missed any of the major arguments against making `final` the default 
please respond with them.  I would be happy to add them to the list.

I don't find any of these arguments compelling.  The only one that is really 
relevant to production code (once you accept the reality of #1) is annoyance, 
which I consider a minor complaint: it is definitely not relevant to my code, 
and likely not relevant to many other people's code either.

On the other hand, the argument for `final` as default is compelling IMHO.  As 
has been stated many times, inheritance is part of an API contract that needs 
to be considered as clearly as anything else.  This still applies when the API 
is internal to an app.

Final by default greatly improves our ability to get up to speed on a new 
codebase quickly and reason about the code:

1) When I look at an unannotated class declaration I know there are no 
subclasses.  I don’t have to search the code to look for subclasses.  
2) I know that where there are superclasses, the author was reminded by the 
language to consider inheritance.  They may have made mistakes in the design of 
the superclass, but at least the language gave them a subtle reminder that they 
need to think about it.  
3) I also know there will not be any subclasses in the future unless someone 
adds an `inheritable` annotation, in which case they are responsible for 
considering the implications of that change.  The annotation also serves as a 
good prompt for code reviewers to weigh those implications if and when the 
class becomes an intentional superclass (see the sketch below).
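
To make points 1 and 3 concrete, here is a minimal sketch using a made-up 
`TemperatureFormatter` class.  The `final` declaration is valid Swift today and 
is what an unannotated class would mean under the proposal; the `inheritable` 
spelling is the hypothetical keyword discussed in this thread, so it appears 
only in a comment:

    // Valid Swift today: subclassing is explicitly forbidden. Under
    // final-by-default, a plain `class TemperatureFormatter` declaration
    // would mean the same thing, so a reader knows at a glance that no
    // subclasses exist.
    final class TemperatureFormatter {
        func string(from celsius: Double) -> String {
            return "\(celsius) °C"
        }
    }

    // If inheritance is later needed, the opt-in under the proposal would
    // be a single, reviewable annotation (hypothetical syntax):
    //
    //     inheritable class TemperatureFormatter { ... }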

Of course all of these advantages also apply to a codebase where I am the sole 
author and maintainer, when I come back to it a year or two later and have 
forgotten some of the details.

One consideration that has not been definitively established one way or the 
other is frequency of use.  In application code are there usually more classes 
that are superclasses (or could reasonably be a superclass in the future 
without additional analysis and design)?  Or are there usually more classes 
that are `final`, effectively final, or should be final, at least until further 
analysis and design has been performed?  

In my experience the reality is that the majority of my classes inherit from 
UIKit classes, but are not themselves superclasses.  I don’t claim to speak for 
anyone else, but I think we would find that to be the most common pattern if we 
looked at the question closely.
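
As an illustration of that pattern, here is the shape most of those classes 
take; the names are made up, but the structure (inheriting from UIKit, never 
being inherited from) is the common case, and under final-by-default the 
explicit `final` would simply become unnecessary:

    import UIKit

    // A typical app class: it subclasses a UIKit class, but nothing in the
    // app subclasses it. Today that intent has to be stated with `final`;
    // under final-by-default it would be the unannotated case.
    final class ProfileViewController: UIViewController {
        private let nameLabel = UILabel()

        override func viewDidLoad() {
            super.viewDidLoad()
            nameLabel.text = "Profile"
            view.addSubview(nameLabel)
        }
    }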

I hope this is a reasonably accurate summary of the positions on both sides of 
this.

Matthew



> 
> On Tue, Dec 22, 2015, at 09:03 AM, Paul Cantrell via swift-evolution wrote:
>> Joe’s and Brent’s writeups copied below really get to the heart of this for 
>> me. This is a tough question, and I find myself torn. I’m sympathetic to 
>> both lines of argument.
>> 
>> It’s not entirely true that “you can never take back” overridability — you 
>> can make a breaking API change with a new major version, of course — but 
>> it’s a compelling point nonetheless. One default is clearly safer than the 
>> other, at least for the library author. “Safe by default” is indeed the 
>> Swift MO. (Well, except for array subscripting. And any public API involving 
>> tuples, for which any change, even type widening, is a breaking change. 
>> And….OK, it’s not absolute, but “safe by default” is the MO 95% of the 
>> time.) “Final by default” just seems more Swift-like to me.
>> 
>> Despite that, Joe, I have to agree with Brent on his central point: the 
>> perspective that comes from spending a lot of time _writing_ libraries is 
>> very different from that of someone who spends more time _using_ them. Yes, 
>> UIKit is not 
>> going to be rewritten in Swift anytime soon, but Brent is rightly imagining 
>> a future where the Swift way is The Way.
>> 
>> I weigh the safety argument against the many long hours I’ve spent beating 
>> my head against library behaviors, wishing I could read UIKit’s source code, 
>> wishing I could tweak that one little thing that I can’t control, and being 
>> grateful for the dubious workaround that saves the day — yes, even when a 
>> subsequent OS update breaks it. I know what I’m getting into when I solve a 
>> problem with a hack, and if the hack ships, it’s only because I weighed the 
>> risks and benefits. We app developers rely on swizzling, dubious 
>> subclassing, and (especially) undocumented behavior far more often than any 
>> of us would like. It is just part of the reality of making things ship — and 
>> an important part of the success of Apple’s app ecosystem.
>> 
>> This debate reminds me of something that often happens when a 
>> humans-and-paper process moves to software: the software starts rigorously 
>> enforcing all the rules the humans were theoretically following all along, 
>> and it turns out that quite a lot of in-the-moment nuanced human judgement 
>> was crucial to making everything work. With nuance removed, things 
>> fall apart — and instead of things at last achieving the rigor that seemed 
>> so desirable in theory, the process has to explicitly loosen. (At the local 
>> coffee shop, a new iPad-based POS system suddenly made it an “uh let me get 
>> the manager” moment when I want to get the off-menu half-sized oatmeal I’ve 
>> always got for my toddler.)
>> 
>> I’m not totally opposed to final by default. Joe’s arguments sway me in 
>> principle. In practice, if Swift does indeed move us toward “less wiggle 
>> room, less hackable” by default, then that wiggle room _will_ have to come 
>> from somewhere else: perhaps more open sourcing and more forking, or faster 
>> turnaround on fixes from library authors, or a larger portion of time spent 
>> by library authors explicitly exposing and documenting customization points. 
>> The new effort involved for library authors is nothing to sneeze at.
>> 
>> Cheers,
>> 
>> Paul
>> 
>> 
>>> On Dec 22, 2015, at 9:46 AM, Joe Groff via swift-evolution 
>>> <swift-evolution@swift.org> wrote:
>>> 
>>> I think a lot of people in this thread are conflating "final by default" or 
>>> "sealed by default" with "sealed everywhere". No matter what the default 
>>> is, the frameworks aren't going to suddenly rewrite themselves in Swift 
>>> with final everything; Objective-C will still be what it is. Furthermore, 
>>> we're only talking about language defaults; we're not taking away the 
>>> ability for frameworks to make their classes publicly subclassable or 
>>> dynamically overrideable. That's a policy decision for framework authors to 
>>> make. The goal of "sealed by default" is to make sure the language doesn't 
>>> make promises on the developer's behalf that they weren't ready to keep. 
>>> ObjC's flexibility is valuable, and Apple definitely takes advantage of it 
>>> internally all over the place; Apple also has an army of compatibility 
>>> engineers to make sure framework changes work well with existing software. 
>>> Not every developer can afford that maintenance burden/flexibility 
>>> tradeoff, though, so that flexibility is something you ought to opt in to. 
>>> You can always safely add public subclassability and dynamic 
>>> overrideability in new framework versions, but you can never take them back.
>> 
>> 
>>> On Dec 22, 2015, at 12:31 AM, Brent Royal-Gordon via swift-evolution 
>>> <swift-evolution@swift.org> wrote:
>>> 
>>> Just imagine going through UIKit and marking every class inheritable *by 
>>> hand*—no cheating with a script—and you'll have some idea of the additional 
>>> burden you'll be imposing on developers as they write their code. The 
>>> proposals that every single method should be explicitly marked as 
>>> overridable are even worse; frankly, I don't think I'd want to use Swift if 
>>> you forced me to put a `virtual` keyword on every declaration.
>>> 
>>> I worry that the team's use of Swift to build the standard library, and 
>>> their close association with teams building OS frameworks, are biasing the 
>>> language a little bit. I think that, in all likelihood, most Swift code is 
>>> in individual applications, and most libraries are not published outside of 
>>> a single team. If I'm right, then most Swift code will probably be quite 
>>> tolerant of small but technically "breaking" ABI changes, such as making a 
>>> class `final`, or (as mentioned in another thread) making a closure 
>>> `@noescape`.
>>> 
>>> That won't be true of published library code, of course. But published 
>>> library code is a small minority of the Swift code people will write, and 
>>> it already will require greater scrutiny and more careful design. 
>>> 
>>> There is already a good opportunity to reflect on whether or not an API 
>>> should be `final`. It's when you put the `public` keyword on it. I think 
>>> programmers will have a better, easier time writing their code if, in this 
>>> case, we put a little bit of trust in them, rather than erecting yet 
>>> another hoop they must jump through.
>>> 
>>> Perhaps we could even provide a "strict interfaces" mode that published 
>>> frameworks can turn on, which would require you to declare the heritability 
>>> of every class and member. But even that may not be a good idea, because I 
>>> also suspect that, in the field, most published libraries probably have to 
>>> be extended in ways the library's author did not expect or anticipate. 
>>> 
>>> This means doing some dangerous overriding, yes. But a UI that breaks after 
>>> an iOS upgrade is not nearly as dangerous to my business as a three-month 
>>> delay while I reimplement half of UIKit because someone in Cupertino 
>>> thought they knew what I need better than I do and turned off—or even 
>>> worse, *left turned off without a single thought*—subclassing of 
>>> UIBarButtonItem.
>>> 
>>> The bottom line is this: Your users like Swift's strictures when they're 
>>> helpful. *This stricture is not helpful.* Library users don't accidentally 
>>> subclass things, and with the `override` keyword in Swift, they don't 
>>> accidentally override them either. And where it truly is important, for 
>>> safety or for speed, to prevent subclassing, we already have `final`. 
>>> Making it the default is less safety than suffering.

_______________________________________________
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
