Re: [swift-evolution] $self

2016-09-28 Thread Sean Heber via swift-evolution
Yeah, hard to say how extreme it needs to be to justify this sort of change. 
Does running into unintentional retain cycles when using closures for event 
handlers *constantly* count as extreme? :)

In a typical garbage-collected language, using closures like this is common and 
a non-issue, but in Swift the ownership in these situations is something you 
have to actively think about, which is a distraction when you want to spend your 
time thinking about what you're building instead. It reminds me of manual 
memory management (which, of course, it is).
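
For reference, here is the pattern under discussion written out by hand in today's Swift (a rough fragment that reuses the loginForm/startLoginRequest names from the example elsewhere in this thread):

loginForm.onSubmit = { [weak self] in
    // the "weak self / strong self dance"
    guard let strongSelf = self else { return }
    let f = strongSelf.loginForm
    strongSelf.startLoginRequest(email: f.email.text, pwd: f.pwd.text)
}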

l8r
Sean 

Sent from my iPad

> On Sep 28, 2016, at 7:48 PM, Jay Abbott  wrote:
> 
> Sean, yeah that's kind of what I was suggesting for @escaping closures - it 
> should be the default... but it is a breaking change and as Chris pointed out 
> that would need extreme justification... plus it would be a behaviour change 
> with no syntax change, so code that was never "upgraded" would be very 
> difficult to tell the original intention. I like the idea but I don't know if 
> I can come up with "extreme justification" for it.
> 
>> On Thu, 29 Sep 2016 at 01:46 Sean Heber  wrote:
>> Now that I think about this, wouldn't that be a better default behavior? All 
>> captures are this "required" type which means all closures are typed as 
>> optional. To override that behavior, you'd have to explicitly declare a weak 
>> or unowned capture instead and if you did that for all reference captures, 
>> the closure's type would be non-optional as they are now. Seems like that'd 
>> be safer. I'll shut up now.
>> 
>> l8r
>> Sean
>> 
>> Sent from my iPad
>> 
>>> On Sep 28, 2016, at 7:25 PM, Sean Heber  wrote:
>>> 
>>> Pretty sure this is all way out of scope, but something about this made me 
>>> think about this idea (which maybe isn't unique or is maybe even 
>>> unworkable), but imagine something like where a new capture type is added 
>>> such as "required" (for lack of another name right now). And the way this 
>>> works is similar to unowned, but it makes the entire closure "weak" in such 
>>> a way that the moment any of the required captures go nil, any references 
>>> to that closure instance also effectively become nil.
>>> 
>>> So modifying the example:
>>> 
>>> func viewDidLoad() {
>>> self.loginForm.onSubmit = {[required self]
>>>  let f = self.loginForm
>>>  self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
>>> }
>>> }
>>> 
>>> So in this case, "required self" means self is effectively "unowned" but 
>>> any references to this closure would have to be weak optional like: weak 
>>> var onSubmit: (()->Void)? So that when the view controller gets 
>>> deallocated, the closure goes with it and references become nil.
>>> 
>>> l8r
>>> Sean
>>> 
>>> Sent from my iPad
>>> 
 On Sep 28, 2016, at 6:42 PM, Jay Abbott via swift-evolution 
  wrote:
 
 It could potentially be a breaking change if the default for @escaping 
 closures were made to be weak-capturing.
 
 Since the weak-capturing pattern is only really desirable for @escaping 
 closures, and (I think) it would be the usual preference, could @escaping 
 also imply weak-capturing for all references (not just self)? Then there 
 would be another syntax for strong-capturing-escaping closures. 
 Non-escaping closures could a) strongly capture references; or b) existing 
 strong references stay strong and weak ones stay weak, meaning no 
 ref-counts need to change at all when passing them.
 
 
> On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution 
>  wrote:
> So previously there were a few threads on the "strong self/weak self
> dance" but they didn't seem to get anywhere. For instance:
> 
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
> 
> ...and possibly others.
> 
> I'd like to propose something even easier (and more specific) than all
> of the above discussions. Specifically, I'd like to introduce a new
> automagic closure variable, $self, whose presence in a closure would
> cause that closure to weakly capture self in a safe manner.
> 
> As a concrete example, let's imagine a UIViewController for a login
> form. Under this proposal, the following code:
> 
> func viewDidLoad() {
> self.loginForm.onSubmit = {
>  let f = $self.loginForm
>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
> }
> }
> 
> ...would be treated by the compiler as equivalent to:
> 
> func viewDidLoad() {
> self.loginForm.onSubmit = {
>  [weak self] in
>  

Re: [swift-evolution] $self

2016-09-28 Thread Charles Srstka via swift-evolution
> On Sep 28, 2016, at 7:48 PM, Jay Abbott via swift-evolution 
>  wrote:
> 
> Sean, yeah that's kind of what I was suggesting for @escaping closures - it 
> should be the default... but it is a breaking change and as Chris pointed out 
> that would need extreme justification... plus it would be a behaviour change 
> with no syntax change, so code that was never "upgraded" would be very 
> difficult to tell the original intention. I like the idea but I don't know if 
> I can come up with "extreme justification" for it.

How do reference cycles created by accidental implicit captures rank in the 
list of commonly encountered runtime issues? It’s gotta be high, isn’t it? 
Especially if you limit the scope to memory leaks—it seems every time I’ve had 
to track down one of those since ARC came out, it’s been either an unintended 
closure capture or something KVO-related.

Closures are, for me, the #1 time that I switch into super-defensive coding mode. 
Requiring explicit capture semantics for variables would surely prevent a lot of 
memory issues, and would also fix crashes caused by objects being deallocated in 
the wrong order. The quality of code in general would improve. 
Is that extreme? I don’t know, but I certainly wouldn’t complain if it happened.
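
A minimal sketch of the kind of accidental capture being described (UIKit names; the onSubmit property and the view controller itself are illustrative):

import UIKit

final class LoginViewController: UIViewController {
    var onSubmit: (() -> Void)?

    override func viewDidLoad() {
        super.viewDidLoad()
        // self strongly holds onSubmit, and onSubmit strongly captures self,
        // so this controller can never be deallocated.
        onSubmit = {
            self.dismiss(animated: true, completion: nil)
        }
    }
}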

Charles



Re: [swift-evolution] $self

2016-09-28 Thread Jay Abbott via swift-evolution
Sean, yeah that's kind of what I was suggesting for @escaping closures - it
should be the default... but it is a breaking change, and as Chris pointed
out that would need extreme justification... plus it would be a behaviour
change with no syntax change, so for code that was never "upgraded" it would
be very difficult to tell the original intention. I like the idea but I don't
know if I can come up with "extreme justification" for it.

On Thu, 29 Sep 2016 at 01:46 Sean Heber  wrote:

> Now that I think about this, wouldn't that be a better default behavior?
> All captures are this "required" type which means all closures are typed as
> optional. To override that behavior, you'd have to explicitly declare a
> weak or unowned capture instead and if you did that for all reference
> captures, the closure's type would be non-optional as they are now. Seems
> like that'd be safer. I'll shut up now.
>
> l8r
> Sean
>
> Sent from my iPad
>
> On Sep 28, 2016, at 7:25 PM, Sean Heber  wrote:
>
> Pretty sure this is all way out of scope, but something about this made me
> think about this idea (which maybe isn't unique or is maybe even
> unworkable), but imagine something like where a new capture type is added
> such as "required" (for lack of another name right now). And the way this
> works is similar to unowned, but it makes the entire closure "weak" in such
> a way that the moment any of the required captures go nil, any references
> to that closure instance also effectively become nil.
>
> So modifying the example:
>
> func viewDidLoad() {
> self.loginForm.onSubmit = {[required self]
>  let f = self.loginForm
>  self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
> }
> }
>
> So in this case, "required self" means self is effectively "unowned" but
> any references to this closure would have to be weak optional like: weak
> var onSubmit: (()->Void)? So that when the view controller gets
> deallocated, the closure goes with it and references become nil.
>
> l8r
> Sean
>
> Sent from my iPad
>
> On Sep 28, 2016, at 6:42 PM, Jay Abbott via swift-evolution <
> swift-evolution@swift.org> wrote:
>
> It could potentially be a breaking change if the default for @escaping
> closures were made to be weak-capturing.
>
> Since the weak-capturing pattern is only really desirable for @escaping
> closures, and (I think) it would be the usual preference, could @escaping
> also imply weak-capturing for all references (not just self)? Then there
> would be another syntax for strong-capturing-escaping closures.
> Non-escaping closures could a) strongly capture references; or b) existing
> strong references stay strong and weak ones stay weak, meaning no
> ref-counts need to change at all when passing them.
>
>
> On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution <
> swift-evolution@swift.org> wrote:
>
>> So previously there were a few threads on the "strong self/weak self
>> dance" but they didn't seem to get anywhere. For instance:
>>
>>
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
>>
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
>>
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
>>
>> ...and possibly others.
>>
>> I'd like to propose something even easier (and more specific) than all
>> of the above discussions. Specifically, I'd like to introduce a new
>> automagic closure variable, $self, whose presence in a closure would
>> cause that closure to weakly capture self in a safe manner.
>>
>> As a concrete example, let's imagine a UIViewController for a login
>> form. Under this proposal, the following code:
>>
>> func viewDidLoad() {
>> self.loginForm.onSubmit = {
>>  let f = $self.loginForm
>>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
>> }
>> }
>>
>> ...would be treated by the compiler as equivalent to:
>>
>> func viewDidLoad() {
>> self.loginForm.onSubmit = {
>>  [weak self] in
>>  if let selfie = self {
>>  let f = selfie.loginForm
>>  selfie.startLoginRequest(email:f.email.text,
>>  pwd:f.pwd.text)
>>  }
>> }
>> }
>>
>> Note the "if let" there: If self no longer exists, the closure does not
>> execute at all, but if self does exist, then it exists for the entirety
>> of the execution of the closure (ie, self won't vanish as a side-effect
>> of some statement in the closure.) I think these semantics obey the
>> principle of least surprise; $self can be treated by the developer as a
>> strong reference.
>>
>> However, that does mean that $self can only be used in a closure that's
>> (a) Void or (b) Optional. In the latter case, returning nil when self
>> doesn't exist seems like reasonable/expected behavior.
>>
>> It would be a compile-time error to use both $self and normal self in
>> the same closure.
>>
>>> I'd like to keep this simple, meaning $self always does the above and nothing else.

Re: [swift-evolution] $self

2016-09-28 Thread Sean Heber via swift-evolution
Now that I think about this, wouldn't that be a better default behavior? All 
captures would be this "required" type, which would mean all closures are typed as 
optional. To override that behavior, you'd have to explicitly declare a weak or 
unowned capture instead, and if you did that for all reference captures, the 
closure's type would be non-optional as it is now. Seems like that'd be safer. 
I'll shut up now.

l8r
Sean

Sent from my iPad

> On Sep 28, 2016, at 7:25 PM, Sean Heber  wrote:
> 
> Pretty sure this is all way out of scope, but something about this made me 
> think about this idea (which maybe isn't unique or is maybe even unworkable), 
> but imagine something like where a new capture type is added such as 
> "required" (for lack of another name right now). And the way this works is 
> similar to unowned, but it makes the entire closure "weak" in such a way that 
> the moment any of the required captures go nil, any references to that 
> closure instance also effectively become nil.
> 
> So modifying the example:
> 
> func viewDidLoad() {
> self.loginForm.onSubmit = {[required self]
>  let f = self.loginForm
>  self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
> }
> }
> 
> So in this case, "required self" means self is effectively "unowned" but any 
> references to this closure would have to be weak optional like: weak var 
> onSubmit: (()->Void)? So that when the view controller gets deallocated, the 
> closure goes with it and references become nil.
> 
> l8r
> Sean
> 
> Sent from my iPad
> 
>> On Sep 28, 2016, at 6:42 PM, Jay Abbott via swift-evolution 
>>  wrote:
>> 
>> It could potentially be a breaking change if the default for @escaping 
>> closures were made to be weak-capturing.
>> 
>> Since the weak-capturing pattern is only really desirable for @escaping 
>> closures, and (I think) it would be the usual preference, could @escaping 
>> also imply weak-capturing for all references (not just self)? Then there 
>> would be another syntax for strong-capturing-escaping closures. Non-escaping 
>> closures could a) strongly capture references; or b) existing strong 
>> references stay strong and weak ones stay weak, meaning no ref-counts need 
>> to change at all when passing them.
>> 
>> 
>>> On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution 
>>>  wrote:
>>> So previously there were a few threads on the "strong self/weak self
>>> dance" but they didn't seem to get anywhere. For instance:
>>> 
>>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
>>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
>>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
>>> 
>>> ...and possibly others.
>>> 
>>> I'd like to propose something even easier (and more specific) than all
>>> of the above discussions. Specifically, I'd like to introduce a new
>>> automagic closure variable, $self, whose presence in a closure would
>>> cause that closure to weakly capture self in a safe manner.
>>> 
>>> As a concrete example, let's imagine a UIViewController for a login
>>> form. Under this proposal, the following code:
>>> 
>>> func viewDidLoad() {
>>> self.loginForm.onSubmit = {
>>>  let f = $self.loginForm
>>>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
>>> }
>>> }
>>> 
>>> ...would be treated by the compiler as equivalent to:
>>> 
>>> func viewDidLoad() {
>>> self.loginForm.onSubmit = {
>>>  [weak self] in
>>>  if let selfie = self {
>>>  let f = selfie.loginForm
>>>  selfie.startLoginRequest(email:f.email.text,
>>>  pwd:f.pwd.text)
>>>  }
>>> }
>>> }
>>> 
>>> Note the "if let" there: If self no longer exists, the closure does not
>>> execute at all, but if self does exist, then it exists for the entirety
>>> of the execution of the closure (ie, self won't vanish as a side-effect
>>> of some statement in the closure.) I think these semantics obey the
>>> principle of least surprise; $self can be treated by the developer as a
>>> strong reference.
>>> 
>>> However, that does mean that $self can only be used in a closure that's
>>> (a) Void or (b) Optional. In the latter case, returning nil when self
>>> doesn't exist seems like reasonable/expected behavior.
>>> 
>>> It would be a compile-time error to use both $self and normal self in
>>> the same closure.
>>> 
>>> I'd like to keep this simple, meaning $self always does the above and
>>> nothing else. So, if you need an unowned self, you still need the
>>> original syntax; if your closure needs a non-Optional return type, you
>>> still need the original syntax; etc.
>>> 
>>> Thoughts?
>>> 
>>> -Paul

Re: [swift-evolution] $self

2016-09-28 Thread Jay Abbott via swift-evolution
I think Sean's idea for [required refName] is the better one in terms of syntax
and clarity about what's going on. It's fairly clear that the required refs are
weak but become strong during the closure's execution, and that, since they're
'required', the closure goes away if they do.

In practice, with lazy zeroing, I think the closure ref could only be zeroed when
calling it is attempted and taking a strong ref on its required captures fails.
One for the 'deferred' pile I guess :P
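
One rough approximation in today's Swift (a hypothetical helper; it can only turn the stored closure into a no-op once its required capture is gone, it cannot make other references to the closure become nil):

// Hypothetical helper: approximates a "required" capture by making the
// returned closure do nothing once `object` has been deallocated.
func requiring<T: AnyObject>(_ object: T, _ body: @escaping (T) -> Void) -> () -> Void {
    return { [weak object] in
        guard let object = object else { return }
        body(object)
    }
}

// Usage, with the loginForm example from this thread:
// loginForm.onSubmit = requiring(self) { vc in
//     vc.startLoginRequest(email: vc.loginForm.email.text, pwd: vc.loginForm.pwd.text)
// }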

On Thu, 29 Sep 2016 at 01:27 Chris Lattner  wrote:

> On Sep 28, 2016, at 4:42 PM, Jay Abbott via swift-evolution <
> swift-evolution@swift.org> wrote:
>
> It could potentially be a breaking change if the default for @escaping
> closures were made to be weak-capturing.
>
>
> Ok, but source breaking changes need extreme justification.  A primary
> goal of Swift 3 was to provide source compatibility going forward.
>
> -Chris
>
>
>
> Since the weak-capturing pattern is only really desirable for @escaping
> closures, and (I think) it would be the usual preference, could @escaping
> also imply weak-capturing for all references (not just self)? Then there
> would be another syntax for strong-capturing-escaping closures.
> Non-escaping closures could a) strongly capture references; or b) existing
> strong references stay strong and weak ones stay weak, meaning no
> ref-counts need to change at all when passing them.
>
>
> On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution <
> swift-evolution@swift.org> wrote:
>
>> So previously there were a few threads on the "strong self/weak self
>> dance" but they didn't seem to get anywhere. For instance:
>>
>>
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
>>
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
>>
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
>>
>> ...and possibly others.
>>
>> I'd like to propose something even easier (and more specific) than all
>> of the above discussions. Specifically, I'd like to introduce a new
>> automagic closure variable, $self, whose presence in a closure would
>> cause that closure to weakly capture self in a safe manner.
>>
>> As a concrete example, let's imagine a UIViewController for a login
>> form. Under this proposal, the following code:
>>
>> func viewDidLoad() {
>> self.loginForm.onSubmit = {
>>  let f = $self.loginForm
>>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
>> }
>> }
>>
>> ...would be treated by the compiler as equivalent to:
>>
>> func viewDidLoad() {
>> self.loginForm.onSubmit = {
>>  [weak self] in
>>  if let selfie = self {
>>  let f = selfie.loginForm
>>  selfie.startLoginRequest(email:f.email.text,
>>  pwd:f.pwd.text)
>>  }
>> }
>> }
>>
>> Note the "if let" there: If self no longer exists, the closure does not
>> execute at all, but if self does exist, then it exists for the entirety
>> of the execution of the closure (ie, self won't vanish as a side-effect
>> of some statement in the closure.) I think these semantics obey the
>> principle of least surprise; $self can be treated by the developer as a
>> strong reference.
>>
>> However, that does mean that $self can only be used in a closure that's
>> (a) Void or (b) Optional. In the latter case, returning nil when self
>> doesn't exist seems like reasonable/expected behavior.
>>
>> It would be a compile-time error to use both $self and normal self in
>> the same closure.
>>
>> I'd like to keep this simple, meaning $self always does the above and
>> nothing else. So, if you need an unowned self, you still need the
>> original syntax; if your closure needs a non-Optional return type, you
>> still need the original syntax; etc.
>>
>> Thoughts?
>>
>> -Paul


Re: [swift-evolution] $self

2016-09-28 Thread Chris Lattner via swift-evolution

> On Sep 28, 2016, at 4:42 PM, Jay Abbott via swift-evolution 
>  wrote:
> 
> It could potentially be a breaking change if the default for @escaping 
> closures were made to be weak-capturing.

Ok, but source breaking changes need extreme justification.  A primary goal of 
Swift 3 was to provide source compatibility going forward.

-Chris


> 
> Since the weak-capturing pattern is only really desirable for @escaping 
> closures, and (I think) it would be the usual preference, could @escaping 
> also imply weak-capturing for all references (not just self)? Then there 
> would be another syntax for strong-capturing-escaping closures. Non-escaping 
> closures could a) strongly capture references; or b) existing strong 
> references stay strong and weak ones stay weak, meaning no ref-counts need to 
> change at all when passing them.
> 
> 
> On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution 
> > wrote:
> So previously there were a few threads on the "strong self/weak self
> dance" but they didn't seem to get anywhere. For instance:
> 
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
>  
> 
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
>  
> 
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
>  
> 
> 
> ...and possibly others.
> 
> I'd like to propose something even easier (and more specific) than all
> of the above discussions. Specifically, I'd like to introduce a new
> automagic closure variable, $self, whose presence in a closure would
> cause that closure to weakly capture self in a safe manner.
> 
> As a concrete example, let's imagine a UIViewController for a login
> form. Under this proposal, the following code:
> 
> func viewDidLoad() {
> self.loginForm.onSubmit = {
>  let f = $self.loginForm
>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
> }
> }
> 
> ...would be treated by the compiler as equivalent to:
> 
> func viewDidLoad() {
> self.loginForm.onSubmit = {
>  [weak self] in
>  if let selfie = self {
>  let f = selfie.loginForm
>  selfie.startLoginRequest(email:f.email.text,
>  pwd:f.pwd.text)
>  }
> }
> }
> 
> Note the "if let" there: If self no longer exists, the closure does not
> execute at all, but if self does exist, then it exists for the entirety
> of the execution of the closure (ie, self won't vanish as a side-effect
> of some statement in the closure.) I think these semantics obey the
> principle of least surprise; $self can be treated by the developer as a
> strong reference.
> 
> However, that does mean that $self can only be used in a closure that's
> (a) Void or (b) Optional. In the latter case, returning nil when self
> doesn't exist seems like reasonable/expected behavior.
> 
> It would be a compile-time error to use both $self and normal self in
> the same closure.
> 
> I'd like to keep this simple, meaning $self always does the above and
> nothing else. So, if you need an unowned self, you still need the
> original syntax; if your closure needs a non-Optional return type, you
> still need the original syntax; etc.
> 
> Thoughts?
> 
> -Paul


Re: [swift-evolution] $self

2016-09-28 Thread Sean Heber via swift-evolution
Pretty sure this is all way out of scope, but something about this made me think 
of an idea (which maybe isn't unique, or maybe is even unworkable): imagine adding 
a new capture type such as "required" (for lack of a better name right now). It 
works similarly to unowned, but it makes the entire closure "weak" in such a way 
that the moment any of the required captures goes nil, any references to that 
closure instance also effectively become nil.

So modifying the example:

func viewDidLoad() {
    self.loginForm.onSubmit = { [required self] in
        let f = self.loginForm
        self.startLoginRequest(email: f.email.text, pwd: f.pwd.text)
    }
}

So in this case, "required self" means self is effectively "unowned", but any 
references to this closure would have to be weak optionals, e.g. weak var 
onSubmit: (() -> Void)?, so that when the view controller gets deallocated, the 
closure goes with it and references to it become nil.
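
Closure types are not class types, so a weak property of closure type isn't expressible in today's Swift; boxing the closure in a class is one rough way to sketch the lifetime behaviour described above (the Handler, LoginForm, and LoginViewController names below are illustrative):

final class Handler {
    let invoke: () -> Void
    init(_ invoke: @escaping () -> Void) { self.invoke = invoke }
}

final class LoginForm {
    weak var onSubmit: Handler?          // the form only weakly references its handler
}

final class LoginViewController {
    let loginForm = LoginForm()
    private var submitHandler: Handler?  // the controller owns the handler's lifetime

    func viewDidLoad() {
        let handler = Handler { [unowned self] in self.startLoginRequest() }
        submitHandler = handler
        loginForm.onSubmit = handler
        // When this controller is deallocated, submitHandler goes away and
        // loginForm.onSubmit automatically becomes nil.
    }

    func startLoginRequest() {}
}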

l8r
Sean

Sent from my iPad

> On Sep 28, 2016, at 6:42 PM, Jay Abbott via swift-evolution 
>  wrote:
> 
> It could potentially be a breaking change if the default for @escaping 
> closures were made to be weak-capturing.
> 
> Since the weak-capturing pattern is only really desirable for @escaping 
> closures, and (I think) it would be the usual preference, could @escaping 
> also imply weak-capturing for all references (not just self)? Then there 
> would be another syntax for strong-capturing-escaping closures. Non-escaping 
> closures could a) strongly capture references; or b) existing strong 
> references stay strong and weak ones stay weak, meaning no ref-counts need to 
> change at all when passing them.
> 
> 
>> On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution 
>>  wrote:
>> So previously there were a few threads on the "strong self/weak self
>> dance" but they didn't seem to get anywhere. For instance:
>> 
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
>> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
>> 
>> ...and possibly others.
>> 
>> I'd like to propose something even easier (and more specific) than all
>> of the above discussions. Specifically, I'd like to introduce a new
>> automagic closure variable, $self, whose presence in a closure would
>> cause that closure to weakly capture self in a safe manner.
>> 
>> As a concrete example, let's imagine a UIViewController for a login
>> form. Under this proposal, the following code:
>> 
>> func viewDidLoad() {
>> self.loginForm.onSubmit = {
>>  let f = $self.loginForm
>>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
>> }
>> }
>> 
>> ...would be treated by the compiler as equivalent to:
>> 
>> func viewDidLoad() {
>> self.loginForm.onSubmit = {
>>  [weak self] in
>>  if let selfie = self {
>>  let f = selfie.loginForm
>>  selfie.startLoginRequest(email:f.email.text,
>>  pwd:f.pwd.text)
>>  }
>> }
>> }
>> 
>> Note the "if let" there: If self no longer exists, the closure does not
>> execute at all, but if self does exist, then it exists for the entirety
>> of the execution of the closure (ie, self won't vanish as a side-effect
>> of some statement in the closure.) I think these semantics obey the
>> principle of least surprise; $self can be treated by the developer as a
>> strong reference.
>> 
>> However, that does mean that $self can only be used in a closure that's
>> (a) Void or (b) Optional. In the latter case, returning nil when self
>> doesn't exist seems like reasonable/expected behavior.
>> 
>> It would be a compile-time error to use both $self and normal self in
>> the same closure.
>> 
>> I'd like to keep this simple, meaning $self always does the above and
>> nothing else. So, if you need an unowned self, you still need the
>> original syntax; if your closure needs a non-Optional return type, you
>> still need the original syntax; etc.
>> 
>> Thoughts?
>> 
>> -Paul


Re: [swift-evolution] $self

2016-09-28 Thread Jay Abbott via swift-evolution
It could potentially be a breaking change if the default for @escaping
closures were made to be weak-capturing.

Since the weak-capturing pattern is only really desirable for @escaping
closures, and (I think) it would be the usual preference, could @escaping
also imply weak-capturing for all references (not just self)? Then there
would be another syntax for strong-capturing-escaping closures.
Non-escaping closures could a) strongly capture references; or b) existing
strong references stay strong and weak ones stay weak, meaning no
ref-counts need to change at all when passing them.
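
For comparison, here is what the proposed default would mean, written out explicitly in today's syntax (a fragment; self is the view controller from the earlier example and the analytics property is purely illustrative):

loginForm.onSubmit = { [weak self, weak analytics = self.analytics] in
    guard let strongSelf = self else { return }
    analytics?.recordSubmit()
    strongSelf.startLoginRequest(email: strongSelf.loginForm.email.text,
                                 pwd: strongSelf.loginForm.pwd.text)
}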


On Thu, 29 Sep 2016 at 00:06 Paul Jack via swift-evolution <
swift-evolution@swift.org> wrote:

> So previously there were a few threads on the "strong self/weak self
> dance" but they didn't seem to get anywhere. For instance:
>
>
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
>
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
>
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
>
> ...and possibly others.
>
> I'd like to propose something even easier (and more specific) than all
> of the above discussions. Specifically, I'd like to introduce a new
> automagic closure variable, $self, whose presence in a closure would
> cause that closure to weakly capture self in a safe manner.
>
> As a concrete example, let's imagine a UIViewController for a login
> form. Under this proposal, the following code:
>
> func viewDidLoad() {
> self.loginForm.onSubmit = {
>  let f = $self.loginForm
>  $self.startLoginRequest(email:f.email.text, pwd:f.pwd.text)
> }
> }
>
> ...would be treated by the compiler as equivalent to:
>
> func viewDidLoad() {
> self.loginForm.onSubmit = {
>  [weak self] in
>  if let selfie = self {
>  let f = selfie.loginForm
>  selfie.startLoginRequest(email:f.email.text,
>  pwd:f.pwd.text)
>  }
> }
> }
>
> Note the "if let" there: If self no longer exists, the closure does not
> execute at all, but if self does exist, then it exists for the entirety
> of the execution of the closure (ie, self won't vanish as a side-effect
> of some statement in the closure.) I think these semantics obey the
> principle of least surprise; $self can be treated by the developer as a
> strong reference.
>
> However, that does mean that $self can only be used in a closure that's
> (a) Void or (b) Optional. In the latter case, returning nil when self
> doesn't exist seems like reasonable/expected behavior.
>
> It would be a compile-time error to use both $self and normal self in
> the same closure.
>
> I'd like to keep this simple, meaning $self always does the above and
> nothing else. So, if you need an unowned self, you still need the
> original syntax; if your closure needs a non-Optional return type, you
> still need the original syntax; etc.
>
> Thoughts?
>
> -Paul


Re: [swift-evolution] $self

2016-09-28 Thread Chris Lattner via swift-evolution

> On Sep 28, 2016, at 4:05 PM, Paul Jack via swift-evolution 
>  wrote:
> 
> So previously there were a few threads on the "strong self/weak self
> dance" but they didn't seem to get anywhere. For instance:
> 
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160201/008713.html
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160215/010759.html
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160208/009972.html
> 
> ...and possibly others.
> 
> I'd like to propose something even easier (and more specific) than all
> of the above discussions. Specifically, I'd like to introduce a new
> automagic closure variable, $self, whose presence in a closure would
> cause that closure to weakly capture self in a safe manner.

This is an additive proposal, so we’d prefer for you to hold off until we get the 
core work for Swift 4 stage 1 under control.  We need to focus on making sure ABI 
and source stability take priority.

-Chris



Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Developer via swift-evolution
That doesn't look like a variance issue to me; that's about the same "information" 
invariant I talked about before.  The former works because self resolves to an 
invariant type, the type of the implementing structure, which satisfies the 
requirement Self introduces.  The latter does not because Self indicates a level 
of specificity that C cannot guarantee.  Self is magic, but it is also implemented 
as a generic parameter.  So think of it this way:

protocol Q {
  func bar<T>() -> T { return Q() }
}

You wouldn't expect that to compile, would you?

~Robert Widmann

2015/12/28 12:49, Matthew Johnson  wrote:

> 
>> On Dec 28, 2015, at 11:19 AM, Developer  wrote:
>> 
>> My understanding of Self is that it is a special generic parameter resolved 
>> by the type system to the type of the implementing structure.  That 
>> resolution must be invariant because the implementing structure (here, 
>> non-final classes) can choose to yank the protocol's invariants out from 
>> under you when it is subclassed.  Sure, retroactively, you can make things 
>> conform, but you also can't completely guarantee type safety with any kind 
>> of variance in Self in all cases. 
>> 
>> On the other hand, using the protocol itself in either position says that 
>> you only wish to restrict yourself to the protocol itself, not some specific 
>> implementation.  You are necessarily specifying an upper bound (here C) on 
>> the amount of "information" you can get out of the type, so it is possible 
>> to introduce variance because you will never violate the protocol's 
>> invariants by returning a subtype with a legal conformance.
>> 
>> Self doesn't mean two different things, your protocol declarations do!
> 
> My mind must be a little bit foggy this morning.  This works:
> 
> extension C: Q {
> func bar() -> Self { return self }
> }
> 
> What doesn’t work, regardless of whether C is final or not, is this:
> 
> extension C: Q {
> // Cannot convert return expression of type ‘C’ to return type ‘Self'
> func bar() -> Self { return C() }
> }
> 
> In order for classes to meet a protocol requirement with Self in the return 
> position you must specify Self (rather than the conforming type) as the 
> return type for the method.  Self in the return position of a method is 
> treated as covariant.
> 
> In order for classes to meet a protocol requirement with Self in parameter 
> position you must specify the type of the conforming class (you cannot 
> specify Self in an argument position).  Obviously the type of the conforming 
> class is invariant.
> 
> This is the sense in which Self in protocol declarations is inconsistent.  
> The requirements on conforming types are different - invariance for Self 
> parameters and covariance for Self return types.
> 
> IMO it would be much more clear if this distinction was explicit rather than 
> implicit based on the location of Self.  It would also be extremely useful in 
> some cases to be able to specify an invariant `ConformingSelf` return type.
> 
> 
>> 
>> ~Robert Widmann
>> 
>> 2015/12/28 11:49, Matthew Johnson via swift-evolution  wrote:
>> 
>>> I have brought up the idea of a non-covarying Self a few times.  
>>> 
>>> I was surprised to realize that Self is actually non-covarying when used 
>>> for parameters in protocol declarations!
>>> 
>>> Here is an example demonstrating this:
>>> 
>>> protocol P {
>>>func foo(s: Self)
>>> }
>>> protocol Q {
>>>func bar() -> Self
>>> }
>>> 
>>> class C: P {
>>>// this works!  Self as an argument type in the protocol declaration 
>>> does not covary
>>>func foo(c: C) {}
>>> }
>>> 
>>> class D: C {}
>>> 
>>> extension C: Q {
>>>// method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform 
>>> to protocol ‘Q'
>>>func bar() -> C { return self } 
>>> }
>>> 
>>> 
>>> It doesn’t make sense to allow a co-varying Self for parameters so I can 
>>> understand how the current state might have arisen.  At the same time, 
>>> using Self to mean two different things is inconsistent, confusing and it 
>>> doesn’t allow us to specify a non-covarying Self as a return type in 
>>> protocol requirements.  
>>> 
>>> As I have pointed out before, the ability to specify a non-covarying Self 
>>> as a return type would make it possible to design a protocol that can be 
>>> retroactively conformed to by non-final classes (such as those in Apple’s 
>>> frameworks).
>>> 
>>> I think it would be a very good idea to introduce a non-covarying Self 
>>> which would specify the type that adds conformance to the protocol and 
>>> require this Self to be used in places where covariance is not possible, 
>>> such as parameter types.  It would also be allowed elsewhere, such as 
>>> return types, making it easier to conform non-final classes when covariance 
>>> is not required by the protocol.
>>> 
>>> One possible name is `ConformingSelf`.  One thing I like about this name is 
>>> that it makes it very clear that it is the type that introduces protocol 
>>> conformance.

Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Matthew Johnson via swift-evolution

> On Dec 28, 2015, at 11:19 AM, Developer  wrote:
> 
> My understanding of Self is that it is a special generic parameter resolved 
> by the type system to the type of the implementing structure.  That 
> resolution must be invariant because the implementing structure (here, 
> non-final classes) can choose to yank the protocol's invariants out from 
> under you when it is subclassed.  Sure, retroactively, you can make things 
> conform, but you also can't completely guarantee type safety with any kind of 
> variance in Self in all cases. 
> 
> On the other hand, using the protocol itself in either position says that you 
> only wish to restrict yourself to the protocol itself, not some specific 
> implementation.  You are necessarily specifying an upper bound (here C) on 
> the amount of "information" you can get out of the type, so it is possible to 
> introduce variance because you will never violate the protocol's invariants 
> by returning a subtype with a legal conformance.
> 
> Self doesn't mean two different things, your protocol declarations do!

My mind must be a little bit foggy this morning.  This works:

extension C: Q {
    func bar() -> Self { return self }
}

What doesn’t work, regardless of whether C is final or not, is this:

extension C: Q {
    // Cannot convert return expression of type ‘C’ to return type ‘Self’
    func bar() -> Self { return C() }
}

In order for classes to meet a protocol requirement with Self in the return 
position you must specify Self (rather than the conforming type) as the return 
type for the method.  Self in the return position of a method is treated as 
covariant.

In order for classes to meet a protocol requirement with Self in parameter 
position you must specify the type of the conforming class (you cannot specify 
Self in an argument position).  Obviously the type of the conforming class is 
invariant.

This is the sense in which Self in protocol declarations is inconsistent.  The 
requirements on conforming types are different - invariance for Self parameters 
and covariance for Self return types.

IMO it would be much more clear if this distinction was explicit rather than 
implicit based on the location of Self.  It would also be extremely useful in 
some cases to be able to specify an invariant `ConformingSelf` return type.
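
For reference, the way a non-final class can satisfy a covariant Self-returning requirement today while creating a new instance is via a required initializer and the dynamic type (a small sketch mirroring the Q protocol above):

protocol Q {
    func bar() -> Self
}

class C: Q {
    required init() {}
    func bar() -> Self { return type(of: self).init() }
}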


> 
> ~Robert Widmann
> 
> 2015/12/28 11:49, Matthew Johnson via swift-evolution  wrote:
> 
>> I have brought up the idea of a non-covarying Self a few times.  
>> 
>> I was surprised to realize that Self is actually non-covarying when used for 
>> parameters in protocol declarations!
>> 
>> Here is an example demonstrating this:
>> 
>> protocol P {
>>func foo(s: Self)
>> }
>> protocol Q {
>>func bar() -> Self
>> }
>> 
>> class C: P {
>>// this works!  Self as an argument type in the protocol declaration does 
>> not covary
>>func foo(c: C) {}
>> }
>> 
>> class D: C {}
>> 
>> extension C: Q {
>>// method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform to 
>> protocol ‘Q'
>>func bar() -> C { return self } 
>> }
>> 
>> 
>> It doesn’t make sense to allow a co-varying Self for parameters so I can 
>> understand how the current state might have arisen.  At the same time, using 
>> Self to mean two different things is inconsistent, confusing and it doesn’t 
>> allow us to specify a non-covarying Self as a return type in protocol 
>> requirements.  
>> 
>> As I have pointed out before, the ability to specify a non-covarying Self as 
>> a return type would make it possible to design a protocol that can be 
>> retroactively conformed to by non-final classes (such as those in Apple’s 
>> frameworks).
>> 
>> I think it would be a very good idea to introduce a non-covarying Self which 
>> would specify the type that adds conformance to the protocol and require 
>> this Self to be used in places where covariance is not possible, such as 
>> parameter types.  It would also be allowed elsewhere, such as return types, 
>> making it easier to conform non-final classes when covariance is not 
>> required by the protocol.
>> 
>> One possible name is `ConformingSelf`.  One thing I like about this name is 
>> that it makes it very clear that it is the type that introduces protocol 
>> conformance.
>> 
>> I’m interested in hearing thoughts on this.
>> 
>> Matthew


Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Joe Groff via swift-evolution

> On Dec 28, 2015, at 8:49 AM, Matthew Johnson via swift-evolution 
>  wrote:
> 
> I have brought up the idea of a non-covarying Self a few times.  
> 
> I was surprised to realize that Self is actually non-covarying when used for 
> parameters in protocol declarations!
> 
> Here is an example demonstrating this:
> 
> protocol P {
>func foo(s: Self)
> }
> protocol Q {
>func bar() -> Self
> }
> 
> class C: P {
>// this works!  Self as an argument type in the protocol declaration does 
> not covary
>func foo(c: C) {}
> }
> 
> class D: C {}
> 
> extension C: Q {
>// method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform to 
> protocol ‘Q'
>func bar() -> C { return self } 
> }
> 
> 
> It doesn’t make sense to allow a co-varying Self for parameters so I can 
> understand how the current state might have arisen.  At the same time, using 
> Self to mean two different things is inconsistent, confusing and it doesn’t 
> allow us to specify a non-covarying Self as a return type in protocol 
> requirements.  
> 
> As I have pointed out before, the ability to specify a non-covarying Self as 
> a return type would make it possible to design a protocol that can be 
> retroactively conformed to by non-final classes (such as those in Apple’s 
> frameworks).
> 
> I think it would be a very good idea to introduce a non-covarying Self which 
> would specify the type that adds conformance to the protocol and require this 
> Self to be used in places where covariance is not possible, such as parameter 
> types.  It would also be allowed elsewhere, such as return types, making it 
> easier to conform non-final classes when covariance is not required by the 
> protocol.
> 
> One possible name is `ConformingSelf`.  One thing I like about this name is 
> that it makes it very clear that it is the type that introduces protocol 
> conformance.
> 
> I’m interested in hearing thoughts on this.

I proposed this a while back — see the earlier "controlling protocol conformance 
inheritance" thread. The current rules preserve inheritability of 
the protocol conformance in all cases. `Self` in argument positions maps to the 
root class of the conformance, since a method taking `Base` can also take any 
`Derived` class and thereby satisfy the conformance for `Derived`. In return 
positions, the derived type must be produced. I think there's value in 
controlling this behavior, but the control belongs on the conformer's side, not 
the protocol's. For some class hierarchies, the subclasses are intended to be 
API themselves in order to extend behavior. Every UIView subclass is 
interesting independently, for example, and ought to satisfy `NSCoding` and 
other requirements independently. In other class hierarchies, the base class is 
intended to be the common API, and subclasses are just implementation details. 
NSString, NSURL, etc. exemplify this—for most purposes the common `NSCoding` 
implementation is sufficient for all NSStrings. Likewise, you probably want 
NSURL to conform to `StringLiteralConvertible` but don't particularly care what 
subclass you get out of the deal. I had proposed the idea of modifying a 
class's conformance declaration to allow it control whether the conformance is 
inherited:

extension NSString: static NSCoding { } // NSCoding conformance statically applies to only NSString, Self == NSString
extension UIView: required NSCoding { } // NSCoding conformance is required of all subclasses, Self <= UIView
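
A small sketch of the current rule in today's Swift, using illustrative names:

protocol P {
    func foo(s: Self)
}

class Base: P {
    // Satisfies P for Base and for every subclass: a method taking Base
    // also accepts any Derived instance, so Self maps to the root class here.
    func foo(s: Base) {}
}

class Derived: Base {}   // inherits the conformance to P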

-Joe


Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Developer via swift-evolution
My understanding of Self is that it is a special generic parameter resolved by 
the type system to the type of the implementing structure.  That resolution 
must be invariant because the implementing structure (here, non-final classes) 
can choose to yank the protocol's invariants out from under you when it is 
subclassed.  Sure, retroactively, you can make things conform, but you also 
can't completely guarantee type safety with any kind of variance in Self in all 
cases. 

On the other hand, using the protocol itself in either position says that you 
only wish to restrict yourself to the protocol itself, not some specific 
implementation.  You are necessarily specifying an upper bound (here C) on the 
amount of "information" you can get out of the type, so it is possible to 
introduce variance because you will never violate the protocol's invariants by 
returning a subtype with a legal conformance.

Self doesn't mean two different things, your protocol declarations do!

~Robert Widmann

2015/12/28 11:49, Matthew Johnson via swift-evolution  wrote:

> I have brought up the idea of a non-covarying Self a few times.  
> 
> I was surprised to realize that Self is actually non-covarying when used for 
> parameters in protocol declarations!
> 
> Here is an example demonstrating this:
> 
> protocol P {
>func foo(s: Self)
> }
> protocol Q {
>func bar() -> Self
> }
> 
> class C: P {
>// this works!  Self as an argument type in the protocol declaration does 
> not covary
>func foo(c: C) {}
> }
> 
> class D: C {}
> 
> extension C: Q {
>// method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform to 
> protocol ‘Q'
>func bar() -> C { return self } 
> }
> 
> 
> It doesn’t make sense to allow a co-varying Self for parameters so I can 
> understand how the current state might have arisen.  At the same time, using 
> Self to mean two different things is inconsistent, confusing and it doesn’t 
> allow us to specify a non-covarying Self as a return type in protocol 
> requirements.  
> 
> As I have pointed out before, the ability to specify a non-covarying Self as 
> a return type would make it possible to design a protocol that can be 
> retroactively conformed to by non-final classes (such as those in Apple’s 
> frameworks).
> 
> I think it would be a very good idea to introduce a non-covarying Self which 
> would specify the type that adds conformance to the protocol and require this 
> Self to be used in places where covariance is not possible, such as parameter 
> types.  It would also be allowed elsewhere, such as return types, making it 
> easier to conform non-final classes when covariance is not required by the 
> protocol.
> 
> One possible name is `ConformingSelf`.  One thing I like about this name is 
> that it makes it very clear that it is the type that introduces protocol 
> conformance.
> 
> I’m interested in hearing thoughts on this.
> 
> Matthew


Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Matthew Johnson via swift-evolution

> On Dec 28, 2015, at 12:04 PM, Developer  wrote:
> 
> That doesn't look like a variance issue to me, that's about the same 
> "information" invariant I talked about before.  The former works because self 
> resolves to an invariant type, the type of the implementing structure, which 
> satisfies the requirement Self introduces.  The latter does not because Self 
> indicates a level of specificity C cannot guarantee.  Self is magic, but it 
> is also implemented as a generic parameter.  So think of it this way:
> 
> protocol Q {
>   func bar<T>() -> T { return Q() }
> }
> 
> You wouldn't expect that to compile, would you?

It actually does work for structs and for final classes because Self becomes 
invariant when used in a return type position for them.

protocol Q {
    func bar() -> Self
}

final class C: Q {
    func bar() -> C { return C() }
}
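
The same works for a struct, where Self is likewise invariant (a small sketch continuing the example above):

struct S: Q {
    func bar() -> S { return S() }
}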


> 
> ~Robert Widmann
> 
> 2015/12/28 12:49, Matthew Johnson  wrote:
> 
>> 
>>> On Dec 28, 2015, at 11:19 AM, Developer >> > wrote:
>>> 
>>> My understanding of Self is that it is a special generic parameter resolved 
>>> by the type system to the type of the implementing structure.  That 
>>> resolution must be invariant because the implementing structure (here, 
>>> non-final classes) can choose to yank the protocol's invariants out from 
>>> under you when it is subclassed.  Sure, retroactively, you can make things 
>>> conform, but you also can't completely guarantee type safety with any kind 
>>> of variance in Self in all cases. 
>>> 
>>> On the other hand, using the protocol itself in either position says that 
>>> you only wish to restrict yourself to the protocol itself, not some 
>>> specific implementation.  You are necessarily specifying an upper bound 
>>> (here C) on the amount of "information" you can get out of the type, so it 
>>> is possible to introduce variance because you will never violate the 
>>> protocol's invariants by returning a subtype with a legal conformance.
>>> 
>>> Self doesn't mean two different things, your protocol declarations do!
>> 
>> My mind must be a little bit foggy this morning.  This works:
>> 
>> extension C: Q {
>> func bar() -> Self { return self }
>> }
>> 
>> What doesn’t work, regardless of whether C is final or not, is this:
>> 
>> extension C: Q {
>> // Cannot convert return expression of type ‘C’ to return type ‘Self'
>> func bar() -> Self { return C() }
>> }
>> 
>> In order for classes to meet a protocol requirement with Self in the return 
>> position you must specify Self (rather than the conforming type) as the 
>> return type for the method.  Self in the return position of a method is 
>> treated as covariant.
>> 
>> In order for classes to meet a protocol requirement with Self in parameter 
>> position you must specify the type of the conforming class (you cannot 
>> specify Self in an argument position).  Obviously the type of the conforming 
>> class is invariant.
>> 
>> This is the sense in which Self in protocol declarations is inconsistent.  
>> The requirements on conforming types are different - invariance for Self 
>> parameters and covariance for Self return types.
>> 
>> IMO it would be much more clear if this distinction was explicit rather than 
>> implicit based on the location of Self.  It would also be extremely useful 
>> in some cases to be able to specify an invariant `ConformingSelf` return 
>> type.
>> 
>> 
>>> 
>>> ~Robert Widmann
>>> 
>>> 2015/12/28 11:49, Matthew Johnson via swift-evolution  wrote:
>>> 
 I have brought up the idea of a non-covarying Self a few times.  
 
 I was surprised to realize that Self is actually non-covarying when used 
 for parameters in protocol declarations!
 
 Here is an example demonstrating this:
 
 protocol P {
func foo(s: Self)
 }
 protocol Q {
func bar() -> Self
 }
 
 class C: P {
// this works!  Self as an argument type in the protocol declaration 
 does not covary
func foo(c: C) {}
 }
 
 class D: C {}
 
 extension C: Q {
// method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform 
 to protocol ‘Q'
func bar() -> C { return self } 
 }
 
 
 It doesn’t make sense to allow a co-varying Self for parameters so I can 
 understand how the current state might have arisen.  At the same time, 
 using Self to mean two different things is inconsistent, confusing and it 
 doesn’t allow us to specify a non-covarying Self as a return type in 
 protocol requirements.  
 
 As I have pointed out before, the ability to specify a non-covarying Self 
 as a return type would make it possible to design a protocol that can be 
 retroactively conformed to by non-final classes (such as those in Apple’s frameworks).

Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Matthew Johnson via swift-evolution

> On Dec 28, 2015, at 12:43 PM, Joe Groff  wrote:
> 
>> 
>> On Dec 28, 2015, at 10:31 AM, Matthew Johnson > > wrote:
>> 
>>> 
>>> On Dec 28, 2015, at 12:02 PM, Joe Groff >> > wrote:
>>> 
 
 On Dec 28, 2015, at 8:49 AM, Matthew Johnson via swift-evolution 
 > wrote:
 
 I have brought up the idea of a non-covarying Self a few times.  
 
 I was surprised to realize that Self is actually non-covarying when used 
 for parameters in protocol declarations!
 
 Here is an example demonstrating this:
 
 protocol P {
   func foo(s: Self)
 }
 protocol Q {
   func bar() -> Self
 }
 
 class C: P {
   // this works!  Self as an argument type in the protocol declaration 
 does not covary
   func foo(c: C) {}
 }
 
 class D: C {}
 
 extension C: Q {
   // method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform 
 to protocol ‘Q'
   func bar() -> C { return self } 
 }
 
 
 It doesn’t make sense to allow a co-varying Self for parameters so I can 
 understand how the current state might have arisen.  At the same time, 
 using Self to mean two different things is inconsistent, confusing and it 
 doesn’t allow us to specify a non-covarying Self as a return type in 
 protocol requirements.  
 
 As I have pointed out before, the ability to specify a non-covarying Self 
 as a return type would make it possible to design a protocol that can be 
 retroactively conformed to by non-final classes (such as those in Apple’s 
 frameworks).
 
 I think it would be a very good idea to introduce a non-covarying Self 
 which would specify the type that adds conformance to the protocol and 
 require this Self to be used in places where covariance is not possible, 
 such as parameter types.  It would also be allowed elsewhere, such as 
 return types, making it easier to conform non-final classes when 
 covariance is not required by the protocol.
 
 One possible name is `ConformingSelf`.  One thing I like about this name 
 is that it makes it very clear that it is the type that introduces 
 protocol conformance.
 
 I’m interested in hearing thoughts on this.
>>> 
>>> I proposed this a while back — see "controlling protocol conformance 
>>> inheritance" from a while back. The current rules preserve inheritability 
>>> of the protocol conformance in all cases. `Self` in argument positions maps 
>>> to the root class of the conformance, since a method taking `Base` can also 
>>> take any `Derived` class and thereby satisfy the conformance for `Derived`. 
>>> In return positions, the derived type must be produced. I think there's 
>>> value in controlling this behavior, but the control belongs on the 
>>> conformer's side, not the protocol's. For some class hierarchies, the 
>>> subclasses are intended to be API themselves in order to extend behavior. 
>>> Every UIView subclass is interesting independently, for example, and ought 
>>> to satisfy `NSCoding` and other requirements independently. In other class 
>>> hierarchies, the base class is intended to be the common API, and 
>>> subclasses are just implementation details. NSString, NSURL, etc. exemplify 
>>> this—for most purposes the common `NSCoding` implementation is sufficient 
>>> for all NSStrings. Likewise, you probably want NSURL to conform to 
>>> `StringLiteralConvertible` but don't particularly care what subclass you 
>>> get out of the deal. I had proposed the idea of modifying a class's 
>>> conformance declaration to allow it control whether the conformance is 
>>> inherited:
>>> 
>>> extension NSString: static NSCoding { } // NSCoding conformance statically 
>>> applies to only NSString, Self == NSString
>>> extension UIView: required NSCoding { } // NSCoding conformance is required 
>>> of all subclasses, Self <= UIView
>>> 
>> 
>> If I understand correctly, what you’re saying is that you don’t think there 
>> is a reason why a protocol would want to specifically require invariance.  I 
>> would have to think about this some more but you may well be right.  The 
>> conforming-side solution would definitely work in every use case I know of.
>> 
>> I figured out how Self protocol requirements are actually treated 
>> consistently from a particular perspective.  Self is effectively treated as 
>> a covariant requirement, but becomes an invariant requirement when used in 
>> positions where covariance is not possible (parameters and return type for 
>> structs and final classes).  This makes sense and is consistent, if not 
>> totally obvious.  Is this reasonably accurate?
> 
> Yeah. It might help to think of `Self` within the protocol as referring to 
> the full range of types [BaseClass, Self] that are required to conform to the protocol.

Re: [swift-evolution] Self behaves inconsistently in protocol method signatures

2015-12-28 Thread Joe Groff via swift-evolution

> On Dec 28, 2015, at 10:31 AM, Matthew Johnson  wrote:
> 
>> 
>> On Dec 28, 2015, at 12:02 PM, Joe Groff > > wrote:
>> 
>>> 
>>> On Dec 28, 2015, at 8:49 AM, Matthew Johnson via swift-evolution 
>>> > wrote:
>>> 
>>> I have brought up the idea of a non-covarying Self a few times.  
>>> 
>>> I was surprised to realize that Self is actually non-covarying when used 
>>> for parameters in protocol declarations!
>>> 
>>> Here is an example demonstrating this:
>>> 
>>> protocol P {
>>>   func foo(s: Self)
>>> }
>>> protocol Q {
>>>   func bar() -> Self
>>> }
>>> 
>>> class C: P {
>>>   // this works!  Self as an argument type in the protocol declaration does 
>>> not covary
>>>   func foo(c: C) {}
>>> }
>>> 
>>> class D: C {}
>>> 
>>> extension C: Q {
>>>   // method ‘bar()’ in non-final class ‘C’ must return ‘Self’ to conform to 
>>> protocol ‘Q'
>>>   func bar() -> C { return self } 
>>> }
>>> 
>>> 
>>> It doesn’t make sense to allow a co-varying Self for parameters so I can 
>>> understand how the current state might have arisen.  At the same time, 
>>> using Self to mean two different things is inconsistent, confusing and it 
>>> doesn’t allow us to specify a non-covarying Self as a return type in 
>>> protocol requirements.  
>>> 
>>> As I have pointed out before, the ability to specify a non-covarying Self 
>>> as a return type would make it possible to design a protocol that can be 
>>> retroactively conformed to by non-final classes (such as those in Apple’s 
>>> frameworks).
>>> 
>>> I think it would be a very good idea to introduce a non-covarying Self 
>>> which would specify the type that adds conformance to the protocol and 
>>> require this Self to be used in places where covariance is not possible, 
>>> such as parameter types.  It would also be allowed elsewhere, such as 
>>> return types, making it easier to conform non-final classes when covariance 
>>> is not required by the protocol.
>>> 
>>> One possible name is `ConformingSelf`.  One thing I like about this name is 
>>> that it makes it very clear that it is the type that introduces protocol 
>>> conformance.
>>> 
>>> I’m interested in hearing thoughts on this.
>> 
>> I proposed this a while back — see "controlling protocol conformance 
>> inheritance" from a while back. The current rules preserve inheritability of 
>> the protocol conformance in all cases. `Self` in argument positions maps to 
>> the root class of the conformance, since a method taking `Base` can also 
>> take any `Derived` class and thereby satisfy the conformance for `Derived`. 
>> In return positions, the derived type must be produced. I think there's 
>> value in controlling this behavior, but the control belongs on the 
>> conformer's side, not the protocol's. For some class hierarchies, the 
>> subclasses are intended to be API themselves in order to extend behavior. 
>> Every UIView subclass is interesting independently, for example, and ought 
>> to satisfy `NSCoding` and other requirements independently. In other class 
>> hierarchies, the base class is intended to be the common API, and subclasses 
>> are just implementation details. NSString, NSURL, etc. exemplify this—for 
>> most purposes the common `NSCoding` implementation is sufficient for all 
>> NSStrings. Likewise, you probably want NSURL to conform to 
>> `StringLiteralConvertible` but don't particularly care what subclass you get 
>> out of the deal. I had proposed the idea of modifying a class's conformance 
>> declaration to allow it control whether the conformance is inherited:
>> 
>> extension NSString: static NSCoding { } // NSCoding conformance statically 
>> applies to only NSString, Self == NSString
>> extension UIView: required NSCoding { } // NSCoding conformance is required 
>> of all subclasses, Self <= UIView
>> 
> 
> If I understand correctly, what you’re saying is that you don’t think there 
> is a reason why a protocol would want to specifically require invariance.  I 
> would have to think about this some more but you may well be right.  The 
> conforming-side solution would definitely work in every use case I know of.
> 
> I figured out how Self protocol requirements are actually treated 
> consistently from a particular perspective.  Self is effectively treated as a 
> covariant requirement, but becomes an invariant requirement when used in 
> positions where covariance is not possible (parameters and return type for 
> structs and final classes).  This makes sense and is consistent, if not 
> totally obvious.  Is this reasonably accurate?

Yeah. It might help to think of `Self` within the protocol as referring to the 
full range of types [BaseClass, Self] that are required to conform to the 
protocol. Covariance and contravariance push the interval in opposite 
directions; in argument position, you need to cover the range by specifying its 
upper bound