Re: [swift-evolution] TrigonometricFloatingPoint/MathFloatingPoint protocol?

2017-08-04 Thread Taylor Swift via swift-evolution
There was a sign error due to switching to an underlying approximation of
sin() to evaluate cos(). Here’s the actual output


On Fri, Aug 4, 2017 at 4:56 PM, Taylor Swift  wrote:

> update:
>
> I’ve managed to improve the algorithm to the point where it’s arguably
> more accurate than Glibc.cos(_:), and runs just as fast. Removing one term
> makes the Swift implementation faster than _cos(_:), but worsens the
> divergence by about 23% (137 ULPs from 0° ..< 90°, as opposed to 111 ULPs).
>
> Relative time (lower is better)
>
> _cos(_:) intrinsic   : 3.096
> pure Swift implementation : 3.165
>
> Almost everywhere the pure Swift implementation is within ±1 ULP of the
> Glibc/llvm implementation. Adding more terms to the approximation actually
> worsens the divergence, so I guess we are in the range where we have to
> start talking about error in the Glibc implementation as well. Here’s an 
> output
> dump
> 
> with input from −360° to +360°.
>
> The _cos(_:) intrinsic seems to be asymmetric across the positive and
> negative halves of the function, which causes the divergence to rise to
> about 3–5 ULPs on the far side of the unit circle. This could be due to
> rounding differences in the arguments, since π/2 and 3π/2 are impossible to
> represent in floating point. However I don’t know which implementation is
> “wrong” here. The Swift one gives the “right” output for all special
> angles; i.e. cos(90°) == 0, cos(60°) == 0.5, etc., whereas _cos(_:) gives
> slightly fuzzy values.
>
> If anyone wants to try it, I put the cosine implementation in an actual
> module on github; and the given benchmark numbers are for cross-module
> calls. 
>
> On Thu, Aug 3, 2017 at 7:32 PM, Taylor Swift  wrote:
>
>>
>>
>> On Thu, Aug 3, 2017 at 7:12 PM, Karl Wagner via swift-evolution <
>> swift-evolution@swift.org> wrote:
>>
>>>
>>> On 3. Aug 2017, at 13:04, Stephen Canon via swift-evolution <
>>> swift-evolution@swift.org> wrote:
>>>
>>> On Aug 2, 2017, at 7:03 PM, Karl Wagner via swift-evolution <
>>> swift-evolution@swift.org> wrote:
>>>
>>>
>>> It’s important to remember that computers are mathematical machines, and
>>> some functions which are implemented in hardware on essentially every
>>> platform (like sin/cos/etc) are definitely best implemented as compiler
>>> intrinsics.
>>>
>>>
>>> sin/cos/etc are implemented in software, not hardware. x86 does have the
>>> FSIN/FCOS instructions, but (almost) no one actually uses them to implement
>>> the sin( ) and cos( ) functions; they are a legacy curiosity, both too slow
>>> and too inaccurate for serious use today. There are no analogous
>>> instructions on ARM or PPC.
>>>
>>> – Steve
>>> ___
>>> swift-evolution mailing list
>>> swift-evolution@swift.org
>>> https://lists.swift.org/mailman/listinfo/swift-evolution
>>>
>>>
>>> Hah that’s pretty cool; I think I learned in EE years ago that it was
>>> implemented with a lookup table inside the CPU and never bothered to
>>> question it.
>>>
>>> The pure-Swift cosine implementation looks cool.
>>>
>>
>> I’m pretty sure it can be improved greatly, at least for Double.
>> Unfortunately performance falls off a cliff for Float for some reason; I
>> don’t know why.
>>
>>>
>>> As for the larger discussion about a Swift maths library: in general,
>>> it’s hard for any new Swift-only package to get off the ground without a
>>> more comprehensive package manager. The current version doesn’t support
>>> most of the Swift projects being worked on every day. Swift is also still a
>>> relatively young language - the new integer protocols have never even
>>> shipped in a stable release. Considering where we are, it’s not really
>>> surprising that most of the Swift maths libraries are still a bit
>>> rudimentary; I expect they will naturally evolve and develop in time, the
>>> way open-source code does.
>>>
>>>
>> Most of the SPM’s limitations have workarounds; the problem is they’re just
>> not very convenient, e.g. local and non-git dependencies. As for other
>> features like gyb, I’m not sure it’s a good idea to bring them into the SPM.
>> gyb is a band-aid over deeper limitations of the language.
>>
>>
>>> It’s also worth considering that our excellent bridging with C removes
>>> some of the impetus to rewrite all your battle-tested maths code in Swift.
>>> The benefits are not obvious; the stage is set for pioneers to experiment
>>> and show the world why they should be writing their maths code in Swift.
>>>
>>>
>> The glibc/llvm functions are not generic. You cannot use _cos(_:) on a
>> protocol type like BinaryFloatingPoint. A pure Swift implementation
>> would allow generic programming with trig and other math functions; right
>> now anything beyond sqrt() requires manual specialization.

[swift-evolution] Reminder: Swift.org scheduled outages for bug reporting, mailing lists, website and CI

2017-08-04 Thread Nicole Jacque via swift-evolution
Just a reminder, the mailing list server and CI will be going offline in about 
an hour.

> On Aug 2, 2017, at 4:52 PM, Nicole Jacque  wrote:
> 
> Hello All-
> 
> We will have some downtime for swift.org resources over 
> the weekend as we upgrade our infrastructure.
> 
> The outage schedule will be as follows:
> bugs.swift.org will become unavailable starting at 9 
> PM Thursday, Aug 3 (Pacific) until the upgrade is completed on Saturday, Aug 5
> swift.org mailing lists (including this list) will 
> become unavailable starting at 3 PM Friday, Aug 4 until the upgrade is 
> completed on Saturday
> ci.swift.org and all CI infrastructure will become 
> unavailable starting at 3 PM Friday, Aug 4 until the upgrade is completed on 
> Saturday. We will also be locking the repos at this time until CI is back up.
> The swift.org website will be unavailable for a short 
> time on Saturday afternoon.
> 
> We expect the upgrade to be complete on Saturday afternoon or evening.  We 
> will send out email when the upgrade is complete.
> 
> Thanks,
> Nicole
> 

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] TrigonometricFloatingPoint/MathFloatingPoint protocol?

2017-08-04 Thread Taylor Swift via swift-evolution
update:

I’ve managed to improve the algorithm to the point where it’s arguably more
accurate than Glibc.cos(_:), and runs just as fast. Removing one term makes
the Swift implementation faster than _cos(_:), but worsens the divergence by
about 23% (137 ULPs from 0° ..< 90°, as opposed to 111 ULPs).

Relative time (lower is better)

_cos(_:) intrinsic   : 3.096
pure Swift implementation : 3.165

Almost everywhere the pure Swift implementation is within ±1 ULP of the
Glibc/llvm implementation. Adding more terms to the approximation actually
worsens the divergence, so I guess we are in the range where we have to
start talking about error in the Glibc implementation as well. Here’s an output
dump

with input from −360° to +360°.

The _cos(_:) intrinsic seems to be asymmetric across the positive and
negative halves of the function, which causes the divergence to rise to
about 3–5 ULPs on the far side of the unit circle. This could be due to
rounding differences in the arguments, since π/2 and 3π/2 are impossible to
represent in floating point. However I don’t know which implementation is
“wrong” here. The Swift one gives the “right” output for all special
angles; i.e. cos(90°) == 0, cos(60°) == 0.5, etc., whereas _cos(_:) gives
slightly fuzzy values.

If anyone wants to try it, I put the cosine implementation in an actual
module on github; and the given benchmark numbers are for cross-module
calls. 
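For readers who want to experiment with the same kind of measurement, here is a minimal sketch (not the actual module from this thread): a truncated Taylor polynomial for cos evaluated in Horner form, plus a bit-pattern ULP-distance helper of the sort used to produce the divergence figures above. The coefficient count and the function names are illustrative only.

```swift
import Foundation

// Hypothetical stand-in for the thread's implementation: a truncated
// Taylor polynomial for cos(x), evaluated in Horner form. A production
// version would use minimax coefficients and do range reduction first.
func cosApprox(_ x: Double) -> Double {
    let t = x * x
    // Coefficients (-1)^k / (2k)! for k = 0...7.
    let c: [Double] = [1, -1/2, 1/24, -1/720, 1/40_320,
                       -1/3_628_800, 1/479_001_600, -1/87_178_291_200]
    var r = c[c.count - 1]
    for k in stride(from: c.count - 2, through: 0, by: -1) {
        r = r * t + c[k]
    }
    return r
}

// ULP distance between two finite doubles of the same sign, computed
// from their bit patterns; this is the metric quoted in the thread.
func ulpDistance(_ a: Double, _ b: Double) -> UInt64 {
    let (ia, ib) = (a.bitPattern, b.bitPattern)
    return ia > ib ? ia - ib : ib - ia
}
```

Near zero the sketch agrees closely with the libm cos; without range reduction its error grows toward the edges of the period, which is exactly the kind of divergence the numbers above are measuring.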

On Thu, Aug 3, 2017 at 7:32 PM, Taylor Swift  wrote:

>
>
> On Thu, Aug 3, 2017 at 7:12 PM, Karl Wagner via swift-evolution <
> swift-evolution@swift.org> wrote:
>
>>
>> On 3. Aug 2017, at 13:04, Stephen Canon via swift-evolution <
>> swift-evolution@swift.org> wrote:
>>
>> On Aug 2, 2017, at 7:03 PM, Karl Wagner via swift-evolution <
>> swift-evolution@swift.org> wrote:
>>
>>
>> It’s important to remember that computers are mathematical machines, and
>> some functions which are implemented in hardware on essentially every
>> platform (like sin/cos/etc) are definitely best implemented as compiler
>> intrinsics.
>>
>>
>> sin/cos/etc are implemented in software, not hardware. x86 does have the
>> FSIN/FCOS instructions, but (almost) no one actually uses them to implement
>> the sin( ) and cos( ) functions; they are a legacy curiosity, both too slow
>> and too inaccurate for serious use today. There are no analogous
>> instructions on ARM or PPC.
>>
>> – Steve
>> ___
>> swift-evolution mailing list
>> swift-evolution@swift.org
>> https://lists.swift.org/mailman/listinfo/swift-evolution
>>
>>
>> Hah that’s pretty cool; I think I learned in EE years ago that it was
>> implemented with a lookup table inside the CPU and never bothered to
>> question it.
>>
>> The pure-Swift cosine implementation looks cool.
>>
>
> I’m pretty sure it can be improved greatly, at least for Double.
> Unfortunately performance falls off a cliff for Float for some reason; I
> don’t know why.
>
>>
>> As for the larger discussion about a Swift maths library: in general,
>> it’s hard for any new Swift-only package to get off the ground without a
>> more comprehensive package manager. The current version doesn’t support
>> most of the Swift projects being worked on every day. Swift is also still a
>> relatively young language - the new integer protocols have never even
>> shipped in a stable release. Considering where we are, it’s not really
>> surprising that most of the Swift maths libraries are still a bit
>> rudimentary; I expect they will naturally evolve and develop in time, the
>> way open-source code does.
>>
>>
> Most of the SPM’s limitations have workarounds; the problem is they’re just
> not very convenient, e.g. local and non-git dependencies. As for other
> features like gyb, I’m not sure it’s a good idea to bring them into the SPM.
> gyb is a band-aid over deeper limitations of the language.
>
>
>> It’s also worth considering that our excellent bridging with C removes
>> some of the impetus to rewrite all your battle-tested maths code in Swift.
>> The benefits are not obvious; the stage is set for pioneers to experiment
>> and show the world why they should be writing their maths code in Swift.
>>
>>
> The glibc/llvm functions are not generic. You cannot use _cos(_:) on a
> protocol type like BinaryFloatingPoint. A pure Swift implementation would
> allow generic programming with trig and other math functions; right now
> anything beyond sqrt() requires manual specialization.
>
>
>> - Karl
>>
>> ___
>> swift-evolution mailing list
>> swift-evolution@swift.org
>> https://lists.swift.org/mailman/listinfo/swift-evolution
>>
>>
>
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
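To make the generics point in the quoted exchange concrete, here is a hedged sketch of the kind of shim such code needs today. `Mathable`, `cosine`, and the conformances are hypothetical names invented for illustration, not a standard-library API:

```swift
import Foundation

// Glibc/llvm's cos is overloaded only for concrete types (Float, Double),
// so generic code over BinaryFloatingPoint can't call it directly. A
// protocol shim routes each concrete type to its own overload.
protocol Mathable: BinaryFloatingPoint {
    static func cosine(_ x: Self) -> Self
}

extension Double: Mathable {
    static func cosine(_ x: Double) -> Double { cos(x) }
}
extension Float: Mathable {
    static func cosine(_ x: Float) -> Float { cos(x) }
}

// Generic trig now works over any conforming type.
func sumOfCosines<T: Mathable>(_ xs: [T]) -> T {
    xs.reduce(0) { $0 + T.cosine($1) }
}
```

A pure-Swift generic implementation would remove the need for this per-type boilerplate entirely, which is the argument being made above.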


Re: [swift-evolution] [planning] [discussion] Schedule for return of closure parameter labels (+ world domination ramble)

2017-08-04 Thread Mathew Huusko V via swift-evolution
Ah, I see. I understood on a basic level that additive features were safe,
but I didn't/don't have the knowledge to judge when adding actually means
changing (e.g. idk, 'adding abstract classes' or 'adding optional protocol
methods' implying 'changing/breaking inheritance/dispatch' or something..).

Anyway, I didn't know that about C++ – now *that's* a reassuring benchmark.
Thanks! ;)

On Fri, Aug 4, 2017 at 8:46 PM, Chris Lattner  wrote:

>
> > On Aug 4, 2017, at 12:03 PM, Mathew Huusko V  wrote:
> >
> > Thanks for the swift response, it's an honour; I agree wholeheartedly
> with your logic and sentiment. Sorry if I was unclear, but my
> concern/curiosity is not for the speed of Swift's development, but in fact
> for its long term evolution and longevity. At risk of repeating
> myself/boring everyone, that concern manifests over two intermingling
> phenomena:
> > 1) in the evolution email/proposal archive, a well intentioned (towards
> -complexity and +quality) but sometimes blasé air around potential
> uses/requirements of the language (~"Swift won't support that because
> people probably wouldn't use/need it").
> > 2) the reality of the clock, or what I think/thought the reality was.
> Obviously I don't want Swift to evolve too fast, and don't think having any
> particular feature right now is worth risking that, but won't the ABI be
> stabilised eventually (Swift 5?) and then it will actually be too late for
> some features?
>
> No.  ABI stability is less of a bound on new things than it is a bound on
> the ability to change existing things.
>
> To take one random example, C++ has been ABI stable on the Mac since
> effectively 10.0 (or whatever release first shipped GCC 3).  That hasn’t
> impeded the ability to add tons of new stuff to C++. :-)
>
> -Chris
>
>
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Robert Bennett via swift-evolution
Sorry, not sure how I missed that 😑.

> On Aug 4, 2017, at 3:16 PM, Félix Cloutier  wrote:
> 
> 
>> On 4 Aug 2017, at 11:39, Robert Bennett wrote:
>> 
>>> That's not a concern with the `let` case that Robert brought up, since you 
>>> can't mutate a `let` array at all.
>>> 
>>> The big thing is that unconstrained escape analysis is uncomputable. Since 
>>> Swift array storage is COW, any function that receives the array as a 
>>> parameter is allowed to take a reference on its storage. If the storage 
>>> lives on the stack and that reference outlives the stack frame, you've got 
>>> a problem. It's the same problem that motivated @escaping for closures.
>>> 
>>> You could allow storage to be on the stack by forcing the user to make a 
>>> pessimistic copy, which is possibly not an improvement.
>> 
>> 
>> Good point. If this is only problematic when multiple threads are accessing 
>> an array, then it could still be worthwhile if all accesses are (provably) 
>> on the thread that created the array.
> 
> To be clear, it's a problem independently of multi-threading. `func foo() -> 
> [Int] { return [1, 2, 3] }` is the most basic representation of it: you can't 
> store the array in the stack frame if you return it. (To be fair, that one 
> would be caught by escape analysis of any quality.) `func foo() { bar([1, 2, 
> 3, 4]) }` is another example: if you don't know what `bar` does with the 
> array, you can't store it on the stack because it might pass it to an object 
> that lives on the heap and outlives `foo`, for instance. @escaping solves 
> that problem for closures by specifically annotating parameters when the 
> assigned closure could still be referenced after the called function returns.
> 
>> And pessimistic copying might still be worth it for arrays below a certain 
>> size — for instance, copying an Array of length 1 (and recall that the 
>> array in question is a constant so its size is known at compile time) would 
>> definitely be worth not having that array in heap memory.
> 
> You only need pessimistic copies when that copy escapes, and since it 
> escapes, it needs to live on the heap by definition. Right now an array of 
> size 1 passed to 4 objects on the heap has one single backing representation. 
> With pessimistic copies, you'd get 4 times that buffer of size 1, which is 
> definitely not an improvement.
> 
>> Going back to the literal notion of FSAs — fixed size arrays not necessarily 
>> on the stack — I think that simply copying Array’s implementation sans 
>> RangeReplaceableCollection conformance is not a bad way to go. Any 
>> optimizations used for `let` Arrays could probably be applied to this type 
>> of FSA.
> 
> We're actually splitting this in multiple directions. John talks of 
> variable-sized arrays necessarily on the stack. :)
> 
> I see two useful characteristics to fixed-size arrays, which, 
> uncoincidentally, are what it takes to use them for C interop:
> 
> - Their storage is inlined into their container (whether it be the stack or 
> an object)
> - Their length is part of their type (and not directly included with the 
> data itself)
> 
> This is also enough to implement fixed-size arrays not necessarily on the 
> stack, mind you.
> 
> Félix
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [planning] [discussion] Schedule for return of closure parameter labels (+ world domination ramble)

2017-08-04 Thread Chris Lattner via swift-evolution

> On Aug 4, 2017, at 12:03 PM, Mathew Huusko V  wrote:
> 
> Thanks for the swift response, it's an honour; I agree wholeheartedly with 
> your logic and sentiment. Sorry if I was unclear, but my concern/curiosity is 
> not for the speed of Swift's development, but in fact for its long term 
> evolution and longevity. At risk of repeating myself/boring everyone, that 
> concern manifests over two intermingling phenomena:
> 1) in the evolution email/proposal archive, a well intentioned (towards 
> -complexity and +quality) but sometimes blasé air around potential 
> uses/requirements of the language (~"Swift won't support that because people 
> probably wouldn't use/need it").
> 2) the reality of the clock, or what I think/thought the reality was. 
> Obviously I don't want Swift to evolve too fast, and don't think having any 
> particular feature right now is worth risking that, but won't the ABI be 
> stabilised eventually (Swift 5?) and then it will actually be too late for 
> some features?

No.  ABI stability is less of a bound on new things than it is a bound on the 
ability to change existing things.

To take one random example, C++ has been ABI stable on the Mac since 
effectively 10.0 (or whatever release first shipped GCC 3).  That hasn’t 
impeded the ability to add tons of new stuff to C++. :-)

-Chris

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Félix Cloutier via swift-evolution

> On 4 Aug 2017, at 11:39, Robert Bennett wrote:
> 
>> That's not a concern with the `let` case that Robert brought up, since you 
>> can't mutate a `let` array at all.
>> 
>> The big thing is that unconstrained escape analysis is uncomputable. Since 
>> Swift array storage is COW, any function that receives the array as a 
>> parameter is allowed to take a reference on its storage. If the storage 
>> lives on the stack and that reference outlives the stack frame, you've got a 
>> problem. It's the same problem that motivated @escaping for closures.
>> 
>> You could allow storage to be on the stack by forcing the user to make a 
>> pessimistic copy, which is possibly not an improvement.
> 
> 
> Good point. If this is only problematic when multiple threads are accessing 
> an array, then it could still be worthwhile if all accesses are (provably) on 
> the thread that created the array.

To be clear, it's a problem independently of multi-threading. `func foo() -> 
[Int] { return [1, 2, 3] }` is the most basic representation of it: you can't 
store the array in the stack frame if you return it. (To be fair, that one 
would be caught by escape analysis of any quality.) `func foo() { bar([1, 2, 3, 
4]) }` is another example: if you don't know what `bar` does with the array, 
you can't store it on the stack because it might pass it to an object that 
lives on the heap and outlives `foo`, for instance. @escaping solves that 
problem for closures by specifically annotating parameters when the assigned 
closure could still be referenced after the called function returns.
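A minimal sketch of the escape in question, with a hypothetical `Sink` class standing in for "an object that lives on the heap and outlives `foo`", plus the closure analogue where `@escaping` makes the escape explicit:

```swift
// `bar` stashes its argument in a heap object that outlives the caller,
// so the array's COW storage cannot safely live in foo's stack frame.
final class Sink {
    var stored: [Int] = []
}
let sink = Sink()

func bar(_ xs: [Int]) {
    sink.stored = xs          // a reference to the buffer escapes here
}

func foo() {
    bar([1, 2, 3, 4])         // compiler can't prove the argument doesn't escape
}

// The closure analogue: @escaping annotates exactly this situation.
var callbacks: [() -> Void] = []
func register(_ f: @escaping () -> Void) {
    callbacks.append(f)       // f outlives register's stack frame
}
```

Without whole-program knowledge of `bar`, the caller must assume the worst, which is why the array buffer goes on the heap.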

> And pessimistic copying might still be worth it for arrays below a certain 
> size — for instance, copying an Array of length 1 (and recall that the 
> array in question is a constant so its size is known at compile time) would 
> definitely be worth not having that array in heap memory.

You only need pessimistic copies when that copy escapes, and since it escapes, 
it needs to live on the heap by definition. Right now an array of size 1 passed 
to 4 objects on the heap has one single backing representation. With 
pessimistic copies, you'd get 4 times that buffer of size 1, which is 
definitely not an improvement.

> Going back to the literal notion of FSAs — fixed size arrays not necessarily 
> on the stack — I think that simply copying Array’s implementation sans 
> RangeReplaceableCollection conformance is not a bad way to go. Any 
> optimizations used for `let` Arrays could probably be applied to this type of 
> FSA.

We're actually splitting this in multiple directions. John talks of 
variable-sized arrays necessarily on the stack. :)

I see two useful characteristics to fixed-size arrays, which, uncoincidentally, 
are what it takes to use them for C interop:

- Their storage is inlined into their container (whether it be the stack or an 
object)
- Their length is part of their type (and not directly included with the data 
itself)

This is also enough to implement fixed-size arrays not necessarily on the 
stack, mind you.
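Both characteristics are already visible in C interop today: a C `float v[3]` field imports into Swift as the homogeneous tuple `(Float, Float, Float)`. Here is a hedged sketch of wrapping such inline storage behind a subscript; the `Vec3` type and its layout are invented for illustration:

```swift
// Inline fixed-size storage via a homogeneous tuple: no heap buffer,
// length encoded in the type, 12 bytes inlined into the container.
struct Vec3 {
    var storage: (Float, Float, Float)

    subscript(i: Int) -> Float {
        get {
            precondition((0..<3).contains(i), "index out of range")
            // Reinterpret the tuple's raw bytes to index it dynamically.
            return withUnsafeBytes(of: storage) {
                $0.load(fromByteOffset: i * MemoryLayout<Float>.stride,
                        as: Float.self)
            }
        }
        set {
            precondition((0..<3).contains(i), "index out of range")
            withUnsafeMutableBytes(of: &storage) {
                $0.storeBytes(of: newValue,
                              toByteOffset: i * MemoryLayout<Float>.stride,
                              as: Float.self)
            }
        }
    }
}
```

The raw-bytes indexing is exactly the kind of boilerplate a first-class fixed-size array type would eliminate.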

Félix
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [planning] [discussion] Schedule for return of closure parameter labels (+ world domination ramble)

2017-08-04 Thread Mathew Huusko V via swift-evolution
Thanks for the swift response, it's an honour; I agree wholeheartedly with
your logic and sentiment. Sorry if I was unclear, but my concern/curiosity
is not for the speed of Swift's development, but in fact for its long term
evolution and longevity. At risk of repeating myself/boring everyone, that
concern manifests over two intermingling phenomena:
1) in the evolution email/proposal archive, a well intentioned (towards
-complexity and +quality) but sometimes blasé air around potential
uses/requirements of the language (~"Swift won't support that because
people probably wouldn't use/need it").
2) the reality of the clock, or what I think/thought the reality was.
Obviously I don't want Swift to evolve too fast, and don't think having any
particular feature right now is worth risking that, but won't the ABI be
stabilised eventually (Swift 5?) and then it will actually be too late for
some features? Please correct me if I'm wrong here.

A possible (not sure if this is ABI bound) example:
As far as I've seen, optional protocol requirements (like 'protected',
private conformances, currying, etc.) are more off the table than
postponed, having been deemed an anti-pattern. Fair enough – I'm inclined
to trust the people involved in those discussions. But what if after ABI
stabilisation people get around to building some significant systems in
Swift (say, UIKit, which relies heavily on optional protocol requirements
for good reason) and don't find reasonable alternatives? There's no going
back, right?

On Fri, Aug 4, 2017 at 6:32 PM, Chris Lattner  wrote:

>
> On Aug 4, 2017, at 9:16 AM, Mathew Huusko V via swift-evolution <
> swift-evolution@swift.org> wrote:
>
> Per https://lists.swift.org/pipermail/swift-evolution-announce/2
> 016-July/000233.html, the removal of parameter labels entirely was
> accepted as a temporary loss for Swift 3 as a means to remove them from the
> type system. I'm wondering if they're coming back (syntactically) any time
> soon?
>
>
> The planning approach for Swift 5 hasn’t been announced yet, but it should
> be soon-ish.
>
> Responding to the rest of your email in broad terms: there will always be
> a ton of things that are important and interesting to tackle.  There is
> also a long road ahead of Swift, so prioritization is not a bad thing: just
> because something doesn’t happen “now” doesn’t mean it never will.
>
> I would also argue that it would be *bad* for the language to evolve too
> fast.  Landing 20 major features all in the same year runs the very high
> risk that they doesn’t work well together and don’t have time to settle out
> properly.  It is far more important for Swift to be great over the long
> term than to have any individual little feature “now”.
>
> -Chris
>
>
>
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Robert Bennett via swift-evolution
> That's not a concern with the `let` case that Robert brought up, since you 
> can't mutate a `let` array at all.
> 
> The big thing is that unconstrained escape analysis is uncomputable. Since 
> Swift array storage is COW, any function that receives the array as a 
> parameter is allowed to take a reference on its storage. If the storage lives 
> on the stack and that reference outlives the stack frame, you've got a 
> problem. It's the same problem that motivated @escaping for closures.
> 
> You could allow storage to be on the stack by forcing the user to make a 
> pessimistic copy, which is possibly not an improvement.


Good point. If this is only problematic when multiple threads are accessing an 
array, then it could still be worthwhile if all accesses are (provably) on the 
thread that created the array. And pessimistic copying might still be worth it 
for arrays below a certain size — for instance, copying an Array of length 
1 (and recall that the array in question is a constant so its size is known at 
compile time) would definitely be worth not having that array in heap memory.

Going back to the literal notion of FSAs — fixed size arrays not necessarily on 
the stack — I think that simply copying Array’s implementation sans 
RangeReplaceableCollection conformance is not a bad way to go. Any 
optimizations used for `let` Arrays could probably be applied to this type of 
FSA.

On Aug 4, 2017, at 2:15 PM, John McCall via swift-evolution wrote:

> 
>> On Aug 4, 2017, at 1:19 PM, Félix Cloutier via swift-evolution wrote:
>> 
>> That's not a concern with the `let` case that Robert brought up, since you 
>> can't mutate a `let` array at all.
>> 
>> The big thing is that unconstrained escape analysis is uncomputable. Since 
>> Swift array storage is COW, any function that receives the array as a 
>> parameter is allowed to take a reference on its storage. If the storage 
>> lives on the stack and that reference outlives the stack frame, you've got a 
>> problem. It's the same problem that motivated @escaping for closures.
>> 
>> You could allow storage to be on the stack by forcing the user to make a 
>> pessimistic copy, which is possibly not an improvement.
> 
> Right.  I think maybe the name people keep using for this feature is 
> misleading; a better name would be "inline arrays" or "directly-stored 
> arrays".  Having a fixed size is a necessary condition for storing the array 
> elements directly, but the people asking for this feature are really asking 
> for the different representation, not just the ability to statically 
> constrain the size of an array.
> 
> That representation difference comes with a lot of weaknesses and trade-offs, 
> but it's also useful sometimes.
> 
> John.
> 
> 
> 
>> 
>>> On 4 Aug 2017, at 09:21, Taylor Swift via swift-evolution wrote:
>>> 
>>> No, that doesn’t work. In many cases you want to mutate the elements of the 
>>> array without changing its size. For example, a Camera struct which 
>>> contains a matrix buffer, and some of the matrices get updated on each 
>>> frame that the camera moves. The matrix buffer also stores all of the 
>>> camera’s stored properties, so what would be conceptually stored properties 
>>> are actually computed properties that get and set a Float at an offset into 
>>> the buffer. Of course this could all be avoided if we had fixed layout 
>>> guarantees in the language, and then the Camera struct could be the matrix 
>>> buffer and dispense with the getters and setters instead of managing a heap 
>>> buffer.
>>> 
>>> On Fri, Aug 4, 2017 at 11:02 AM, Robert Bennett via swift-evolution wrote:
>>> So, I’m getting into this thread kind of late, and I’ve only skimmed most 
>>> of it, but…
>>> 
>>> A special FSA on the stack seems like the wrong direction. Wouldn’t it make 
>>> more sense to have *all* value types that don’t change in size — including 
>>> `let` Arrays — live on the stack? In which case, FSA would merely act like 
>>> a normal `let` Array, without RangeReplaceableCollection conformance, whose 
>>> elements could be changed via subscripting. I know nothing about the 
>>> underlying implementation details of Swift, so I may be way off base here.
>>> 
 On Aug 4, 2017, at 2:18 AM, David Hart wrote:
 
 Don’t small arrays live on the stack?
 
> On 4 Aug 2017, at 06:35, Félix Cloutier via swift-evolution wrote:
> 
> As far as I can tell, currently, all arrays live on the heap.
> 
>> On 3 Aug 2017, at 19:03, Robert Bennett via swift-evolution wrote:
>> 
>> Where do constant Arrays currently live? I hope the answer is on the 
>> stack, since their size doesn’t change.
>> 
>> On Aug 3, 2017, at 8:44 PM, Tay

Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread John McCall via swift-evolution

> On Aug 4, 2017, at 1:19 PM, Félix Cloutier via swift-evolution 
>  wrote:
> 
> That's not a concern with the `let` case that Robert brought up, since you 
> can't mutate a `let` array at all.
> 
> The big thing is that unconstrained escape analysis is uncomputable. Since 
> Swift array storage is COW, any function that receives the array as a 
> parameter is allowed to take a reference on its storage. If the storage lives 
> on the stack and that reference outlives the stack frame, you've got a 
> problem. It's the same problem that motivated @escaping for closures.
> 
> You could allow storage to be on the stack by forcing the user to make a 
> pessimistic copy, which is possibly not an improvement.

Right.  I think maybe the name people keep using for this feature is 
misleading; a better name would be "inline arrays" or "directly-stored arrays". 
 Having a fixed size is a necessary condition for storing the array elements 
directly, but the people asking for this feature are really asking for the 
different representation, not just the ability to statically constrain the size 
of an array.

That representation difference comes with a lot of weaknesses and trade-offs, 
but it's also useful sometimes.

John.
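The Camera example quoted in this thread can be sketched roughly as follows. The field names, offsets, and buffer size are invented for illustration and are not the actual module's layout:

```swift
// Hedged sketch of the pattern Taylor describes: conceptually "stored"
// properties are really slots in a flat Float buffer (here heap-backed,
// because Swift offers no fixed-layout inline alternative today), exposed
// through computed properties that read and write at fixed offsets.
struct Camera {
    // Hypothetical layout: 16 floats of matrix data, then 3 of position.
    private var buffer = [Float](repeating: 0, count: 19)

    var x: Float {
        get { buffer[16] }
        set { buffer[16] = newValue }
    }
    var y: Float {
        get { buffer[17] }
        set { buffer[17] = newValue }
    }
    var z: Float {
        get { buffer[18] }
        set { buffer[18] = newValue }
    }
}
```

With fixed layout guarantees, the struct itself could be the buffer and the getter/setter indirection would disappear, which is the point being made above.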



> 
>> On 4 Aug 2017, at 09:21, Taylor Swift via swift-evolution wrote:
>> 
>> No, that doesn’t work. In many cases you want to mutate the elements of the 
>> array without changing its size. For example, a Camera struct which contains 
>> a matrix buffer, and some of the matrices get updated on each frame that the 
>> camera moves. The matrix buffer also stores all of the camera’s stored 
>> properties, so what would be conceptually stored properties are actually 
>> computed properties that get and set a Float at an offset into the buffer. 
>> Of course this could all be avoided if we had fixed layout guarantees in the 
>> language, and then the Camera struct could be the matrix buffer and dispense 
>> with the getters and setters instead of managing a heap buffer.
>> 
>> On Fri, Aug 4, 2017 at 11:02 AM, Robert Bennett via swift-evolution wrote:
>> So, I’m getting into this thread kind of late, and I’ve only skimmed most of 
>> it, but…
>> 
>> A special FSA on the stack seems like the wrong direction. Wouldn’t it make 
>> more sense to have *all* value types that don’t change in size — including 
>> `let` Arrays — live on the stack? In which case, FSA would merely act like a 
>> normal `let` Array, without RangeReplaceableCollection conformance, whose 
>> elements could be changed via subscripting. I know nothing about the 
>> underlying implementation details of Swift, so I may be way off base here.
>> 
>>> On Aug 4, 2017, at 2:18 AM, David Hart wrote:
>>> 
>>> Don’t small arrays live on the stack?
>>> 
 On 4 Aug 2017, at 06:35, Félix Cloutier via swift-evolution 
 mailto:swift-evolution@swift.org>> wrote:
 
 As far as I can tell, currently, all arrays live on the heap.
 
> On 3 Aug 2017, at 19:03, Robert Bennett via swift-evolution 
> <swift-evolution@swift.org> wrote:
> 
> Where do constant Arrays currently live? I hope the answer is on the 
> stack, since their size doesn’t change.
> 
> On Aug 3, 2017, at 8:44 PM, Taylor Swift via swift-evolution 
> mailto:swift-evolution@swift.org>> wrote:
> 
>> 
>> 
>> On Thu, Aug 3, 2017 at 8:20 PM, Karl Wagner via swift-evolution 
>> mailto:swift-evolution@swift.org>> wrote:
 
 The root cause, of course, is that the VLAs require new stack 
 allocations each time, and the stack is only deallocated as one lump 
 when the frame ends.
>>> 
>>> That is true of alloca(), but not of VLAs.  VLAs are freed when they go 
>>> out of scope.
>>> 
>> 
>> Learned something today.
>> 
>> Anyway, if the goal is stack allocation, I would prefer that we explored 
>> other ways to achieve it before jumping to a new array-type. I’m not 
>> really a fan of a future where [3; Double] is one type and (Double, 
>> Double, Double) is something else, and Array<Double> is yet another 
>> thing.
>> 
>> They are completely different things. 
>> 
>> [3; Double] is three contiguous Doubles which may or may not live on the 
>> stack. 
>> 
>> (Double, Double, Double) is three Doubles bound to a single variable 
>> name, which the compiler can rearrange for optimal performance and may 
>> or may not live on the stack. 
>> 
>> Array<Double> is a vector of Doubles that can dynamically grow and 
>> always lives in the heap.
>>  
>> 
>> From what I’ve read so far, the problem with stack-allocating some Array 
>> that you can pass to another function and which otherwise does not 
>> escape, is that the function may make an escaping reference (e.g. 
>> assigning it to an 

Re: [swift-evolution] [planning] [discussion] Schedule for return of closure parameter labels (+ world domination ramble)

2017-08-04 Thread Chris Lattner via swift-evolution

> On Aug 4, 2017, at 9:16 AM, Mathew Huusko V via swift-evolution 
>  wrote:
> 
> Per 
> https://lists.swift.org/pipermail/swift-evolution-announce/2016-July/000233.html,
> the removal of parameter labels entirely was accepted as a temporary loss 
> for Swift 3 as a means to remove them from the type system. I'm wondering if 
> they're coming back (syntactically) any time soon?

The planning approach for Swift 5 hasn’t been announced yet, but it should be 
soon-ish.

Responding to the rest of your email in broad terms: there will always be a ton 
of things that are important and interesting to tackle.  There is also a long 
road ahead of Swift, so prioritization is not a bad thing: just because 
something doesn’t happen “now” doesn’t mean it never will.

I would also argue that it would be *bad* for the language to evolve too fast.  
Landing 20 major features all in the same year runs the very high risk that 
they don’t work well together and don’t have time to settle out properly.  It 
is far more important for Swift to be great over the long term than to have any 
individual little feature “now”.

-Chris


___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Félix Cloutier via swift-evolution
That's not a concern with the `let` case that Robert brought up, since you 
can't mutate a `let` array at all.

The big thing is that unconstrained escape analysis is uncomputable. Since 
Swift array storage is COW, any function that receives the array as a parameter 
is allowed to take a reference on its storage. If the storage lives on the 
stack and that reference outlives the stack frame, you've got a problem. It's 
the same problem that motivated @escaping for closures.

You could allow storage to be on the stack by forcing the user to make a 
pessimistic copy, which is possibly not an improvement.
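For illustration, here is a sketch of the escape described above (the function and variable names are invented): any callee that receives the array may retain a reference to its COW storage past the caller's frame.

```swift
// Hypothetical example: the callee keeps a reference to the array's
// copy-on-write storage.
var stash: [Int] = []

func leak(_ values: [Int]) {
    // Legal and cheap: with COW, this copies only a *reference* to the
    // element storage, not the elements themselves.
    stash = values
}

func makeAndLeak() {
    let local = [1, 2, 3]  // if this storage were stack-allocated…
    leak(local)            // …its reference would now outlive this frame
}

makeAndLeak()
// 'stash' still shares the storage that 'local' allocated.
print(stash)  // [1, 2, 3]
```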

> On 4 Aug 2017, at 09:21, Taylor Swift via swift-evolution 
> wrote:
> 
> No, that doesn’t work. In many cases you want to mutate the elements of the 
> array without changing its size. For example, a Camera struct which contains 
> a matrix buffer, and some of the matrices get updated on each frame that the 
> camera moves. The matrix buffer also stores all of the camera’s stored 
> properties, so what would be conceptually stored properties are actually 
> computed properties that get and set a Float at an offset into the buffer. Of 
> course this could all be avoided if we had fixed layout guarantees in the 
> language, and then the Camera struct could be the matrix buffer and dispense 
> with the getters and setters instead of managing a heap buffer.
> 
> On Fri, Aug 4, 2017 at 11:02 AM, Robert Bennett via swift-evolution 
> mailto:swift-evolution@swift.org>> wrote:
> So, I’m getting into this thread kind of late, and I’ve only skimmed most of 
> it, but…
> 
> A special FSA on the stack seems like the wrong direction. Wouldn’t it make 
> more sense to have *all* value types that don’t change in size — including 
> `let` Arrays — live on the stack? In which case, FSA would merely act like a 
> normal `let` Array, without RangeReplaceableCollection conformance, whose 
> elements could be changed via subscripting. I know nothing about the 
> underlying implementation details of Swift, so I may be way off base here.
> 
>> On Aug 4, 2017, at 2:18 AM, David Hart wrote:
>> Don’t small arrays live on the stack?
>> 
>>> On 4 Aug 2017, at 06:35, Félix Cloutier via swift-evolution 
>>> mailto:swift-evolution@swift.org>> wrote:
>>> 
>>> As far as I can tell, currently, all arrays live on the heap.
>>> 
 On 3 Aug 2017, at 19:03, Robert Bennett via swift-evolution 
 <swift-evolution@swift.org> wrote:
 
 Where do constant Arrays currently live? I hope the answer is on the 
 stack, since their size doesn’t change.
 
 On Aug 3, 2017, at 8:44 PM, Taylor Swift via swift-evolution 
 mailto:swift-evolution@swift.org>> wrote:
 
> 
> 
> On Thu, Aug 3, 2017 at 8:20 PM, Karl Wagner via swift-evolution 
> mailto:swift-evolution@swift.org>> wrote:
>>> 
>>> The root cause, of course, is that the VLAs require new stack 
>>> allocations each time, and the stack is only deallocated as one lump 
>>> when the frame ends.
>> 
>> That is true of alloca(), but not of VLAs.  VLAs are freed when they go 
>> out of scope.
>> 
> 
> Learned something today.
> 
> Anyway, if the goal is stack allocation, I would prefer that we explored 
> other ways to achieve it before jumping to a new array-type. I’m not 
> really a fan of a future where [3; Double] is one type and (Double, 
> Double, Double) is something else, and Array<Double> is yet another thing.
> 
> They are completely different things. 
> 
> [3; Double] is three contiguous Doubles which may or may not live on the 
> stack. 
> 
> (Double, Double, Double) is three Doubles bound to a single variable 
> name, which the compiler can rearrange for optimal performance and may or 
> may not live on the stack. 
> 
> Array<Double> is a vector of Doubles that can dynamically grow and 
> always lives in the heap.
>  
> 
> From what I’ve read so far, the problem with stack-allocating some Array 
> that you can pass to another function and which otherwise does not 
> escape, is that the function may make an escaping reference (e.g. 
> assigning it to an ivar or global, or capturing it in a closure).
> 
> How about if the compiler treated every Array it receives in a function 
> as being potentially stack-allocated. The first time you capture it, it 
> will check and copy to the heap if necessary. All subsequent escapes 
> (including passing to other functions) use the Array known to be 
> allocated on the heap, avoiding further checking or copying within the 
> function.
> 
> The same goes for Dictionary, and really any arbitrary value-type with 
> COW storage. The memory that those types allocate is part of the value, 
> so it would be cool if we could treat it like that.
> 
> 
> This is not true. FSAs have nothing to do with automatic

Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Félix Cloutier via swift-evolution
I've never seen the Swift compiler put array storage on automatic storage, even 
for small arrays. I don't think that it has much to do with their size, though 
(for any array that is not incredibly large).

> On 3 Aug 2017, at 23:18, David Hart wrote:
> 
> Don’t small arrays live on the stack?
> 
>> On 4 Aug 2017, at 06:35, Félix Cloutier via swift-evolution 
>> mailto:swift-evolution@swift.org>> wrote:
>> 
>> As far as I can tell, currently, all arrays live on the heap.
>> 
>>> On 3 Aug 2017, at 19:03, Robert Bennett via swift-evolution 
>>> <swift-evolution@swift.org> wrote:
>>> 
>>> Where do constant Arrays currently live? I hope the answer is on the stack, 
>>> since their size doesn’t change.
>>> 
>>> On Aug 3, 2017, at 8:44 PM, Taylor Swift via swift-evolution 
>>> mailto:swift-evolution@swift.org>> wrote:
>>> 
 
 
 On Thu, Aug 3, 2017 at 8:20 PM, Karl Wagner via swift-evolution 
 mailto:swift-evolution@swift.org>> wrote:
>> 
>> The root cause, of course, is that the VLAs require new stack 
>> allocations each time, and the stack is only deallocated as one lump 
>> when the frame ends.
> 
> That is true of alloca(), but not of VLAs.  VLAs are freed when they go 
> out of scope.
> 
 
 Learned something today.
 
 Anyway, if the goal is stack allocation, I would prefer that we explored 
 other ways to achieve it before jumping to a new array-type. I’m not 
 really a fan of a future where [3; Double] is one type and (Double, 
 Double, Double) is something else, and Array<Double> is yet another thing.
 
 They are completely different things. 
 
 [3; Double] is three contiguous Doubles which may or may not live on the 
 stack. 
 
 (Double, Double, Double) is three Doubles bound to a single variable name, 
 which the compiler can rearrange for optimal performance and may or may 
 not live on the stack. 
 
 Array<Double> is a vector of Doubles that can dynamically grow and always 
 lives in the heap.
  
 
 From what I’ve read so far, the problem with stack-allocating some Array 
 that you can pass to another function and which otherwise does not escape, 
 is that the function may make an escaping reference (e.g. assigning it to 
 an ivar or global, or capturing it in a closure).
 
 How about if the compiler treated every Array it receives in a function as 
 being potentially stack-allocated. The first time you capture it, it will 
 check and copy to the heap if necessary. All subsequent escapes (including 
 passing to other functions) use the Array known to be allocated on the 
 heap, avoiding further checking or copying within the function.
 
 The same goes for Dictionary, and really any arbitrary value-type with COW 
 storage. The memory that those types allocate is part of the value, so it 
 would be cool if we could treat it like that.
 
 
 This is not true. FSAs have nothing to do with automatic storage, their 
 static size only makes them eligible to live on the stack, as tuples are 
 now. The defining quality of FSAs is that they are static and contiguous. 
> 

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Why you can't make someone else's class Decodable: a long-winded explanation of 'required' initializers

2017-08-04 Thread Itai Ferber via swift-evolution
To clarify a bit here — this isn’t a "privilege" so much as a 
property of the design of these classes.
`NSData`, `NSString`, `NSArray`, and some others, are all known as 
_class clusters_; the classes you know and use are essentially abstract 
base classes whose implementation is given in private concrete 
subclasses that specialize based on usage. These classes are essentially 
an abstract interface for subclasses to follow. You can take a look at 
the [subclassing notes for 
`NSArray`](https://developer.apple.com/documentation/foundation/nsarray#1651549), 
for instance, to see the guidelines offered for subclassing such a base 
class.


The reason you can relatively safely offer `static` extensions on these 
types is that it’s reasonably rare to need to subclass them, and at 
that, even rarer to offer any interface _besides_ what’s given by the 
base class. You can rely on the, say, `NSString` interface to access all 
functionality needed to represent a string. If I were to subclass 
`NSString` with totally different properties, though, your `static` 
extension might not take that into account.


Not all types you list here are class clusters, BTW, but they largely 
fall into the same category of "never really subclassed". There’s no 
real need for anyone to subclass `NSDate` or `NSDecimalNumber` (since 
they’re pretty low-level structural types), so this should apply to 
those as well.


In general, this property applies to all types like this which are 
rarely subclassed. In Swift, types like this might fall under a `final 
class` designation, though in Objective-C it’s more by convention/lack 
of need than by strict enforcement. There’s a reason we offer some of 
these as `struct`s in Swift (e.g. `Date`, `Decimal`, `Data`, etc.).
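As a concrete (hypothetical) illustration of why static factories are safe on a class cluster: the base class's own initializers already hand back a private concrete subclass, so an extension never needs to know which subclass it gets — it only relies on the abstract `NSString` interface that every cluster member implements. The factory name below is invented for the sketch.

```swift
import Foundation

// Sketch with a made-up factory name: a static func in an extension on
// the abstract base of a class cluster. NSString(string:) itself returns
// a private concrete subclass (e.g. __NSCFString on Darwin); this code
// only relies on the NSString interface.
extension NSString {
    static func fromUTF8Bytes(_ bytes: [UInt8]) -> NSString {
        return NSString(string: String(decoding: bytes, as: UTF8.self))
    }
}

let s = NSString.fromUTF8Bytes(Array("hello".utf8))
print(s.length)  // 5 — usable as NSString, whatever the concrete class is
```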


On 3 Aug 2017, at 21:03, Gwendal Roué wrote:


Le 3 août 2017 à 19:10, Itai Ferber  a écrit :

I just mentioned this in my other email, but to point out here: the 
reason this works in your case is because you adopt these methods as 
static funcs and can reasonably rely on subclasses of NSData, 
NSNumber, NSString, etc. to do the right thing because of work done 
behind the scenes in the ObjC implementations of these classes (and 
because we’ve got established subclassing requirements on these 
methods — all subclasses of these classes are going to look 
approximately the same without doing anything crazy).


This would not work for Codable in the general case, however, where 
subclasses likely need to add additional storage, properties, encoded 
representations, etc., without equivalent requirements, either via 
additional protocols or conventions.


Thanks for your explanation of why a static method in a protocol is able 
to instantiate non-final classes like NSData, NSDate, NSNumber, 
NSDecimalNumber, NSString, etc.


Is this "privilege" stable? Can I rely on it to be maintained over 
time? Or would it be a better idea to drop support for those low-level 
Foundation classes, because they'll eventually become regular classes 
without any specific support? This would not hurt that much: Data, 
Date, String are there for a reason. NSDecimalNumber is the only one 
of its kind, though.


Gwendal
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Taylor Swift via swift-evolution
No, that doesn’t work. In many cases you want to mutate the elements of the
array without changing its size. For example, a Camera struct which
contains a matrix buffer, and some of the matrices get updated on each
frame that the camera moves. The matrix buffer also stores all of the
camera’s stored properties, so what would be conceptually stored properties
are actually computed properties that get and set a Float at an offset into
the buffer. Of course this could all be avoided if we had fixed layout
guarantees in the language, and then the Camera struct could *be* the
matrix buffer and dispense with the getters and setters instead of managing
a heap buffer.
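The pattern described above might look roughly like this — a sketch with invented names and layout, not the actual code: conceptually stored properties are computed properties backed by fixed offsets into one heap-allocated buffer, which is mutated in place without ever changing size.

```swift
// Sketch: 16 Float slots for a 4×4 matrix, plus 2 slots for what are
// conceptually stored properties. All names/offsets are hypothetical.
struct Camera {
    private var buffer: [Float]  // matrix storage + "stored" properties

    init() { buffer = [Float](repeating: 0, count: 16 + 2) }

    // Conceptually stored, actually computed: each one reads/writes a
    // fixed slot in the buffer.
    var fieldOfView: Float {
        get { return buffer[16] }
        set { buffer[16] = newValue }
    }
    var aspectRatio: Float {
        get { return buffer[17] }
        set { buffer[17] = newValue }
    }

    // Mutate matrix elements in place; the buffer never changes size.
    mutating func setMatrixElement(_ value: Float, at index: Int) {
        precondition(index < 16)
        buffer[index] = value
    }
}

var camera = Camera()
camera.fieldOfView = 1.2
camera.setMatrixElement(1, at: 0)
```

With fixed layout guarantees, the struct could hold the 18 Floats directly and the getters/setters (and the heap allocation) would disappear.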

On Fri, Aug 4, 2017 at 11:02 AM, Robert Bennett via swift-evolution <
swift-evolution@swift.org> wrote:

> So, I’m getting into this thread kind of late, and I’ve only skimmed most
> of it, but…
>
> A special FSA on the stack seems like the wrong direction. Wouldn’t it
> make more sense to have *all* value types that don’t change in size —
> including `let` Arrays — live on the stack? In which case, FSA would merely
> act like a normal `let` Array, without RangeReplaceableCollection
> conformance, whose elements could be changed via subscripting. I know
> nothing about the underlying implementation details of Swift, so I may be
> way off base here.
>
> On Aug 4, 2017, at 2:18 AM, David Hart  wrote:
>
> Don’t small arrays live on the stack?
>
> On 4 Aug 2017, at 06:35, Félix Cloutier via swift-evolution <
> swift-evolution@swift.org> wrote:
>
> As far as I can tell, currently, all arrays live on the heap.
>
> On 3 Aug 2017, at 19:03, Robert Bennett via swift-evolution <
> swift-evolution@swift.org> wrote:
>
> Where do constant Arrays currently live? I hope the answer is on the
> stack, since their size doesn’t change.
>
> On Aug 3, 2017, at 8:44 PM, Taylor Swift via swift-evolution <
> swift-evolution@swift.org> wrote:
>
>
>
> On Thu, Aug 3, 2017 at 8:20 PM, Karl Wagner via swift-evolution <
> swift-evolution@swift.org> wrote:
>
>>
>> The root cause, of course, is that the VLAs require new stack allocations
>> each time, and the stack is only deallocated as one lump when the frame
>> ends.
>>
>>
>> That is true of alloca(), but not of VLAs.  VLAs are freed when they go
>> out of scope.
>>
>>
>> Learned something today.
>>
>> Anyway, if the goal is stack allocation, I would prefer that we explored
>> other ways to achieve it before jumping to a new array-type. I’m not really
>> a fan of a future where [3; Double] is one type and (Double, Double,
>> Double) is something else, and Array<Double> is yet another thing.
>>
>
> They are completely different things.
>
> [3; Double] is three *contiguous* Doubles which may or may not live on
> the stack.
>
> (Double, Double, Double) is three Doubles bound to a single variable
> *name*, which the compiler can rearrange for optimal performance and may
> or may not live on the stack.
>
> Array<Double> is a vector of Doubles that can dynamically grow and always
> lives in the heap.
>
>
>>
>> From what I’ve read so far, the problem with stack-allocating some Array
>> that you can pass to another function and which otherwise does not escape,
>> is that the function may make an escaping reference (e.g. assigning it to
>> an ivar or global, or capturing it in a closure).
>>
>> How about if the compiler treated every Array it receives in a function
>> as being potentially stack-allocated. The first time you capture it, it
>> will check and copy to the heap if necessary. All subsequent escapes
>> (including passing to other functions) use the Array known to be allocated
>> on the heap, avoiding further checking or copying within the function.
>>
>> The same goes for Dictionary, and really any arbitrary value-type with
>> COW storage. The memory that those types allocate is part of the value, so
>> it would be cool if we could treat it like that.
>>
>>
> This is not true. FSAs have nothing to do with automatic storage, their
> static size only makes them *eligible* to live on the stack, as tuples
> are now. The defining quality of FSAs is that they are static and
> contiguous.
>
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


[swift-evolution] [planning] [discussion] Schedule for return of closure parameter labels (+ world domination ramble)

2017-08-04 Thread Mathew Huusko V via swift-evolution
Per
https://lists.swift.org/pipermail/swift-evolution-announce/2016-July/000233.html,
the removal of parameter labels entirely was accepted as a temporary loss
for Swift 3 as a means to remove them from the type system. I'm wondering
if they're coming back (syntactically) any time soon?

Other than being in the spirit of Swift in general (safe APIs, first class
closures/functions), with the lack of optional protocol requirements the
only way to achieve something similar is..
`lazy var someFunc: ((someParam: String) -> String)? = { [unowned self]
someParam in ... }`
.. which doesn't compile anymore.
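For reference, the closest form that still compiles after SE-0111 drops the label from the closure's *type* and keeps it only as an internal parameter name — so callers get no label at the call site. The class name and body below are invented for the sketch.

```swift
// Sketch: argument labels are no longer part of function types, so the
// name 'someParam' exists only inside the closure body.
class Widget {
    let prefix = "-> "
    lazy var someFunc: ((String) -> String)? = { [unowned self] someParam in
        // Callers cannot write a label here; 'someParam' is internal only.
        return self.prefix + someParam
    }
}

let widget = Widget()
print(widget.someFunc?("hello") ?? "")  // "-> hello"
```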

–––

As a related aside, I love Swift, and as it gains more core functionality
(e.g. KeyPaths) I love it even more. However I'm concerned that the systems
(frameworks *or* apps) I/others create on top of such functionality are
only as good as their APIs, which are often held back by the lack of
relevant features –
optional protocol requirements, within-module access (or at least,
*visibility*) control (e.g. ~protected), explicit/arbitrary
namespaces, abstract classes, currying, generalised existentials, factory
initialisers, closure parameter labels, etc.
– and it's concerning to read through swift-evolution and find a general
pattern of these features being postponed or worse, rejected, not because
of clear alternatives to producing the same *safe and expressive/explicit* APIs
they facilitate, but because ~"most programmers probably won't need/use
them".

This from/about the language that has tacked on a whole statement ('guard')
for inverse conditionals.
This from/about the language that is/has successfully popularised protocol
oriented programming, custom/overloaded operators, and ADTs.
This from/about the language that's supposed to define the next decade+ of
Apple and its (or more) community's software, as well as programming
education.

I think that if/when Swift accomplishes world domination (cc: Lattner), it
will find that the world is more diverse than has been let on, and the only
practical limit to what people will 'need'/find useful will be what Swift
was able to integrate safely/elegantly.

None of this would matter of course, or at least not so much, if it wasn't
for Swift's nearing target of ABI stability. Someone absolutely do feel
free to console me/tell me that all of these kinds of things (and things
that haven't been thought of yet; state of the art is a quickly moving
target after all..) could be added later if the community changes its mind,
but if not, that's scary. I understand the dangers of feature creep and
kitchen-sinkage, but C++ (e.g/etc.; TypeScript?) got the way it is by
reckless feature addition over multiple generations unchecked by ABI
stability requirements. Surely if there's only one main chance/generation
of feature addition with Swift, it should be relatively liberal/prescient.

Anyway, I've said/rambled more than enough/than I'm qualified to. TL;DR:/in
summary, some things are relatively small (closure parameter labels
probably won't define the future of the language) and some things.. aren't;
I hope that everything is being done to ensure that Swift can dominate the
world (and its diverse use cases)* currently* as well as hold that spot
when the standards/state of the art changes. I'd be really curious to hear
from core-esque people about their practical or lofty thoughts/feelings on
this matter.
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Planning][Request] "constexpr" for Swift 5

2017-08-04 Thread Robert Bennett via swift-evolution
So, I’m getting into this thread kind of late, and I’ve only skimmed most of 
it, but…

A special FSA on the stack seems like the wrong direction. Wouldn’t it make 
more sense to have *all* value types that don’t change in size — including 
`let` Arrays — live on the stack? In which case, FSA would merely act like a 
normal `let` Array, without RangeReplaceableCollection conformance, whose 
elements could be changed via subscripting. I know nothing about the underlying 
implementation details of Swift, so I may be way off base here.

> On Aug 4, 2017, at 2:18 AM, David Hart  wrote:
> 
> Don’t small arrays live on the stack?
> 
>> On 4 Aug 2017, at 06:35, Félix Cloutier via swift-evolution 
>> mailto:swift-evolution@swift.org>> wrote:
>> 
>> As far as I can tell, currently, all arrays live on the heap.
>> 
>>> On 3 Aug 2017, at 19:03, Robert Bennett via swift-evolution 
>>> <swift-evolution@swift.org> wrote:
>>> 
>>> Where do constant Arrays currently live? I hope the answer is on the stack, 
>>> since their size doesn’t change.
>>> 
>>> On Aug 3, 2017, at 8:44 PM, Taylor Swift via swift-evolution 
>>> mailto:swift-evolution@swift.org>> wrote:
>>> 
 
 
 On Thu, Aug 3, 2017 at 8:20 PM, Karl Wagner via swift-evolution 
 mailto:swift-evolution@swift.org>> wrote:
>> 
>> The root cause, of course, is that the VLAs require new stack 
>> allocations each time, and the stack is only deallocated as one lump 
>> when the frame ends.
> 
> That is true of alloca(), but not of VLAs.  VLAs are freed when they go 
> out of scope.
> 
 
 Learned something today.
 
 Anyway, if the goal is stack allocation, I would prefer that we explored 
 other ways to achieve it before jumping to a new array-type. I’m not 
 really a fan of a future where [3; Double] is one type and (Double, 
 Double, Double) is something else, and Array<Double> is yet another thing.
 
 They are completely different things. 
 
 [3; Double] is three contiguous Doubles which may or may not live on the 
 stack. 
 
 (Double, Double, Double) is three Doubles bound to a single variable name, 
 which the compiler can rearrange for optimal performance and may or may 
 not live on the stack. 
 
 Array<Double> is a vector of Doubles that can dynamically grow and always 
 lives in the heap.
  
 
 From what I’ve read so far, the problem with stack-allocating some Array 
 that you can pass to another function and which otherwise does not escape, 
 is that the function may make an escaping reference (e.g. assigning it to 
 an ivar or global, or capturing it in a closure).
 
 How about if the compiler treated every Array it receives in a function as 
 being potentially stack-allocated. The first time you capture it, it will 
 check and copy to the heap if necessary. All subsequent escapes (including 
 passing to other functions) use the Array known to be allocated on the 
 heap, avoiding further checking or copying within the function.
 
 The same goes for Dictionary, and really any arbitrary value-type with COW 
 storage. The memory that those types allocate is part of the value, so it 
 would be cool if we could treat it like that.
 
 
 This is not true. FSAs have nothing to do with automatic storage, their 
 static size only makes them eligible to live on the stack, as tuples are 
 now. The defining quality of FSAs is that they are static and contiguous. 
> 

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution