Re: [swift-users] dispatch concurrent map: is this right?

2016-10-30 Thread Dave Abrahams via swift-users

on Sun Oct 30 2016, Dave Abrahams  wrote:

>> I quite like the API as an extension on Range. I think it would be a
>> nice addition to Dispatch (once we start allowing additive proposals):
>>
>> extension Range where Bound : Strideable, Bound.Stride : SignedInteger {
>>
>>   func concurrentMap<T>(_ transform: (Bound) -> T) -> [T] {
>> let n = numericCast(count) as Int
>> let buffer = UnsafeMutablePointer<T>.allocate(capacity: n)
>>
>> DispatchQueue.concurrentPerform(iterations: n) {
>>   (buffer + $0).initialize(to: transform(lowerBound + numericCast($0)))
>> }
>>
>> // Unfortunately, the buffer is copied when making it an Array.
>> defer { buffer.deallocate(capacity: n) }
>> return Array(UnsafeMutableBufferPointer(start: buffer, count: n))
>>   }
>> }
>>
>> extension Collection {
>>   func concurrentMap<T>(_ transform: (Iterator.Element)->T) -> [T] {
>>
>> // ‘as Range’ because CountableRange is a collection, causing the
>> // function to be recursive.
>> return ((0..<numericCast(count)) as Range<Int>).concurrentMap {
>>   transform(self[index(startIndex, offsetBy: numericCast($0))])
>> }
>>   }
>> }
>
> I see the beauty in what you're doing here, but I don't see any
> advantage to it for users.  Now Range (which will be collapsed with
> CountableRange in Swift 4) will have two overloads of concurrentMap.  In
> general, avoidable overloads are bad for the user experience.

Oh, and I should add, a suite of parallel algorithms would be great, but
it should be implemented similarly to lazy, so instead of 

   x.concurrentMap { ... }.concurrentFilter { ... }

you'd write

   x.parallel.map { ... }.filter { ... }
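A minimal sketch of such a `parallel` view, written in current Swift. The `ParallelCollection` type and `parallel` property are illustrative names, not an actual Dispatch API, and only `map` is shown:

```swift
import Dispatch

// Illustrative sketch only: a `parallel` view in the spirit of `lazy`.
// `ParallelCollection` and `parallel` are made-up names, not real API.
struct ParallelCollection<Base: RandomAccessCollection> {
    let base: Base

    // Runs `transform` concurrently, writing each result directly into
    // the array's uninitialized storage (no locks: every iteration
    // touches a disjoint element).
    func map<T>(_ transform: (Base.Element) -> T) -> [T] {
        let n = base.count
        let indices = Array(base.indices)
        return [T](unsafeUninitializedCapacity: n) { buffer, initialized in
            guard let p = buffer.baseAddress else { return }  // empty input
            DispatchQueue.concurrentPerform(iterations: n) { i in
                (p + i).initialize(to: transform(base[indices[i]]))
            }
            initialized = n
        }
    }
}

extension RandomAccessCollection {
    var parallel: ParallelCollection<Self> { ParallelCollection(base: self) }
}

// Usage: same shape as the lazy API.
let squares = (0..<8).parallel.map { $0 * $0 }
```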

Cheers,

-- 
-Dave

___
swift-users mailing list
swift-users@swift.org
https://lists.swift.org/mailman/listinfo/swift-users


Re: [swift-users] dispatch concurrent map: is this right?

2016-10-30 Thread Dave Abrahams via swift-users

on Sun Oct 30 2016, Karl  wrote:

>> On 30 Oct 2016, at 19:23, Dave Abrahams via swift-users 
>>  wrote:
>> 
>> 
>> on Sun Oct 30 2016, Karl  wrote:
>> 
>
 On 30 Oct 2016, at 09:15, Karl  wrote:
 
 I had the need for a concurrent map recently. I had a part of a
 program which needed to read chunks of data and concurrently process
 them and assemble the results in an array. This isn’t necessarily as
 obvious as it sounds, because of arrays being value types. I came up
 with the following snippet which I’d like to check for correctness;
 it could also be helpful to others.
 
 Perhaps this is something Dispatch should provide out-of-the-box?
 
 - Karl
>>> 
>>> Ah one second, I was refactoring this and forgot to test it. Here’s the 
>>> actual code:
>> 
>> A map presumably requires an input 
>
> DispatchQueue.concurrentMap maps a Range<Int> -> T, but since the
> range is always 0..<iterations, it can be written quite naturally as an
> extension on Range and build everything on top of it.

Sorry, I wrote “a map presumably requires an input” before I realized
what you were doing.  I should have deleted that.

>> 
>>> extension DispatchQueue {
>>> 
>>>  static func concurrentMap<T>(iterations: Int, execute block: (Int) -> T) -> [T] {
>>> 
>>>let __result = UnsafeMutableRawBufferPointer.allocate(count: iterations * MemoryLayout<T>.stride)
>>>defer { __result.deallocate() }
>>>let _result  = __result.baseAddress?.assumingMemoryBound(to: T.self)
>> 
>> You never bound the memory to T, so this will be undefined behavior.  
>> 
>>>let result   = UnsafeMutableBufferPointer(start: _result, count: iterations)
>>>concurrentPerform(iterations: iterations) { idx in
>>>  result[idx] = block(idx)
>> 
>> You also never initialized the Ts in that memory region, so assigning
>> into them will also be undefined behavior.
>> 
>>>}
>>>return Array(result)
>>>  }
>>> }
>>> 
>>> extension Array {
>>>  func concurrentMap<T>(execute block: (Element)->T) -> [T] {
>>>return DispatchQueue.concurrentMap(iterations: count) { block(self[$0]) }
>>>  }
>>> }
>>> 
>>> Unfortunately I don’t think there’s a way to get an array to take over a +1
>>> UnsafeMutableBufferPointer without copying.
>> 
>> The only correct way to do this without creating intermediate storage is
>> to have a way to initialize your result elements, e.g.:
>> 
>>  import Dispatch
>> 
>>  protocol DefaultInitializable {
>>init()
>>  }
>> 
>>  extension RandomAccessCollection {
>>func concurrentMap<T>(_ transform: (Iterator.Element)->T) -> [T]
>>where T : DefaultInitializable {
>>  var result = Array(
>>repeating: T(), count: numericCast(self.count))
>> 
>>  DispatchQueue.concurrentPerform(iterations: result.count) {
>>offset in 
>>result[offset] = transform(
>>  self[index(startIndex, offsetBy: numericCast(offset))])
>>  }
>>  return result
>>}
>>  }
>> 
>>  extension Int : DefaultInitializable {  }
>> 
>>  print((3..<20).concurrentMap { $0 * 2 })
>> 
>
> I had a go at doing that before, using Optional and unwrapping at
> the end — but it occurred to me that it would be very inefficient for
> things like Optional, and introduces more allocations.

Yeah, optional is not really a good choice for that application.

>> If you don't want the DefaultInitializable requirement (or some other
>> way to prepare initialized elements), you'll need to manage memory
>> yourself:
>> 
>>  extension RandomAccessCollection {
>>func concurrentMap(_ transform: (Iterator.Element)->T) -> [T] {
>>  let n = numericCast(self.count) as Int
>>  let p = UnsafeMutablePointer<T>.allocate(capacity: n)
>>  defer { p.deallocate(capacity: n) }
>> 
>>  DispatchQueue.concurrentPerform(iterations: n) {
>>offset in
>>(p + offset).initialize(
>>  to: transform(
>>self[index(startIndex, offsetBy: numericCast(offset))]))
>>  }
>> 
>>  return Array(UnsafeMutableBufferPointer(start: p, count: n))
>>}
>>  }
>> 
>> This posting highlights a couple of weaknesses in the standard library
>> for which I'd appreciate bug reports:
>> 
>> 1. No way to arbitrarily initialize an Array's storage.
>> 2. UnsafeMutableBufferPointer doesn't have an allocating init
>> 
> Filed:
>
> 1. https://bugs.swift.org/browse/SR-3087
> 2. https://bugs.swift.org/browse/SR-3088

Thanks for these!
>
> What is your opinion on the corelibs extending the standard library
> types? 
> Foundation does it to provide APIs from NSString, but it’s kind of a
> special case. 

My opinion is that, *as a rule*, frameworks should avoid extending APIs
from other frameworks; it makes it very hard for the author of the
original type to manage the user experience of her type.  But there are
exceptions to every rule ;-)

> Would it be reasonable for Dispatch (which is not _such_ a special case)
> to also extend types like Range and Collection?

[swift-users] [SwiftPM] Build system module package to .a/.dylib?

2016-10-30 Thread Richard Wei via swift-users
Hi, 

Is there a way to build a system module package into a static/dynamic library, 
so that a Swift script can link against this library to find the module? Adding 
library products to Package.swift doesn’t seem to work.

Thanks,
Richard



Re: [swift-users] dispatch concurrent map: is this right?

2016-10-30 Thread Karl via swift-users

> On 30 Oct 2016, at 19:23, Dave Abrahams via swift-users 
>  wrote:
> 
> 
> on Sun Oct 30 2016, Karl  wrote:
> 
>>> On 30 Oct 2016, at 09:15, Karl  wrote:
>>> 
>>> I had the need for a concurrent map recently. I had a part of a
>>> program which needed to read chunks of data and concurrently process
>>> them and assemble the results in an array. This isn’t necessarily as
>>> obvious as it sounds, because of arrays being value types. I came up
>>> with the following snippet which I’d like to check for correctness;
>>> it could also be helpful to others.
>>> 
>>> Perhaps this is something Dispatch should provide out-of-the-box?
>>> 
>>> - Karl
>> 
>> Ah one second, I was refactoring this and forgot to test it. Here’s the 
>> actual code:
> 
> A map presumably requires an input 

DispatchQueue.concurrentMap maps a Range<Int> -> T, but since the range is 
always 0..<iterations, it can be written quite naturally as an extension on 
Range and build everything on top of it.
>> extension DispatchQueue {
>> 
>>  static func concurrentMap<T>(iterations: Int, execute block: (Int) -> T) -> [T] {
>> 
>>let __result = UnsafeMutableRawBufferPointer.allocate(count: iterations * MemoryLayout<T>.stride)
>>defer { __result.deallocate() }
>>let _result  = __result.baseAddress?.assumingMemoryBound(to: T.self)
> 
> You never bound the memory to T, so this will be undefined behavior.  
> 
>>let result   = UnsafeMutableBufferPointer(start: _result, count: iterations)
>>concurrentPerform(iterations: iterations) { idx in
>>  result[idx] = block(idx)
> 
> You also never initialized the Ts in that memory region, so assigning
> into them will also be undefined behavior.
> 
>>}
>>return Array(result)
>>  }
>> }
>> 
>> extension Array {
>>  func concurrentMap<T>(execute block: (Element)->T) -> [T] {
>>return DispatchQueue.concurrentMap(iterations: count) { block(self[$0]) }
>>  }
>> }
>> 
>> Unfortunately I don’t think there’s a way to get an array to take over a +1
>> UnsafeMutableBufferPointer without copying.
> 
> The only correct way to do this without creating intermediate storage is
> to have a way to initialize your result elements, e.g.:
> 
>  import Dispatch
> 
>  protocol DefaultInitializable {
>init()
>  }
> 
>  extension RandomAccessCollection {
>func concurrentMap<T>(_ transform: (Iterator.Element)->T) -> [T]
>where T : DefaultInitializable {
>  var result = Array(
>repeating: T(), count: numericCast(self.count))
> 
>  DispatchQueue.concurrentPerform(iterations: result.count) {
>offset in 
>result[offset] = transform(
>  self[index(startIndex, offsetBy: numericCast(offset))])
>  }
>  return result
>}
>  }
> 
>  extension Int : DefaultInitializable {  }
> 
>  print((3..<20).concurrentMap { $0 * 2 })
> 

I had a go at doing that before, using Optional and unwrapping at the end — 
but it occurred to me that it would be very inefficient for things like 
Optional, and introduces more allocations.
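A sketch of that Optional-based variant in current Swift (the name `concurrentMapViaOptionals` is illustrative) makes the costs visible: the `[T?]` array doubles as pre-initialized storage, at the price of an extra unwrapping pass and extra boxing when T is itself optional:

```swift
import Dispatch

// Illustrative sketch of the Optional-based approach discussed above.
// Every slot starts as nil, so the storage is fully initialized before
// the concurrent writes begin; the price is an extra unwrapping pass at
// the end, plus extra boxing when T is itself optional or class-typed.
extension RandomAccessCollection {
    func concurrentMapViaOptionals<T>(_ transform: (Element) -> T) -> [T] {
        let n = count
        let idx = Array(indices)
        var results = [T?](repeating: nil, count: n)
        results.withUnsafeMutableBufferPointer { buffer in
            guard let p = buffer.baseAddress else { return }  // empty input
            DispatchQueue.concurrentPerform(iterations: n) { i in
                p[i] = transform(self[idx[i]])  // disjoint slots, no races
            }
        }
        return results.map { $0! }  // the extra pass mentioned above
    }
}
```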


> If you don't want the DefaultInitializable requirement (or some other
> way to prepare initialized elements), you'll need to manage memory
> yourself:
> 
>  extension RandomAccessCollection {
>func concurrentMap<T>(_ transform: (Iterator.Element)->T) -> [T] {
>  let n = numericCast(self.count) as Int
>  let p = UnsafeMutablePointer<T>.allocate(capacity: n)
>  defer { p.deallocate(capacity: n) }
> 
>  DispatchQueue.concurrentPerform(iterations: n) {
>offset in
>(p + offset).initialize(
>  to: transform(
>self[index(startIndex, offsetBy: numericCast(offset))]))
>  }
> 
>  return Array(UnsafeMutableBufferPointer(start: p, count: n))
>}
>  }
> 
> This posting highlights a couple of weaknesses in the standard library
> for which I'd appreciate bug reports:
> 
> 1. No way to arbitrarily initialize an Array's storage.
> 2. UnsafeMutableBufferPointer doesn't have an allocating init
> 
> Thanks!
> 
> -- 
> -Dave
> 


Filed:

1. https://bugs.swift.org/browse/SR-3087
2. https://bugs.swift.org/browse/SR-3088

What is your opinion on the corelibs extending the standard library types? 
Foundation does it to provide APIs from NSString, but it’s kind of a special 
case. Would it be reasonable for Dispatch (which is not _such_ a special case) 
to also extend types like Range and Collection?

I quite like the API as an extension on Range. I think it would be a nice 
addition to Dispatch (once we start allowing additive proposals):

extension Range where Bound : Strideable, Bound.Stride : SignedInteger {

  func concurrentMap<T>(_ transform: (Bound) -> T) -> [T] {
    let n = numericCast(count) as Int
    let buffer = UnsafeMutablePointer<T>.allocate(capacity: n)

    DispatchQueue.concurrentPerform(iterations: n) {
      (buffer + $0).initialize(to: transform(lowerBound + numericCast($0)))
    }

    // Unfortunately, the buffer is copied when making it an Array.
    defer { buffer.deallocate(capacity: n) }
    return Array(UnsafeMutableBufferPointer(start: buffer, count: n))
  }
}

Re: [swift-users] Swift Package Manager and Git submodules

2016-10-30 Thread Daniel Dunbar via swift-users

> On Oct 29, 2016, at 7:22 AM, Anton Bronnikov  wrote:
> 
> Thanks, Daniel.
> 
> Yes, on Oct-27 snapshot `Swift version 3.0-dev (LLVM b52fce3ab4, Clang 
> 4edf31e82f, Swift bf2de4a41c)` if I build with `swift build 
> --enable-new-resolver` then I do get the expected behaviour.  Building with 
> usual `swift build` gets me an old - wrong - one.
> 
> It’s an experimental feature at the moment, right? (E.g. I cannot invoke 
> `swift package fetch --enable-new-resolver`.)

You should be able to use it almost everywhere, but yes, it isn't on by 
default because a couple of pieces aren't done yet.

>  Will it be in 3.1?

We are working to switch over to it ASAP, but I don't know exactly when that 
will happen. I hope w/in a month or two.

 - Daniel

> 
> Thank you.
> Cheers,
> Anton
> 
>> On 29 Oct 2016, at 00:18, Daniel Dunbar  wrote:
>> 
>> This sounds like a bug to me, I suspect that the current code isn't causing 
>> the submodule to update appropriately.
>> 
>> If you have a working example, can you try using the latest OSS snapshot 
>> from swift.org, and running:
>> swift package reset
>> swift build --enable-new-resolver
>> and seeing if you get the behavior you expect?
>> 
>> - Daniel
>> 
>>> On Oct 28, 2016, at 1:53 PM, Anton Bronnikov via swift-users 
>>>  wrote:
>>> 
>>> Hi,
>>> 
>>> I have a question whether what I observe is by-design, a bug, or not yet 
>>> fully implemented feature in Swift Package Manager.
>>> 
>>> - Let’s say, I have a C repository with some library, and it has two 
>>> versions tagged, namely 0.0.1 and 0.0.2.
>>> - Then I have a Swift repository that includes the above as a submodule, 
>>> provides necessary files and exports C functionality into Swift.  This one 
>>> also has two versions tagged, 0.0.1 and 0.0.2, each matching corresponding 
>>> version within C repository.
>>> - Finally, I have an application in Swift, that uses the wrapper package as 
>>> a dependency and specifies 0.0.1 as the desired version.
>>> 
>>> Normally, I would expect that `swift build` would have PM to check out 
>>> 0.0.1/0.0.1 versions of the repositories (SwiftWrapper/CLibrary).  However, 
>>> in fact what I get is 0.0.1/0.0.2 (in other words, I get the right - older 
>>> - version of the wrapper package, but wrong - new - version of the C 
>>> submodule).
>>> 
>>> The use case is to “escort” a C library that is being continuously 
>>> developed and used as such (e.g. in Linux community) with its Swift bridge 
>>> without having to copy-paste the sources from the original repo into the 
>>> mirror (so that the Swift wrapper would only provide the Package.swift, 
>>> public header file, and possible a modulemap).
>>> 
>>> If there are other (better) way to do this, I will be glad to hear.
>>> 
>>> Thanks for the help.
>>> Cheers,
>>> Anton
>>> 
>> 
> 



[swift-users] Custom little/big endian data structure.

2016-10-30 Thread Adrian Zubarev via swift-users
Hi there,

is there actually a way to build a custom data structure that will 
automatically be converted to little/big endian on a little/big endian system, 
just like (U)Int16/32/64 do?

I could build a mechanism that does this for me as a workaround, but I’m 
curious whether it’s possible without one.

Specifically, I’m talking about a 128-bit data structure.
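(For context: there is no built-in 128-bit integer, but one way to approximate the (U)IntN behaviour is to compose two UInt64 halves and define the swap by hand. A sketch; the `UInt128` name and high/low layout here are hand-rolled illustrations, not a standard type:)

```swift
// Sketch: a 128-bit value composed of two UInt64 halves, with the same
// byte-order conveniences the fixed-width integers offer. `UInt128` is
// a hand-rolled illustration, not a standard library type.
struct UInt128 {
    var high: UInt64   // most significant 64 bits, native order
    var low: UInt64    // least significant 64 bits, native order

    init(high: UInt64, low: UInt64) {
        self.high = high
        self.low = low
    }

    // Swapping 128 bits = swap each half, then swap the halves.
    var byteSwapped: UInt128 {
        return UInt128(high: low.byteSwapped, low: high.byteSwapped)
    }

    // Mirrors UInt64(bigEndian:): reinterpret a big-endian value natively.
    init(bigEndian value: UInt128) {
        #if _endian(big)
        self = value
        #else
        self = value.byteSwapped
        #endif
    }
}
```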

Best regards,

-- 
Adrian Zubarev
Sent with Airmail


Re: [swift-users] dispatch concurrent map: is this right?

2016-10-30 Thread Dave Abrahams via swift-users

on Sun Oct 30 2016, Karl  wrote:

>> On 30 Oct 2016, at 09:15, Karl  wrote:
>> 
>> I had the need for a concurrent map recently. I had a part of a
>> program which needed to read chunks of data and concurrently process
>> them and assemble the results in an array. This isn’t necessarily as
>> obvious as it sounds, because of arrays being value types. I came up
>> with the following snippet which I’d like to check for correctness;
>> it could also be helpful to others.
>> 
>> Perhaps this is something Dispatch should provide out-of-the-box?
>> 
>> - Karl
>
> Ah one second, I was refactoring this and forgot to test it. Here’s the 
> actual code:

A map presumably requires an input 

> extension DispatchQueue {
>
>   static func concurrentMap<T>(iterations: Int, execute block: (Int) -> T) -> [T] {
>
> let __result = UnsafeMutableRawBufferPointer.allocate(count: iterations * MemoryLayout<T>.stride)
> defer { __result.deallocate() }
> let _result  = __result.baseAddress?.assumingMemoryBound(to: T.self)

You never bound the memory to T, so this will be undefined behavior.  

> let result   = UnsafeMutableBufferPointer(start: _result, count: iterations)
> concurrentPerform(iterations: iterations) { idx in
>   result[idx] = block(idx)

You also never initialized the Ts in that memory region, so assigning
into them will also be undefined behavior.

> }
> return Array(result)
>   }
> }
>
> extension Array {
>   func concurrentMap<T>(execute block: (Element)->T) -> [T] {
> return DispatchQueue.concurrentMap(iterations: count) { block(self[$0]) }
>   }
> }
>
> Unfortunately I don’t think there’s a way to get an array to take over a +1
> UnsafeMutableBufferPointer without copying.

The only correct way to do this without creating intermediate storage is
to have a way to initialize your result elements, e.g.:

  import Dispatch

  protocol DefaultInitializable {
init()
  }

  extension RandomAccessCollection {
func concurrentMap<T>(_ transform: (Iterator.Element)->T) -> [T]
where T : DefaultInitializable {
  var result = Array(
repeating: T(), count: numericCast(self.count))

  DispatchQueue.concurrentPerform(iterations: result.count) {
offset in 
result[offset] = transform(
  self[index(startIndex, offsetBy: numericCast(offset))])
  }
  return result
}
  }

  extension Int : DefaultInitializable {  }

  print((3..<20).concurrentMap { $0 * 2 })

If you don't want the DefaultInitializable requirement (or some other
way to prepare initialized elements), you'll need to manage memory
yourself:

  extension RandomAccessCollection {
func concurrentMap<T>(_ transform: (Iterator.Element)->T) -> [T] {
  let n = numericCast(self.count) as Int
  let p = UnsafeMutablePointer<T>.allocate(capacity: n)
  defer { p.deallocate(capacity: n) }

  DispatchQueue.concurrentPerform(iterations: n) {
offset in
(p + offset).initialize(
  to: transform(
self[index(startIndex, offsetBy: numericCast(offset))]))
  }

  return Array(UnsafeMutableBufferPointer(start: p, count: n))
}
  }

This posting highlights a couple of weaknesses in the standard library
for which I'd appreciate bug reports:

1. No way to arbitrarily initialize an Array's storage.
2. UnsafeMutableBufferPointer doesn't have an allocating init

Thanks!

-- 
-Dave



Re: [swift-users] dispatch concurrent map: is this right?

2016-10-30 Thread Karl via swift-users

> On 30 Oct 2016, at 09:15, Karl  wrote:
> 
> I had the need for a concurrent map recently. I had a part of a program which 
> needed to read chunks of data and concurrently process them and assemble the 
> results in an array. This isn’t necessarily as obvious as it sounds, because 
> of arrays being value types. I came up with the following snippet which I’d 
> like to check for correctness; it could also be helpful to others.
> 
> Perhaps this is something Dispatch should provide out-of-the-box?
> 
> - Karl

Ah one second, I was refactoring this and forgot to test it. Here’s the actual 
code:

extension DispatchQueue {

  static func concurrentMap<T>(iterations: Int, execute block: (Int) -> T) -> [T] {

let __result = UnsafeMutableRawBufferPointer.allocate(count: iterations * MemoryLayout<T>.stride)
defer { __result.deallocate() }
let _result  = __result.baseAddress?.assumingMemoryBound(to: T.self)
let result   = UnsafeMutableBufferPointer(start: _result, count: iterations)
concurrentPerform(iterations: iterations) { idx in
  result[idx] = block(idx)
}
return Array(result)
  }
}

extension Array {
  func concurrentMap<T>(execute block: (Element)->T) -> [T] {
return DispatchQueue.concurrentMap(iterations: count) { block(self[$0]) }
  }
}


Unfortunately I don’t think there’s a way to get an array to take over a +1 
UnsafeMutableBufferPointer without copying.

- Karl


[swift-users] dispatch concurrent map: is this right?

2016-10-30 Thread Karl via swift-users
I had the need for a concurrent map recently. I had a part of a program which 
needed to read chunks of data and concurrently process them and assemble the 
results in an array. This isn’t necessarily as obvious as it sounds, because of 
arrays being value types. I came up with the following snippet which I’d like 
to check for correctness; it could also be helpful to others.

Perhaps this is something Dispatch should provide out-of-the-box?

- Karl


extension DispatchQueue {

  static func concurrentMap<T>(iterations: Int, execute block: (Int) -> T) -> [T] {
var result = Array<T>()
result.reserveCapacity(iterations)

result.withUnsafeMutableBufferPointer { (results: inout UnsafeMutableBufferPointer<T>) in
  concurrentPerform(iterations: iterations) { idx in
results[idx] = block(idx)
  }
}

return result
  }
}

extension Array {
  func concurrentMap<T>(execute block: (Element)->T) -> [T] {
return DispatchQueue.concurrentMap(iterations: count) { block(self[$0]) }
  }
}
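[Archive note: a likely reason this first version misbehaved (my reading; the thread only says it was untested) is that `reserveCapacity` reserves storage without changing `count`, so `withUnsafeMutableBufferPointer` on the still-empty array hands the closure a zero-length buffer and the `results[idx]` writes land out of bounds. A quick check:]

```swift
// reserveCapacity affects capacity, not count: the buffer the closure
// receives reflects count, so it is empty here.
var a = [Int]()
a.reserveCapacity(10)

var observedCount = -1
a.withUnsafeMutableBufferPointer { buffer in
    observedCount = buffer.count
}
print(observedCount)  // 0
```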