[julia-users] Return value discrepancy between `foo[i] = x` and `setindex!(foo, x, i)`

2015-08-16 Thread Kenta Sato
I thought that `foo[i] = x` is syntactic sugar for `setindex!(foo, x, i)` and 
hence that the return values are identical in both cases. This is suggested by a 
section of the manual: 
http://julia.readthedocs.org/en/release-0.3/stdlib/collections/#indexable-collections

setindex!(collection, value, key...)

 Store the given value at the given key or index within a collection. The 
 syntax a[i,j,...] = x is converted by the compiler to setindex!(a, x, i, j, ...).


But the following code doesn't behave that way:

type Foo; end

# setindex! that deliberately returns something other than the stored value
function Base.setindex!(foo::Foo, x, i)
    return 100
end

let
    foo = Foo()
    @show (foo[1] = 1)
    @show (setindex!(foo, 1, 1))
end

Actual:
foo[1] = 1 = 1
setindex!(foo,1,1) = 100


Expected:
foo[1] = 1 = 100
setindex!(foo,1,1) = 100


So my question is: which is the intended behavior?
I think it is unreasonable for the return value of `setindex!` to be discarded 
when the call is written as `foo[i] = x`, if `foo[i] = x` really is converted to 
`setindex!(foo, x, i)`.


Re: [julia-users] Return value discrepancy between `foo[i] = x` and `setindex!(foo, x, i)`

2015-08-16 Thread Stefan Karpinski
This is intentional. It's more like syntactic sugar for `(setindex!(foo, x, i); x)`. 
The documentation should be updated.
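
For illustration, here is a minimal sketch of that lowering behavior (the `Bar` type 
and the `:ignored` return value are made up for this example, not part of the thread): 
in Julia an assignment expression evaluates to its right-hand side, so `foo[i] = x` 
evaluates to `x` no matter what `setindex!` returns, which also keeps chained 
assignments like `a = (b[1] = 0)` predictable.

type Bar; end

# A setindex! whose return value is deliberately not the stored value.
function Base.setindex!(b::Bar, x, i)
    return :ignored
end

let
    b = Bar()
    y = (b[1] = 42)            # lowered roughly as (setindex!(b, 42, 1); 42)
    @show y                    # prints y = 42, not :ignored
    @show setindex!(b, 42, 1)  # calling setindex! directly returns :ignored
end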
