[julia-users] Re: ANN: SingularIntegralEquations.jl release v0.0.1
This is really cool! I was actually looking for something like Lemma 5.9 in your preprint, and definitely did not expect to find it on julia-users. Thanks for saving me the time! On Tuesday, July 7, 2015 at 5:41:36 AM UTC-7, richard@maths.ox.ac.uk wrote: This is to announce the v0.0.1 release of the new package SingularIntegralEquations.jl https://github.com/ApproxFun/SingularIntegralEquations.jl Built on top of the fast linear algebra for function approximation in ApproxFun, this new package solves problems in acoustic scattering (Helmholtz and gravity Helmholtz equations), potential theory (Laplace equation), fracture mechanics, and Riemann--Hilbert problems. There is a preprint available on the package's readme for algorithmic details. Joint work with Sheehan Olver.
Re: [julia-users] Re: questions about coroutines
One of the limitations I find with the current implementation of tasks is the lock-step nature of produce/consume, as well as its type instability. Channels, which are type-stable, should be much faster. Tasks should communicate via channels and I hope to have produce/consume use them too. Filling an array, or a lazy stream generator, will naturally be faster - I don't know the overhead of task switching (when using a mix of tasks and channels) at this time though. On Tue, Jul 7, 2015 at 9:57 PM, andrew cooke and...@acooke.org wrote: sorry, i phrased that really poorly. what i meant was, they seem to be intended as a kind of everything-included replacement for multiple threading. for example, the PR you link to adds channels. i assumed that was why they have a reputation for being too slow to be useful in cases where, say, you can just allocate an array, fill it in a loop, and return the array. or am i wrong about that too? maybe it would be clearer if i explained a motivating example. i recently wrote a parser combinator using a trampoline approach. obviously, i could have used Tasks to simplify the code. but i didn't because i assumed they would be way slower than doing the trampoline by hand. now i don't have a reference for that, but the comments earlier in this thread say that they are type unstable, which would certainly not help. so what i am asking, i guess, is: (1) am i wrong about Tasks being slow? (2) if not, are there any plans for a faster, simpler, type-stable coroutine? (3) or even just lazy streams, if that helps with type stability? andrew On Tuesday, 7 July 2015 10:02:20 UTC-3, Amit Murthy wrote: Tasks are lightweight, not at all like threads. You can literally have thousands of them without any issues. Multi-threaded julia is a work in progress. Currently each julia process has a single thread of execution and tasks switch whenever there is I/O involved or on an explicit yield(). libuv provides the underlying event-driven I/O, timers, etc.
On Tue, Jul 7, 2015 at 6:02 PM, andrew cooke and...@acooke.org wrote: it seems to me that coroutines (Tasks) in julia are very much intended for heavyweight multithread use. but lazy streams are really useful in general (they transformed how i use python), and even full-on coroutines can be useful in some single-threaded applications. so i wonder if there's a need for a lighter-weight implementation? maybe something already exists - it's not clear to me this needs to be part of the base language. andrew
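To make the produce/consume-versus-channels point from this thread concrete, here is a small sketch of task-plus-channel generation. The Channel API shown is the one the referenced PR eventually became in later Base releases (spelled here in post-0.4 syntax, as an illustration rather than code that ran at the time):

```julia
# A task fills the channel; the consumer iterates it lazily.
# Channel{Int} makes the communication type-stable, unlike produce/consume,
# which the thread above describes as type-unstable.
ch = Channel{Int}() do c
    for i in 1:5
        put!(c, i^2)   # the channel analogue of produce(i^2)
    end
end                     # the channel closes when the feeding task returns

squares = collect(ch)   # drain the whole stream: [1, 4, 9, 16, 25]
```

take!(ch) pulls one value at a time, which is the lock-step pattern discussed above; collect simply drains the stream.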
Re: [julia-users] Errors while trying to install Cxx Package
Ok, thanks a lot for your help! On Tuesday, July 7, 2015 at 6:03:12 PM UTC+2, Keno Fischer wrote: Yes, that is correct, just create a Make.user file with that content in the same directory as your julia source install (where the Make.inc file already is). The `LLVM_VER=svn` line directs it to use the svn version of llvm. On Tue, Jul 7, 2015 at 11:59 AM, Kostas Tavlaridis-Gyparakis kostas.t...@gmail.com wrote: Ok, thank you for pointing that out. So, I need to uninstall my current version and install from the source files. Just two really noob questions. The README says that I need to add the following lines to a Make.user:

override LLDB_VER=master
override LLVM_VER=svn
override LLVM_ASSERTIONS=1
override BUILD_LLVM_CLANG=1
override BUILD_LLDB=1
override USE_LLVM_SHLIB=1
override LLDB_DISABLE_PYTHON=1

Does this mean that I just create a file named Make.user, copy-paste these lines into it, and save it in the directory where the source was downloaded? Also, when you say one that uses LLVM-svn, I'm not sure what I need to check for this. On Tuesday, July 7, 2015 at 5:36:49 PM UTC+2, Keno Fischer wrote: Please see the instructions in the Cxx.jl README.
In particular, you need (at the moment at least) - a source install of julia - one that uses LLVM-svn On Tue, Jul 7, 2015 at 11:33 AM, Kostas Tavlaridis-Gyparakis kostas.t...@gmail.com wrote: [...]
Re: [julia-users] Re: Convert Array{Tuple} to Matrix
Are you using 0.3 or 0.4? You could try making `SpatialData` a subtype of `AbstractArray`. It's possible to do on either version, but it's substantially easier on 0.4. Then you could use any of the methods in base that are defined for `AbstractArray` (including `maximum`) directly. If you're on 0.4, you could crib off of a type I created to test the indexing behaviors: https://github.com/JuliaLang/julia/blob/master/test/abstractarray.jl#L30-L69 On Tuesday, July 7, 2015 at 12:05:15 PM UTC-4, Júlio Hoffimann wrote: What I'm actually trying to do is create a type for spatial data: typealias SpatialData Dict{Tuple{Integer,Integer,Integer},Real} data = SpatialData([(i,j,k) => rand() for i=1:10, j=1:20, k=1:30]) However I need to do some checks on the size of the bounding box, (10,20,30) in this case. What I'm doing right now is loop over keys(data), but I would like to make it a matrix and simply apply the builtin maximum() to the columns. Can anyone suggest a better design? How should I create such a type to support these checks? I also tried to play with sparse matrices, but I need 3D spatial data representation. -Júlio
Re: [julia-users] Re: Convert Array{Tuple} to Matrix
What I'm actually trying to do is create a type for spatial data: typealias SpatialData Dict{Tuple{Integer,Integer,Integer},Real} data = SpatialData([(i,j,k) => rand() for i=1:10, j=1:20, k=1:30]) However I need to do some checks on the size of the bounding box, (10,20,30) in this case. What I'm doing right now is loop over keys(data), but I would like to make it a matrix and simply apply the builtin maximum() to the columns. Can anyone suggest a better design? How should I create such a type to support these checks? I also tried to play with sparse matrices, but I need 3D spatial data representation. -Júlio
[julia-users] what are hard and soft bindings?
i'm trying to understand the difference between using and importall. i have the same confusion described at https://github.com/JuliaLang/julia/issues/11031 but, unlike the OP there, reading https://github.com/JuliaLang/julia/issues/8000 had not clarified things for me. thanks, andrew
Re: [julia-users] Re: questions about coroutines
sorry, i phrased that really poorly. what i meant was, they seem to be intended as a kind of everything-included replacement for multiple threading. for example, the PR you link to adds channels. i assumed that was why they have a reputation for being too slow to be useful in cases where, say, you can just allocate an array, fill it in a loop, and return the array. or am i wrong about that too? maybe it would be clearer if i explained a motivating example. i recently wrote a parser combinator using a trampoline approach. obviously, i could have used Tasks to simplify the code. but i didn't because i assumed they would be way slower than doing the trampoline by hand. now i don't have a reference for that, but the comments earlier in this thread say that they are type unstable, which would certainly not help. so what i am asking, i guess, is: (1) am i wrong about Tasks being slow? (2) if not, are there any plans for a faster, simpler, type-stable coroutine? (3) or even just lazy streams, if that helps with type stability? andrew On Tuesday, 7 July 2015 10:02:20 UTC-3, Amit Murthy wrote: Tasks are lightweight, not at all like threads. You can literally have thousands of them without any issues. Multi-threaded julia is a work in progress. Currently each julia process has a single thread of execution and tasks switch whenever there is I/O involved or on an explicit yield(). libuv provides the underlying event-driven I/O, timers, etc. On Tue, Jul 7, 2015 at 6:02 PM, andrew cooke and...@acooke.org wrote: it seems to me that coroutines (Tasks) in julia are very much intended for heavyweight multithread use. but lazy streams are really useful in general (they transformed how i use python), and even full-on coroutines can be useful in some single-threaded applications. so i wonder if there's a need for a lighter-weight implementation? maybe something already exists - it's not clear to me this needs to be part of the base language. andrew
Re: [julia-users] Re: Convert Array{Tuple} to Matrix
Hi Matt, That is a very good suggestion! I'm using Julia 0.4, how would you retrieve the locations from an AbstractArray? Is immutable SpatialData <: AbstractArray{Real,3} end all that is needed to define the type? -Júlio
Re: [julia-users] Re: Convert Array{Tuple} to Matrix
Are the 'unset' indices defined to have some value in your application? If not, trying to shoehorn this into an `AbstractArray` probably won't work very well. Also note that, while correct, using the methods in base will be slow if you have very large and very sparse matrices. In defining the type, you still need to store that dictionary somewhere. Here's a quick simplification for a minimal definition:

immutable SpatialData{T} <: AbstractArray{T,3}
    data::Dict{NTuple{3,Int}, T}
    dims::NTuple{3,Int}
end
SpatialData{T}(::Type{T}, d1::Int, d2::Int, d3::Int) = SpatialData(T, (d1,d2,d3))
SpatialData{T}(::Type{T}, dims::NTuple{3,Int}) = SpatialData(Dict{NTuple{3,Int}, T}(), dims)
Base.size(A::SpatialData) = A.dims
Base.similar{T}(A::SpatialData, ::Type{T}, dims::NTuple{3,Int}) = SpatialData(T, dims)
Base.getindex{T}(A::SpatialData{T}, i1::Int, i2::Int, i3::Int) = (checkbounds(A,i1,i2,i3); get(A.data,(i1,i2,i3),zero(T)))
Base.setindex!{T}(A::SpatialData{T}, v, i1::Int, i2::Int, i3::Int) = (checkbounds(A,i1,i2,i3); A.data[(i1,i2,i3)] = v)

You then construct it and work with it like you do an array:

julia> A = SpatialData(Float64, 4, 3, 2)
4x3x2 SpatialData{Float64}:
[:, :, 1] =
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0
[:, :, 2] =
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0

julia> A[:,1] = 2.0
2.0

julia> A
4x3x2 SpatialData{Float64}:
[:, :, 1] =
 2.0  0.0  0.0
 2.0  0.0  0.0
 2.0  0.0  0.0
 2.0  0.0  0.0
[:, :, 2] =
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0
 0.0  0.0  0.0

On Tuesday, July 7, 2015 at 12:43:36 PM UTC-4, Júlio Hoffimann wrote: Hi Matt, That is a very good suggestion! I'm using Julia 0.4, how would you retrieve the locations from an AbstractArray? Is immutable SpatialData <: AbstractArray{Real,3} end all that is needed to define the type? -Júlio
Re: [julia-users] Errors while trying to install Cxx Package
Ok, thank you for pointing that out. So, I need to uninstall my current version and install from the source files. Just two really noob questions. The README says that I need to add the following lines to a Make.user:

override LLDB_VER=master
override LLVM_VER=svn
override LLVM_ASSERTIONS=1
override BUILD_LLVM_CLANG=1
override BUILD_LLDB=1
override USE_LLVM_SHLIB=1
override LLDB_DISABLE_PYTHON=1

Does this mean that I just create a file named Make.user, copy-paste these lines into it, and save it in the directory where the source was downloaded? Also, when you say one that uses LLVM-svn, I'm not sure what I need to check for this. On Tuesday, July 7, 2015 at 5:36:49 PM UTC+2, Keno Fischer wrote: Please see the instructions in the Cxx.jl README. In particular, you need (at the moment at least) - a source install of julia - one that uses LLVM-svn On Tue, Jul 7, 2015 at 11:33 AM, Kostas Tavlaridis-Gyparakis kostas.t...@gmail.com wrote: [...]
Re: [julia-users] Re: Convert Array{Tuple} to Matrix
Yes, unfortunately the unset values shouldn't be listed as 0, for instance. I'll try to move forward with my current implementation and see if the code is ok. Thanks, -Júlio
Re: [julia-users] Re: Help in understanding Julia's ways
On 7 July 2015 at 04:12, Ismael VC ismael.vc1...@gmail.com wrote: Couldn't we just get a warning and let people shoot themselves in the foot if that's what they want? Something like: Warning: Using private method/type in module Foo at foo.jl:n. I like that you are trying to find some middle ground. But if I am doing this intentionally the warning will be a constant annoyance, so, how should I silence it? Yet another command line option, so that we approach gcc/clang with different levels of warnings? Yuck... Annotations (@silence?) like Java? Double yuck... Giving users this level of power is something that I am happy with. Last week it allowed me to temporarily add a function looking deep into a composite type from a library while debugging. Sure, it can be abused, but it can equally well be used properly when necessary. Sure, it may break, but ultimately it is always up to the caller not to violate the API. Sure, sometimes base violates this idiom, but this is most likely due to historic reasons and a lack of consensus more than anything else. In the future, given enough experience and evidence to the contrary, I would be happy to reconsider my position. But for now, using idioms like the one below is how I write my Julia code.

export Weights, W, b, fanin, fanout

immutable Weights{T<:FloatingPoint}
    W::Matrix{T}
    b::Vector{T}
end

Weights(fanin, fanout) = Weights(rand(fanout, fanin)./1024, zeros(fanout))
W(w) = w.W
b(w) = w.b
fanin(w) = size(W(w), 2)
fanout(w) = size(W(w), 1)

Apologies for the terribly short variable names, it is an example after all. Pontus
[julia-users] Re: Convert Array{Tuple} to Matrix
This is reasonably clean: [y[i] for y in x, i in 1:3] If performance matters you are probably better off with the other suggestions. On Monday, July 6, 2015 at 10:10:01 PM UTC+2, Júlio Hoffimann wrote: Hi, How to convert: 1000-element Array{Tuple{Integer,Integer,Integer},1}: (10,2,1) (5,7,10) (5,7,4) (1,1,6) (2,3,6) (8,6,4) (10,2,4) (1,3,9) (9,3,7) (5,2,4) ⋮ (1,6,8) (4,6,6) (3,9,5) (10,4,10) (8,7,4) (4,8,9) (2,6,10) (3,6,5) (1,7,10) into the corresponding 1000x3 matrix in a clean fashion? -Júlio
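A tiny concrete check of the comprehension suggested above (the array contents here are made up for illustration):

```julia
x = [(1, 2, 3), (4, 5, 6)]         # a Vector of 3-tuples, as in the question

M = [y[i] for y in x, i in 1:3]    # one row per tuple, one column per element

# M == [1 2 3; 4 5 6], a 2x3 matrix
```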
[julia-users] Re: Julia-lang TCO / femto-lisp TCO
On Tuesday, July 7, 2015 at 11:11:19 AM UTC-4, Steven Sagaert wrote: see http://blog.zachallaun.com/post/jumping-julia to work around not having TCO and still use recursion to traverse LARGE data structures without stack overflow. That's also how a bunch of other languages (e.g. Scala, F#) do this (it's called trampolining). (You could also just use a loop.)
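Both workarounds can be sketched in a few lines. The trampoline below is a hypothetical minimal version of the idea, not the one from the linked post:

```julia
# Without TCO, deep recursion overflows the stack. A trampoline avoids this:
# each "tail call" returns a zero-argument closure instead of recursing, and a
# driver loop keeps invoking thunks until a non-function value comes back.
function trampoline(f, args...)
    v = f(args...)
    while isa(v, Function)
        v = v()
    end
    v
end

sumto(n, acc=0) = n == 0 ? acc : () -> sumto(n - 1, acc + n)

trampoline(sumto, 100_000)   # 5000050000, with no risk of stack overflow

# Or, as suggested, just use a loop (faster: no closure allocation per step):
function sumto_loop(n)
    s = 0
    for i in 1:n
        s += i
    end
    s
end
```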
[julia-users] Re: Julia-lang TCO / femto-lisp TCO
no, not really. On Tuesday, 7 July 2015 16:28:01 UTC-3, Steven G. Johnson wrote: On Tuesday, July 7, 2015 at 11:11:19 AM UTC-4, Steven Sagaert wrote: see http://blog.zachallaun.com/post/jumping-julia to work around not having TCO and still use recursion to traverse LARGE data structures without stack overflow. That's also how a bunch of other languages (e.g. Scala, F#) do this (it's called trampolining). (You could also just use a loop.)
[julia-users] Solving nonlinear equations quickly using FastAnonymous @anon and Julia 0.4
I'm writing this in case other people are trying to do the same thing I've done, and also to see if anyone has any suggestions. Recently I have been writing some code that requires solving lots (tens of thousands) of simple nonlinear equations. The application is economics: I am solving an intratemporal first-order condition for optimal labor supply given the state and a savings decision. This requires solving the same equation many times, but with different parameters. As far as I know, the standard ways to do this are to either define a nested function which, by the lexical scoping rules, inherits the parameters of the outer function, or use an anonymous function. Both these methods are slow right now because Julia can't inline those functions. However, the FastAnonymous package lets you define an anonymous function which behaves exactly like a function but isn't of type ::Function, and which is fast. Crucially for me, in Julia 0.4 you can modify the parameters of the function you get out of FastAnonymous. I rewrote some code I had which depended on solving a lot of nonlinear equations, and it's now 3 times as fast, running in 2s instead of 6s. Here I'll describe a simplified version of my setup and point out a few issues. 1. I store the anonymous function in a type that I will pass along to the function which needs to solve the nonlinear equation. I use a parametric type here since the type of an anonymous function seems to vary with every instance.
For example,

typeof(UF.fhoursFOC)
FastAnonymous.##Closure#11431{Ptr{Void} @0x7f2c2eb26e30,0x10e636ff02d85766,(:h,)}

To construct the type,

immutable CRRA_labor{T1, T2} <: LaborChoice # <: means subtype of
    sigmac::Float64
    sigmal::Float64
    psi::Float64
    hoursmax::Float64
    state::State
    # Encodes info on how to solve itself
    fhoursFOC::T1
    fJACOBhoursFOC::T2
end

To set up the anonymous functions fhoursFOC and fJACOBhoursFOC (the jacobian), I define a constructor

function CRRA_labor(sigmac,sigmal,psi,hoursmax,state)
    fhoursFOC = @anon h -> hoursFOC(CRRA_labor(sigmac,sigmal,psi,hoursmax,state,0.,0.), h, state)
    fJACOBhoursFOC = @anon jh -> JACOBhoursFOC(CRRA_labor(sigmac,sigmal,psi,hoursmax,state,0.,0.), jh, state)
    CRRA_labor(sigmac,sigmal,psi,hoursmax,state,fhoursFOC,fJACOBhoursFOC)
end

This looks a bit complicated because the nonlinear equation I need to solve, hoursFOC, relies on the type CRRA_labor, as well as some aggregate and idiosyncratic state info, to set up the problem. To encode this information, I define a dummy instance of CRRA_labor, where I supply 0's in place of the anonymous functions. I tried to make a self-referential type here as described in the documentation, but I couldn't get it to work, so I went with the dummy instance instead. @anon sets up the anonymous function. This means that code like fhoursFOC(0.5) will return a value. 2. Now that I have my anonymous function taking only 1 variable, I can use the nonlinear equation solver. Unfortunately, the existing nonlinear equation solvers like Roots.fzero and NLsolve require the argument to be of type ::Function. Since anonymous functions work like functions but are actually some different type, they wouldn't accept my argument. Instead, I wrote my own Newton method, which is like 5 lines of code, where I don't restrict the argument type. I think it would be very straightforward to make this a multivariate Newton method.
function myNewton(f, j, x)
    for n = 1:100
        fx, jx = f(x), j(x)
        abs(fx) < 1e-6 && return x
        d = fx/jx
        x = x - d
    end
    println("Too many iterations")
    return NaN
end

3. The useful thing here in 0.4 is that you can edit the parameters of the anonymous function. The parameters are encoded in a custom type state::State, and I update the state. Then I call my nonlinear equation solver:

UF.fhoursFOC.state, UF.fJACOBhoursFOC.state = state, state
f = UF.fhoursFOC
j = UF.fJACOBhoursFOC
hours = myNewton(f, j, hoursguess)

This runs much faster than my old version which used NLsolve, which itself ran faster than a version using Roots.fzero. Issues: 1. Since the type of the anonymous function isn't ::Function, I had to write my own solver. I'm pretty sure a 1-line edit to Roots.fzero where I just remove the ::Function type annotation would let it work there, but I'm not aware of another workaround. 2. I would rather use NLsolve, which uses in-place updating of its arguments ( f!(input::Array, output::Array) ), but I've tried constructing an anonymous function that does that, and @anon didn't work. Perhaps there is a workaround. 3. Since I'm using an anonymous function, I have to explicitly pass it around. Encoding it into the type CRRA_labor wasn't really hard though.
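As a standalone sanity check of the Newton iteration sketched in the post (using a made-up toy equation, since hoursFOC itself isn't shown; newton here mirrors myNewton but takes plain functions):

```julia
# Newton's method: x <- x - f(x)/f'(x), stopping once |f(x)| is below tolerance.
function newton(f, fprime, x)
    for n in 1:100
        fx = f(x)
        abs(fx) < 1e-6 && return x
        x -= fx / fprime(x)
    end
    error("too many iterations")
end

# Solve x^2 - 2 = 0 from a starting guess of 1.0; converges to sqrt(2).
root = newton(x -> x^2 - 2, x -> 2x, 1.0)
```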
[julia-users] Efficient way to compute X' diag(w) Y
A lot of statistical algorithms require computing X' diag(w) Y, where X and Y are two matrices and w is a vector of weights (Y may or may not equal X). Is there a way to compute this product efficiently in Julia?
Re: [julia-users] what are hard and soft bindings?
I think his summary in #11031 is accurate, and the reason given directly in the reply there is the reason using/importall are different. Issue #8000 is a discussion of ways to change/simplify the syntax. I'll take a stab at rephrasing the difference in case that helps. There is only one difference, and on the surface (syntax-wise) it may seem very minor. The difference between using and importall is that with using you need to say function Foo.bar(... to extend module Foo's function bar with a new method, but with importall or import Foo.bar, you only need to say function bar(... and it automatically extends module Foo's function bar. If you use importall, then function Foo.bar(... and function bar(... become equivalent. If you use using, then they are different. The reason this is important enough to have been given separate syntax is that you don't want to accidentally extend a function that you didn't know existed, because that could easily cause a bug. This is most likely to happen with a method that takes a common type like a string or an int, because both you and the other module could define a method to handle such a common type. If you use importall, then you'll replace the other module's implementation of bar(s::String) with your new implementation, which could easily do something completely different (and break all/many future usages of the other functions in module Foo that depend on calling bar). Does this make more sense to you? (And is it an answer to the question you were asking, or did I misunderstand?) Best, Leah On Tue, Jul 7, 2015 at 9:05 AM, andrew cooke and...@acooke.org wrote: i'm trying to understand the difference between using and importall. i have the same confusion described at https://github.com/JuliaLang/julia/issues/11031 but, unlike the OP there, reading https://github.com/JuliaLang/julia/issues/8000 had not clarified things for me. thanks, andrew
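A minimal sketch of the distinction (the module, function, and types are made up; import Foo: bar is the single-name form, and importall Foo does the same for every exported name at once):

```julia
module Foo
export bar
bar(x::Int) = x + 1
end

using .Foo                   # bar is callable here, but NOT extendable:
# bar(x::String) = x * "!"   # ERROR: bar must be qualified or imported to extend

function Foo.bar(x::String)  # with using, extension needs the qualified name
    x * "!"
end

import .Foo: bar             # after an explicit import (or importall Foo)...
bar(x::Float64) = x + 0.5    # ...an unqualified definition extends Foo.bar
```

After this, bar(1), bar("hi"), and bar(1.0) all dispatch to methods of the single function Foo.bar.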
[julia-users] Re: ANN: SingularIntegralEquations.jl release v0.0.1
Those examples look really impressive. I'm wondering if the first figure in the README actually corresponds to the code block above it. The code looks like it is describing a plane wave incident at a 45 degree angle on a single plate, but the figure has a rotation symmetry, with several plates and circles. On Tuesday, July 7, 2015 at 5:41:36 AM UTC-7, richard@maths.ox.ac.uk wrote: This is to announce the v0.0.1 release of the new package SingularIntegralEquations.jl https://github.com/ApproxFun/SingularIntegralEquations.jl Built on top of the fast linear algebra for function approximation in ApproxFun, this new package solves problems in acoustic scattering (Helmholtz and gravity Helmholtz equations), potential theory (Laplace equation), fracture mechanics, and Riemann--Hilbert problems. There is a preprint available on the package's readme for algorithmic details. Joint work with Sheehan Olver.
[julia-users] Re: Solving nonlinear equations quickly using FastAnonymous @anon and Julia 0.4
It isn't your first choice, but `Roots.fzero` can have `@anon` functions passed to it, unless I forgot to tag a new version after making that change on master not so long ago. On Tuesday, July 7, 2015 at 2:29:51 PM UTC-4, Andrew wrote: I'm writing this in case other people are trying to do the same thing I've done, and also to see if anyone has any suggestions. Recently I have been writing some code that requires solving lots (tens of thousands) of simple nonlinear equations. The application is economics: I am solving an intratemporal first-order condition for optimal labor supply given the state and a savings decision. This requires solving the same equation many times, but with different parameters. As far as I know, the standard ways to do this are to either define a nested function which, by the lexical scoping rules, inherits the parameters of the outer function, or use an anonymous function. Both these methods are slow right now because Julia can't inline those functions. However, the FastAnonymous package lets you define an anonymous function which behaves exactly like a function but isn't of type ::Function, and which is fast. Crucially for me, in Julia 0.4 you can modify the parameters of the function you get out of FastAnonymous. I rewrote some code I had which depended on solving a lot of nonlinear equations, and it's now 3 times as fast, running in 2s instead of 6s. Here I'll describe a simplified version of my setup and point out a few issues. 1. I store the anonymous function in a type that I will pass along to the function which needs to solve the nonlinear equation. I use a parametric type here since the type of an anonymous function seems to vary with every instance.
[julia-users] Efficient way to compute X' diag(w) Y
If X and Y may be the same array, then you should of course not use scale! .
[julia-users] Efficient way to compute X' diag(w) Y
X'*scale(w,Y) should do the job. Or, if you can afford to destroy Y (or more accurately, to replace it by diag(w)*Y), you could do X'*scale!(w,Y), which should allocate less (and therefore be slightly faster).
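A quick check that the suggested forms agree (a sketch; scale(w, Y) is the 0.4-era spelling of row scaling, and Diagonal gives the same product without ever materializing a dense diag(w)):

```julia
using LinearAlgebra   # for Diagonal (in 0.4-era Julia this was in Base)

X = rand(5, 3); Y = rand(5, 2); w = rand(5)

A = X' * Diagonal(w) * Y   # diag(w) as a lazy diagonal operator
B = X' * (w .* Y)          # broadcasting scales row i of Y by w[i]

# A and B agree up to floating-point roundoff: isapprox(A, B)
```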
[julia-users] Re: Julia-lang TCO / femto-lisp TCO
see http://blog.zachallaun.com/post/jumping-julia to work around not having TCO and still use recursion to traverse LARGE data structures without stack overflow. That's also how a bunch of other languages (e.g. Scala, F#) do this (it's called trampolining). On Sunday, November 24, 2013 at 3:49:14 PM UTC+1, Piotr X wrote: Hi! First of all, let me thank you for bringing another language to the world! Always good to see a new star in the sky. Coming from the Python and Scheme side of the table (well, it is a round table..), I have a couple of quick questions before getting my hands on Julia: 1) I understand that femto-lisp was integrated into Julia. 1.1) Is it possible to work mainly with femto-lisp, using Julia-lang packages, its concurrency/parallel computing libraries and its fast program-generating compiler? 1.2) Does the Julia-lang implementation of femto-lisp have TCO (tail call optimization)? 1.3) What important Scheme-like features are missing from femto-lisp? 2) I found a Julia-Dev thread on TCO. Is there any news on that? Regards, Piotr
[julia-users] Performance of SharedArrays in a parallel loop
Hi. I was studying parallel methods with some silly routines. After some tries, I realized that reading SharedArrays is slower than reading regular arrays, although I cannot understand the reason. I am using the latest nightly version. As an example, routine P2 takes more time than routine P1, and the only difference is the fact that A is shared. Is this expected? https://gist.github.com/CodeLenz/776c138c2c49dbd4a2d5 Thanks for your help.
[julia-users] Errors while trying to install Cxx Package
Hello, I am running the following version of Julia: Julia Version 0.4.0-dev+5809 Commit b414076* (2015-07-06 15:38 UTC) Platform Info: System: Linux (x86_64-linux-gnu) CPU: Intel(R) Core(TM) i5-4300U CPU @ 1.90GHz WORD_SIZE: 64 BLAS: libopenblas (NO_LAPACKE DYNAMIC_ARCH NO_AFFINITY Haswell) LAPACK: liblapack.so.3 LIBM: libopenlibm LLVM: libLLVM-3.3 And when I try to add the Cxx package I receive the following error: Pkg.build("Cxx") INFO: Building Cxx Tuning for julia installation at: /usr/bin BuildBootstrap.Makefile:2: /usr/bin/../../deps/Versions.make: No such file or directory BuildBootstrap.Makefile:3: /usr/bin/../../Make.inc: No such file or directory make: *** No rule to make target '/usr/bin/../../Make.inc'. Stop. =[ ERROR: Cxx ]= LoadError: failed process: Process(`make -f BuildBootstrap.Makefile JULIA_HOME=/usr/bin`, ProcessExited(2)) [2] while loading /home/kostav/.julia/v0.4/Cxx/deps/build.jl, in expression starting on line 16 [ BUILD ERRORS ] WARNING: Cxx had build errors. - packages with build errors remain installed in /home/kostav/.julia/v0.4 - build the package(s) and all dependencies with `Pkg.build("Cxx")` - build a single package by running its `deps/build.jl` script I did try to search online about it and found some posts but didn't manage to solve the issue, so in case there are any suggestions I would be really glad to hear them.
Re: [julia-users] Errors while trying to install Cxx Package
Please see the instructions in the Cxx.jl README. In particular, you need (at the moment at least): - a source install of julia - one that uses LLVM-svn
[julia-users] Re: Efficient way to compute X' diag(w) Y
Thanks, this is what I currently do :) However, I'd like to find a solution that is both memory-efficient (X can be very large) and that does not modify X in place. Basically, I'm wondering whether there is a BLAS subroutine that would allow computing cross(X, w, Y) in one pass, without creating an intermediate matrix as large as X or Y.
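As far as I know there is no single BLAS call for X' diag(w) Y, but a plain triple loop gives exactly the one-pass, no-intermediate behaviour asked about here (the function name weighted_cross is my own, for illustration):

```julia
# One-pass computation of X' * diag(w) * Y. Nothing is allocated beyond
# the m-by-n result itself -- no intermediate the size of X or Y.
function weighted_cross(X, w, Y)
    k = length(w)
    size(X, 1) == k == size(Y, 1) || throw(DimensionMismatch("incompatible sizes"))
    m, n = size(X, 2), size(Y, 2)
    R = zeros(promote_type(eltype(X), eltype(w), eltype(Y)), m, n)
    for j = 1:n, i = 1:m
        s = zero(eltype(R))
        for l = 1:k
            s += X[l, i] * w[l] * Y[l, j]   # (X')[i,l] * w[l] * Y[l,j]
        end
        R[i, j] = s
    end
    return R
end
```

This trades BLAS-level speed for zero intermediate storage; when the temporary is affordable, X'*scale(w,Y) (or the scale! variant above) will usually be faster.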
[julia-users] pyinitialize() fails with Error: could not load module python: no error
I've been trying to use PyPlot but I keep running into a problem. When I try `using PyPlot` I get the error: Warning: error initializing module PyPlot: ErrorException("could not load module python: no error") It seems the function pyinitialize() is causing the problem. When I run it I get the error: Error: could not load module python: no error in pyinitialize at C:\Users\jonathan\.julia\v0.3\PyCall\src\pyinit.jl:324 in pyinitialize at C:\Users\jonathan\.julia\v0.3\PyCall\src\pyinit.jl:334. I've looked at the following thread about pyinitialize() https://groups.google.com/forum/#!searchin/julia-users/pyinitialize()/julia-users/Al7be9lZyQw/ZT1cYo9oU9wJ and I still can't seem to work out how to fix my problem. Can someone help me? Some info: I've set the path using ENV["PYTHON"] = "C:\\Python27" but it doesn't help. My Python distribution is Python(x,y). My Julia version is 0.3.10. Python version is 2.7.9. Any ideas?
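One thing worth trying (an assumption on my part, based on how PyCall locates Python, not something confirmed in this thread): point PYTHON at the python executable itself rather than the install directory, and rebuild PyCall so the setting takes effect. The path below is illustrative.

```julia
# Point PyCall at a specific Python executable (path is hypothetical;
# adjust to where Python(x,y) put python.exe), then rebuild PyCall.
ENV["PYTHON"] = "C:\\Python27\\python.exe"   # the executable, not the folder
Pkg.build("PyCall")   # re-run PyCall's configuration against that Python
# then restart Julia and try `using PyPlot` again
```

If PyCall still cannot load the Python DLL, checking that the Python bin directory is on PATH is the other usual suspect on Windows.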
[julia-users] Re: Did something change ~8 days ago with type parameterization in 0.4?
what's T in the last chunk of code? you have typemax(T), but no T as a type parameter. is that really working? On Tuesday, 7 July 2015 20:03:24 UTC-3, Seth wrote: I have the following code: type MinCutVisitor{T} <: AbstractMASVisitor graph::SimpleGraph parities::AbstractArray{Bool,1} colormap::Vector{Int} bestweight::T cutweight::T visited::Integer distmx::AbstractArray{T, 2} vertices::Vector{Int} end function MinCutVisitor{T}(graph::SimpleGraph, distmx::AbstractArray{T, 2}) n = nv(graph) MinCutVisitor( graph, falses(n), zeros(Int,n), typemax(T), zero(T), zero(Int), distmx, @compat(Vector{Int}()) ) end and up until June 30th the outer constructor was working in 0.4: julia> g = Graph(8) {8, 0} undirected graph julia> LightGraphs.MinCutVisitor(g,spzeros(Float64,8,8)) LightGraphs.MinCutVisitor{Float64}({8, 0} undirected graph,Bool[false,false,false,false,false,false,false,false],[0,0,0,0,0,0,0,0],Inf,0.0,0,8x8 sparse matrix with 0 Float64 entries:,Int64[]) Now it fails on Float64 but not with Int: julia> LightGraphs.MinCutVisitor(g,spzeros(Float64,8,8)) ERROR: MethodError: `convert` has no method matching convert(::Type{LightGraphs.MinCutVisitor{T}}, ::LightGraphs.Graph, ::BitArray{1}, ::Array{Int64,1}, ::Float64, ::Float64, ::Int64, ::Base.SparseMatrix.SparseMatrixCSC{Float64,Int64}, ::Array{Int64,1}) This may have arisen from a call to the constructor LightGraphs.MinCutVisitor{T}(...), since type constructors fall back to convert methods. Closest candidates are: LightGraphs.MinCutVisitor{T}(::Union{LightGraphs.Graph,LightGraphs.DiGraph}, ::AbstractArray{Bool,1}, ::Array{Int64,1}, ::T, ::T, ::Integer, ::AbstractArray{T,2}, ::Array{Int64,1}) LightGraphs.MinCutVisitor{T}(::Union{LightGraphs.Graph,LightGraphs.DiGraph}, ::AbstractArray{T,2}) call{T}(::Type{T}, ::Any) ... 
in call at /Users/seth/.julia/v0.4/LightGraphs/src/maxadjvisit.jl:99 julia> LightGraphs.MinCutVisitor(g,spzeros(Int,8,8)) LightGraphs.MinCutVisitor{Int64}({8, 0} undirected graph,Bool[false,false,false,false,false,false,false,false],[0,0,0,0,0,0,0,0],9223372036854775807,0,0,8x8 sparse matrix with 0 Int64 entries:,Int64[]) However, if I remove the type parameterization from the outer constructor: function MinCutVisitor(graph::SimpleGraph, distmx::AbstractArray) n = nv(graph) MinCutVisitor( graph, falses(n), zeros(Int,n), typemax(T), zero(T), zero(Int), distmx, @compat(Vector{Int}()) ) end it works. Can someone tell me why this is?
[julia-users] Re: Solving nonlinear equations quickly using FastAnonymous @anon and Julia 0.4
Okay, this just got fixed as much as I could with v0.1.15 (there is no fzero(f,j,guess) signature). On Tuesday, July 7, 2015 at 4:38:41 PM UTC-4, Andrew wrote: Just checked. So, Roots.fzero(f, guess) does work. However, Roots.fzero(f, j, guess) doesn't work, and neither does Roots.newton(f, j, guess). I looked at the Roots.jl source and I see ::Function annotations on the methods with the jacobian, but not the regular one. On Tuesday, July 7, 2015 at 4:22:17 PM UTC-4, j verzani wrote: It isn't your first choice, but `Roots.fzero` can have `@anon` functions passed to it, unless I forgot to tag a new version after making that change on master not so long ago. On Tuesday, July 7, 2015 at 2:29:51 PM UTC-4, Andrew wrote: I'm writing this in case other people are trying to do the same thing I've done, and also to see if anyone has any suggestions. Recently I have been writing some code that requires solving lots (tens of thousands) of simple non-linear equations. The application is economics; I am solving an intratemporal first order condition for optimal labor supply given the state and a savings decision. This requires solving the same equation many times, but with different parameters. As far as I know, the standard ways to do this are to either define a nested function which by the lexical scoping rules inherits the parameters of the outer function, or use an anonymous function. Both these methods are slow right now because Julia can't inline those functions. However, the FastAnonymous package lets you define an anonymous function, which behaves exactly like a function but isn't of type ::Function, which is fast. Crucially for me, in Julia 0.4 you can modify the parameters of the function you get out of FastAnonymous. I rewrote some code I had which depended on solving a lot of non-linear equations, and it's now 3 times as fast, running in 2s instead of 6s. Here I'll describe a simplified version of my setup and point out a few issues. 1. 
I store the anonymous function in a type that I will pass along to the function which needs to solve the nonlinear equation. I use a parametric type here since the type of an anonymous function seems to vary with every instance. For example, typeof(UF.fhoursFOC) FastAnonymous.##Closure#11431{Ptr{Void} @0x7f2c2eb26e30,0x10e636ff02d85766,(:h,)} To construct the type, immutable CRRA_labor{T1, T2} <: LaborChoice # <: means "subtype of" sigmac::Float64 sigmal::Float64 psi::Float64 hoursmax::Float64 state::State # Encodes info on how to solve itself fhoursFOC::T1 fJACOBhoursFOC::T2 end To set up the anonymous functions fhoursFOC and fJACOBhoursFOC (the jacobian), I define a constructor function CRRA_labor(sigmac,sigmal,psi,hoursmax,state) fhoursFOC = @anon h -> hoursFOC(CRRA_labor(sigmac,sigmal,psi,hoursmax,state,0.,0.), h, state) fJACOBhoursFOC = @anon jh -> JACOBhoursFOC(CRRA_labor(sigmac,sigmal,psi,hoursmax,state,0.,0.), jh, state) CRRA_labor(sigmac,sigmal,psi,hoursmax,state,fhoursFOC, fJACOBhoursFOC) end This looks a bit complicated because the nonlinear equation I need to solve, hoursFOC, relies on the type CRRA_labor, as well as some aggregate and idiosyncratic state info, to set up the problem. To encode this information, I define a dummy instance of CRRA_labor, where I supply 0's in place of the anonymous functions. I tried to make a self-referential type here as described in the documentation, but I couldn't get it to work, so I went with the dummy instance instead. @anon sets up the anonymous function. This means that code like fhoursFOC(0.5) will return a value. 2. Now that I have my anonymous function taking only 1 variable, I can use the nonlinear equation solver. Unfortunately, the existing nonlinear equation solvers like Roots.fzero and NLsolve ask the argument to be of type ::Function. Since anonymous functions work like functions but are actually some different type, they wouldn't accept my argument. 
Instead, I wrote my own Newton method, which is like 5 lines of code, where I don't restrict the argument type. I think it would be very straightforward to make this a multivariate Newton method.

function myNewton(f, j, x)
    for n = 1:100
        fx, jx = f(x), j(x)
        abs(fx) < 1e-6 && return x
        d = fx/jx
        x = x - d
    end
    println("Too many iterations")
    return NaN
end

3. The useful thing here in 0.4 is that you can edit the parameters of the anonymous function. The parameters are encoded in a custom type state::State, and I update the state. Then I call my nonlinear equation solver:

UF.fhoursFOC.state, UF.fJACOBhoursFOC.state = state, state
f = UF.fhoursFOC
j = UF.fJACOBhoursFOC
hours = myNewton(f, j, hoursguess)

This runs much faster than my old version, which used NLsolve, which itself ran faster than a version using Roots.fzero.
[julia-users] Re: Julia-lang TCO / femto-lisp TCO
The point was doing it using recursion not iteration. When the data structures themselves are recursive, a recursive algorithm can be a lot simpler/shorter/more elegant than iteration. That's kind of the whole point of functional programming. On Tuesday, July 7, 2015 at 9:28:01 PM UTC+2, Steven G. Johnson wrote: On Tuesday, July 7, 2015 at 11:11:19 AM UTC-4, Steven Sagaert wrote: see http://blog.zachallaun.com/post/jumping-julia to work around not having TCO and still use recursion to traverse LARGE data structures without stackoverflow. That's also how a bunch of other languages (e.g. Scala F#) do this (called trampolining). (You could also just use a loop.)
[julia-users] Did something change ~8 days ago with type parameterization in 0.4?
I have the following code: type MinCutVisitor{T} <: AbstractMASVisitor graph::SimpleGraph parities::AbstractArray{Bool,1} colormap::Vector{Int} bestweight::T cutweight::T visited::Integer distmx::AbstractArray{T, 2} vertices::Vector{Int} end function MinCutVisitor{T}(graph::SimpleGraph, distmx::AbstractArray{T, 2}) n = nv(graph) MinCutVisitor( graph, falses(n), zeros(Int,n), typemax(T), zero(T), zero(Int), distmx, @compat(Vector{Int}()) ) end and up until June 30th the outer constructor was working in 0.4: julia> g = Graph(8) {8, 0} undirected graph julia> LightGraphs.MinCutVisitor(g,spzeros(Float64,8,8)) LightGraphs.MinCutVisitor{Float64}({8, 0} undirected graph,Bool[false,false,false,false,false,false,false,false],[0,0,0,0,0,0,0,0],Inf,0.0,0,8x8 sparse matrix with 0 Float64 entries:,Int64[]) Now it fails on Float64 but not with Int: julia> LightGraphs.MinCutVisitor(g,spzeros(Float64,8,8)) ERROR: MethodError: `convert` has no method matching convert(::Type{LightGraphs.MinCutVisitor{T}}, ::LightGraphs.Graph, ::BitArray{1}, ::Array{Int64,1}, ::Float64, ::Float64, ::Int64, ::Base.SparseMatrix.SparseMatrixCSC{Float64,Int64}, ::Array{Int64,1}) This may have arisen from a call to the constructor LightGraphs.MinCutVisitor{T}(...), since type constructors fall back to convert methods. Closest candidates are: LightGraphs.MinCutVisitor{T}(::Union{LightGraphs.Graph,LightGraphs.DiGraph}, ::AbstractArray{Bool,1}, ::Array{Int64,1}, ::T, ::T, ::Integer, ::AbstractArray{T,2}, ::Array{Int64,1}) LightGraphs.MinCutVisitor{T}(::Union{LightGraphs.Graph,LightGraphs.DiGraph}, ::AbstractArray{T,2}) call{T}(::Type{T}, ::Any) ... 
in call at /Users/seth/.julia/v0.4/LightGraphs/src/maxadjvisit.jl:99 julia> LightGraphs.MinCutVisitor(g,spzeros(Int,8,8)) LightGraphs.MinCutVisitor{Int64}({8, 0} undirected graph,Bool[false,false,false,false,false,false,false,false],[0,0,0,0,0,0,0,0],9223372036854775807,0,0,8x8 sparse matrix with 0 Int64 entries:,Int64[]) However, if I remove the type parameterization from the outer constructor: function MinCutVisitor(graph::SimpleGraph, distmx::AbstractArray) n = nv(graph) MinCutVisitor( graph, falses(n), zeros(Int,n), typemax(T), zero(T), zero(Int), distmx, @compat(Vector{Int}()) ) end it works. Can someone tell me why this is?
[julia-users] Re: multifile compressed archives within julia?
What about the ZipFile package? https://zipfilejl.readthedocs.org/en/latest/ On Sunday, July 5, 2015 at 04:26:46 UTC+2, Jeffrey Sarnoff wrote: After reviewing prior relevant topics, it is unclear whether there is a recommended way to work from within julia with multiple files compressed as one. I have ~1500 very small files (text now, could be .jld) that are best combined into 2-4 compressed aggregates. Usually, I need to retrieve just a few. Please let me know if there is currently a best way (this is for version 0.4), whether the shell is better, and whether there are any programs that would work on all platforms?
[julia-users] ANN: SingularIntegralEquations.jl release v0.0.1
This is to announce the v0.0.1 release of the new package SingularIntegralEquations.jl https://github.com/ApproxFun/SingularIntegralEquations.jl Built on top of the fast linear algebra for function approximation in ApproxFun, this new package solves problems in acoustic scattering (Helmholtz and gravity Helmholtz equations), potential theory (Laplace equation), fracture mechanics, and Riemann--Hilbert problems. There is a preprint available on the package's readme for algorithmic details. Joint work with Sheehan Olver.
[julia-users] caveat: Givens does not transpose
In [1]: Z = givens(1.0,2.0,1,3,3)
        Z, Z', transpose(Z)
Out[1]:
( 3x3 Givens{Float64}:
   0.447214  0.0  0.894427
   0.0       1.0  0.0
  -0.894427  0.0  0.447214,
  3x3 Givens{Float64}:
   0.447214  0.0  0.894427
   0.0       1.0  0.0
  -0.894427  0.0  0.447214,
  3x3 Givens{Float64}:
   0.447214  0.0  0.894427
   0.0       1.0  0.0
  -0.894427  0.0  0.447214)
Re: [julia-users] how to determine if a particular method signature defined
On Tuesday, July 7, 2015 at 02:29 -0700, Simon Byrne wrote: Thanks everyone. A bit more context: I'm trying to implement callbacks in RCall.jl. There are multiple ways R objects could be converted to Julia ones (as singletons, vectors, Nullable singletons/vectors, dictionaries, etc.), so what I was thinking was: 1) pass the R objects (known as SEXPRECs) straight to the Julia method 2) the first time a Julia function is converted to an R object (via the sexp function), I wanted to define an initial generic method which would take SEXPRECs and perform a default conversion. function sexp(f::Function) if method_defined(f,(VarArgs{SEXPREC},)) global f f(x::SEXPREC...) y = rcopy(...) # do default conversions f(y...) end end return callback(f) # construct R object for callback end In that way if a user wanted a different conversion, they could define their own method(s) to do this. But I don't want to overwrite this if it already exists. Any ideas? Interesting design issue! I think another solution could work, but I'm not sure it's really better. Instead of checking whether method_defined(f,(Vararg{SEXPREC},)) you could always define a fallback wrapper function like this: function f(x::AbstractSEXPREC...) y = rcopy(x...) # do default conversions f(y...) end AbstractSEXPREC would just be an abstract type from which SEXPREC would inherit, so that the fallback would not be called when a more specific f(x::SEXPREC...) method exists. Then, you would call f() on the VarArgs{SEXPREC}, and Julia would take care of calling either the user-defined function or your fallback, based on the standard dispatching rules. As I said, your solution might be equally good. My two cents -Simon On Monday, 6 July 2015 19:09:43 UTC+1, Mauro wrote: If I understand correctly what you want, then have a look at Traits.jl where I check method signatures of trait-definitions against methods of a generic function. 
See the loop at: https://github.com/mauro3/Traits.jl/blob/master/src/Traits.jl#L158 and in particular isfitting. Turns out that this is a relatively hard problem (unless I made a mess of it), in particular once parametric methods are involved. However, your problem might be a bit easier than what Traits does, as it tests for equality. Let me know if you have questions. On Mon, 2015-07-06 at 19:25, Simon Byrne simon...@gmail.com wrote: If I have a generic method foo, is there a way I can tell if a particular signature has been defined? Note that I don't want method_exists (which simply determines if something can be dispatched), I want to determine if a particular definition has been made, e.g. if foo(x) = x then I want method_defined(foo,(Int,)) == false method_defined(foo,(Any,)) == true
Re: [julia-users] how to determine if a particular method signature defined
Thanks everyone. A bit more context: I'm trying to implement callbacks in RCall.jl. There are multiple ways R objects could be converted to Julia ones (as singletons, vectors, Nullable singletons/vectors, dictionaries, etc.), so what I was thinking was: 1) pass the R objects (known as SEXPRECs) straight to the Julia method 2) the first time a Julia function is converted to an R object (via the sexp function), I wanted to define an initial generic method which would take SEXPRECs and perform a default conversion. function sexp(f::Function) if method_defined(f,(VarArgs{SEXPREC},)) global f f(x::SEXPREC...) y = rcopy(...) # do default conversions f(y...) end end return callback(f) # construct R object for callback end In that way if a user wanted a different conversion, they could define their own method(s) to do this. But I don't want to overwrite this if it already exists. Any ideas? -Simon On Monday, 6 July 2015 19:09:43 UTC+1, Mauro wrote: If I understand correctly what you want, then have a look at Traits.jl where I check method signatures of trait-definitions against methods of a generic function. See the loop at: https://github.com/mauro3/Traits.jl/blob/master/src/Traits.jl#L158 and in particular isfitting. Turns out that this is a relatively hard problem (unless I made a mess of it), in particular once parametric methods are involved. However, your problem might be a bit easier than what Traits does, as it tests for equality. Let me know if you have questions. On Mon, 2015-07-06 at 19:25, Simon Byrne simon...@gmail.com wrote: If I have a generic method foo, is there a way I can tell if a particular signature has been defined? Note that I don't want method_exists (which simply determines if something can be dispatched), I want to determine if a particular definition has been made, e.g. if foo(x) = x then I want method_defined(foo,(Int,)) == false method_defined(foo,(Any,)) == true
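For what it's worth, on current Julia a rough version of the method_defined check Simon describes can be written by walking the method table and comparing exact signatures (a sketch of the idea, not RCall's actual code; on 0.4 the signature representation differed):

```julia
# Check whether a method with *exactly* this argument-type signature has
# been defined, as opposed to method_exists/hasmethod, which also match
# via subtyping. Each Method's `sig` field is Tuple{typeof(f), argtypes...}.
method_defined(f, sig::Tuple) =
    any(m -> m.sig == Tuple{typeof(f), sig...}, methods(f))

foo(x) = x

method_defined(foo, (Any,))  # true: foo(x) means foo(x::Any)
method_defined(foo, (Int,))  # false: no method defined for exactly Int
```

This only handles non-parametric methods; as Mauro notes above, signatures with type parameters make the comparison considerably harder.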
Re: [julia-users] Re: Help in understanding Julia's ways
This is what most Julia code looks like. The methods are the interface and thus what is exposed publicly. It is pretty natural to program this way when doing generic programming and taking multiple dispatch into account. I agree that this could be better documented somewhere, but the solution is not to bring more complexity into the language. This has also been discussed several times on the mailing list, so I would be very surprised if introducing warnings for field access gained any acceptance among the core Julia developers. On Tuesday, July 7, 2015 at 10:54:05 UTC+2, Pontus Stenetorp wrote: On 7 July 2015 at 04:12, Ismael VC ismael...@gmail.com wrote: Couldn't we just get a warning and let people shoot themselves in their foot if that's what they want? Something like: Warning: Using private method/type in module Foo at foo.jl:n. I like that you are trying to find some middle ground. But if I am doing this intentionally the warning will be a constant annoyance, so, how should I silence it? Yet another command line option so that we approach gcc/clang with different levels of warnings? Yuck... Annotations (@silence?) like Java? Double yuck... Giving users this level of power is something that I am happy with. Last week it allowed me to temporarily add a function looking deep into a composite type from a library when debugging. Sure, it can be abused, but it can equally well be used properly when necessary. Sure, it may break, but ultimately it is always up to the caller not to violate the API. Sure, sometimes base violates this idiom, but this is most likely due to historic reasons and a lack of consensus more than anything else. In the future, given enough experience and evidence to the contrary, I would be happy to reconsider my position. But for now, using idioms like the one below is how I write my Julia code. 
export Weights, W, b, fanin, fanout

immutable Weights{T<:FloatingPoint}
    W::Matrix{T}
    b::Vector{T}
end

Weights(fanin, fanout) = Weights(rand(fanout, fanin)./1024, zeros(fanout))

W(w) = w.W
b(w) = w.b
fanin(w) = size(W(w), 2)
fanout(w) = size(W(w), 1)

Apologies for the terribly short variable names, it is an example after all. Pontus
[julia-users] Re: Deserializing a serialized datastructure causes an end of file error.
that doesn't sound right. likely either a bug in your code, your environment, or julia. can you make a small example program with the same error? On Tuesday, 7 July 2015 00:22:04 UTC-3, Gene Sher wrote: Hello Julia language community, I've developed a Neural Network based classifier using Julia, and to store the entire classifier in a file I used the serialize function. The problem I am now facing is that I cannot recover my classifiers, due to the deserialize function giving me an end-of-file error. Has anyone come across this type of problem? Is there any way to recover my serialized neural-network datastructures? I've read that one version of Julia will not deserialize a file that was serialized by another version of Julia. But I've tried different versions to deserialize the files, one of which was the one I used to serialize the data structure. But it just causes the end-of-file error. Best regards, -Gene
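Hard to say more without seeing the code, but one common cause of an EOF on deserialize is a file that was never flushed or closed after serializing. A minimal round trip that rules that out looks like this (written against the current Serialization stdlib; on 0.3, serialize/deserialize live in Base and need no import, and the type names here are just a toy stand-in for a network datastructure):

```julia
using Serialization

# Toy stand-in for a neural-network datastructure.
struct Net
    layers::Vector{Matrix{Float64}}
end

net = Net([rand(3, 3) for _ in 1:2])
path = tempname()

# The do-block form guarantees the stream is flushed and closed, so the
# file on disk is complete before anything tries to read it back.
open(path, "w") do io
    serialize(io, net)
end

restored = open(deserialize, path)
```

Note also the caveat already mentioned in the thread: the serialization format is not stable across Julia versions, so files should be read back by the version that wrote them (JLD/HDF5 is the usual answer when long-term storage matters).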
Re: [julia-users] Performance of SharedArrays in a parallel loop
Without having spent the time to look carefully, one obvious point is that you have some serious type-instability in your SharedArray function. If you're running 0.4, try using @code_warntype and tweak your implementation until it is type stable. --Tim On Tuesday, July 07, 2015 08:19:58 AM Eduardo Lenz wrote: Hi. I was studying parallel methods with some silly routines. After some tries, I realized that reading Shared Arrays is slower than reading regular arrays, although I cannot understand the reason. I am using the latest version in Nightly. As an example, routine P2 takes more time than routine P1, and the only difference is the fact that A is Shared. Is it expected ? https://gist.github.com/CodeLenz/776c138c2c49dbd4a2d5 Thanks for your help.
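I haven't profiled the gist, but the kind of instability Tim means, and the usual "function barrier" fix, can be sketched like this (toy types of my own, not the code from the gist — the same pattern applies whether the loosely-typed value is a SharedArray, an untyped field, or a global):

```julia
# `A` is stored without a concrete type, so the compiler cannot infer the
# element type at the loop site: every iteration pays for dynamic dispatch.
struct Container
    A   # untyped field -- the source of the instability
end

function sum_direct(c::Container)
    s = 0.0
    for x in c.A        # type-unstable access
        s += x
    end
    return s
end

# Function barrier: pull c.A out once and hand it to an inner function.
# Inside `kernel` the array's concrete type is known, so the loop
# compiles to tight, specialized code.
function kernel(A::AbstractArray)
    s = zero(eltype(A))
    for x in A
        s += x
    end
    return s
end
sum_barrier(c::Container) = kernel(c.A)
```

@code_warntype on sum_direct will flag the Any-typed values, while kernel should come back clean; that is the tweak-until-stable loop Tim describes.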
[julia-users] Redirection STDERR to DevNull
I was attempting to redirect STDERR temporarily to DevNull, i.e.

olderr = STDERR
try
    redirect_stderr(DevNull)
    # ... code that outputs warning messages ...
finally
    redirect_stderr(olderr)
end

however, I got the following error: * deprecated exception on 1: ERROR: LoadError: type DevNullStream has no field handle in redirect_stderr at stream.jl:1045 (this was on a build of 0.4, made tonight). What *should* I be doing to stop the warnings from being displayed? Thanks, Scott
Re: [julia-users] Redirection STDERR to DevNull
related: https://github.com/JuliaLang/julia/issues/12050
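Until that issue is fixed, one possible workaround (a sketch using the 0.4-era names from this thread, not tested against that exact build) is to redirect into a pipe rather than DevNull, which sidesteps the DevNullStream-has-no-handle error:

```julia
# Sketch: capture STDERR in an in-memory pipe instead of DevNull, then
# restore the original stream. Uses 0.4-era names (STDERR, DevNull); on
# later Julia these are lowercase stderr/devnull.
olderr = STDERR
rd, wr = redirect_stderr()   # no-arg form: redirect into a fresh pipe
try
    # ... code that outputs warning messages ...
finally
    redirect_stderr(olderr)  # put the real stream back
    close(wr)
    close(rd)
end
```

The warnings end up in the (discarded) pipe instead of on screen; if the silenced code is very chatty, the pipe should be drained to avoid it filling up.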
Re: [julia-users] Re: Did something change ~8 days ago with type parameterization in 0.4?
Not sure what changed, so it might be worth a bug report/doc update, but can you try this?

function MinCutVisitor{T}(graph::SimpleGraph, distmx::AbstractArray{T, 2})
    n = nv(graph)
    MinCutVisitor{T}(
        graph,
        falses(n),
        zeros(Int,n),
        typemax(T),
        zero(T),
        zero(Int),
        distmx,
        @compat(Vector{Int}())
    )
end

I.e. add `{T}` to the call to the default constructor.
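A stripped-down reproduction of that pattern (toy type names of my own, in current Julia syntax) shows why spelling the parameter out helps: with an explicit {T}, the call goes straight to the default constructor instead of relying on dispatch, which is what was falling back to convert above.

```julia
# Toy version of the MinCutVisitor situation: an outer convenience
# constructor that derives the type parameter from an array argument.
struct Accum{T}
    best::T
    distmx::AbstractArray{T,2}
end

# Writing Accum{T}(...) -- parameter spelled out -- invokes the default
# constructor directly rather than leaving the parameter to inference.
make_accum(distmx::AbstractArray{T,2}) where {T} = Accum{T}(typemax(T), distmx)

make_accum(zeros(Float64, 2, 2)).best  # Inf
make_accum(zeros(Int, 2, 2)).best      # typemax(Int)
```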
Re: [julia-users] Re: Help in understanding Julia's ways
Ok, I have to say that I am not that sure about the immutable thing as I am for the regular type. For me an immutable is just like, say, an Int32, which also does not try to hide that it has 32 bits. Thus I do not see why an ARGB immutable that is bitwise represented by 4 packed bytes should hide its structure. But again, I am not 100% certain about this. The issues you describe where packages use internals are real, but shouldn't the consequence be to simply fix them when found? It's absolutely normal that packages break when the standard library (base) moves on. I would say this is part of a regular development workflow and not any form of a serious issue. The Julia package landscape is healthy and IMHO in large parts of high quality. On Tuesday, July 7, 2015 at 15:33:00 UTC+2, Scott Jones wrote: I don't know, I think all of you have missed my point. I am *not* saying to change anything at all in the behavior of current Julia programs, only ADDING the capability to keep methods, types, or fields which are not intended for public consumption hidden. Being able to do this is very important to make any sorts of guarantees about a program. I've already seen very many cases where things will be broken in the future, because people are accessing internal fields and methods instead of calling exported methods. I had to do a PR specifically to fix a case where JSON.jl was using an internal function of utf16.jl just this week. Many times, there are methods that are needed to implement some functionality, but which depend on certain things being set up correctly, which you do not want to expose publicly, that are not part of the API, or which you plan on totally changing in the future, for example. The only way to make sure that nobody is using those is to have the language support marking those as private in some fashion, not just some convention that, on the whole, doesn't seem to be followed. @tknopp You said that for immutables, the fields were the interface. 
Please explain that to me, because although Julia conflates a number of different things into the idea of immutable, I'd never heard that, and don't really see that it could be true. (Julia conflates the abstract idea of something not being settable, with the implementation detail of storing bitstypes directly in an immutable object, without boxing). I can think easily think of cases where you want a read-only object, where none of the properties were visible (all access via methods). That doesn't change just because Julia can dispatch on multiple arguments, instead of just the (hidden) self/this argument. @ninjin (Pontus) You said Sure, it may break, but ultimately it is always up to the caller not to violate the API I totally disagree with that. A language should *help* people be able to write correct code. I'm not saying that Julia should be like CLU, where you simply couldn't access the implementation from outside the cluster (module), but simply that the level of access should be controllable by the author of the module/package. String handling in julia is a great case for adding the ability to hide the implementation details, and only expose the methods you really want for the API. Currently there are cases of .data throughout base and packages, and the fact that UTF16String and UTF32String currently require a trailing 0 when being constructed is also exposed. Many people have said that Julia's philosophy is one of consenting adults, but I don't see that at all. To have consenting adults, you have to have consent on both sides. Julia makes it such that it is impossible to prevent any stranger from walking up and fiddling with your private parts (and maybe giving you a virus in the form of corrupted data to boot!). 
Julian is right about Julia suffering from https://en.wikipedia.org/wiki/Object_orgy although I don't think that member functions are necessary at all to solve this (just a private keyword, that keeps things visible only within the module (and submodules) On Tuesday, July 7, 2015 at 7:15:23 AM UTC-4, Tobias Knopp wrote: This is how most Julia code looks like. The methods are the interface and thus which is exposed in public. It is pretty natural to program this way when doing generic programming and taking multiple dispatch into account. I agree that this could be better documented somewhere but the solution is not to bring more complexity into the language. This has also been discussed several times on the mailing list, so I would be very surprised that introducing warnings for field access would gain any acceptance among the core Julia developers. Am Dienstag, 7. Juli 2015 10:54:05 UTC+2 schrieb Pontus Stenetorp: On 7 July 2015 at 04:12, Ismael VC ismael...@gmail.com wrote: Couldn't we just get a warning and let people shoot themselves in their foot if that's what they want? Something like:
Re: [julia-users] Re: questions about coroutines
Tasks are lightweight, not at all like threads. You can literally have thousands of them without any issues. Multi-threaded Julia is a work in progress. Currently each Julia process has a single thread of execution, and tasks switch whenever there is I/O involved or on an explicit yield(). libuv provides the underlying event-driven I/O, timers, etc. On Tue, Jul 7, 2015 at 6:02 PM, andrew cooke and...@acooke.org wrote: it seems to me that coroutines (Tasks) in julia are very much intended for heavyweight multithread use. but lazy streams are really useful in general (they transformed how i use python), and even full-on coroutines can be useful in some single-threaded applications. so i wonder if there's a need for a lighter-weight implementation? maybe something already exists - it's not clear to me this needs to be part of the base language. andrew
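The produce/consume pattern under discussion can be sketched in a few lines (Julia 0.4-era API, matching the thread; produce and consume were later superseded by Channels, so treat this as a sketch of the old interface):

```julia
# A lazy stream of Fibonacci numbers backed by a Task.
# produce() suspends the task until a consumer asks for the next value,
# so nothing past the requested element is ever computed.
function fib_producer()
    a, b = 0, 1
    while true
        produce(a)
        a, b = b, a + b
    end
end

t = Task(fib_producer)
first5 = [consume(t) for i in 1:5]   # pulls 0, 1, 1, 2, 3 on demand
```

As noted above, each consume is a full task switch, which is where the overhead (and the type instability of the produced values) comes from.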
Re: [julia-users] how to determine if a particular method signature defined
On 7 Jul 2015, at 13:13, Milan Bouchet-Valat nalimi...@club.fr wrote: I think another solution could work, but I'm not sure it's really better. Instead of checking whether method_defined(f, (Vararg{SEXPREC},)), you could always define a fallback wrapper function like this:

function f(x::AbstractSEXPREC...)
    y = rcopy(x...)  # do default conversions
    f(y...)
end

AbstractSEXPREC would just be an abstract type from which SEXPREC would inherit, so that the fallback would not be called when a more specific f(x::SEXPREC...) method exists. Then, you would call f() on the Vararg{SEXPREC}, and Julia would take care of calling either the user-defined function or your fallback, based on the standard dispatch rules. Sorry, I should have said: SEXPREC is the abstract type (the subtypes are VecSxp, RealSxp, IntSxp, etc.), so this is essentially what I'm doing. However, I wanted the fallback wrapper to be defined automatically the first time the method is used in this manner.
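The dispatch mechanism Milan describes can be seen in miniature with stand-in types (AbstractSexp, Sexp, and this rcopy are all made up for illustration, so as not to conflict with RCall's real hierarchy):

```julia
# Julia 0.4-era syntax (`abstract`/`type` rather than `abstract type`/`mutable struct`).
abstract AbstractSexp
type Sexp <: AbstractSexp
    val::Int
end
rcopy(x::Sexp) = x.val              # dummy "default conversion"

# Fallback: only chosen when no more specific f(::Sexp...) method exists.
f(x::AbstractSexp...) = f(map(rcopy, x)...)
f(y::Int...) = sum(y)               # the "user-defined" method

f(Sexp(1), Sexp(2))                 # routes through the fallback, returns 3
```

If a user later defines f(x::Sexp...) directly, it is more specific than the fallback and wins under the standard dispatch rules, which is exactly the behavior wanted here.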
Re: [julia-users] Re: Help in understanding Julia's ways
I don't know, I think all of you have missed my point. I am *not* saying to change anything at all in the behavior of current Julia programs, only ADDING the capability to keep methods, types, or fields which are not intended for public consumption hidden. Being able to do this is very important to make any sorts of guarantees about a program. I've already seen very many cases where things will be broken in the future, because people are accessing internal fields and methods instead of calling exported methods. I had to do a PR specifically to fix a case where JSON.jl was using an internal function of utf16.jl just this week. Many times, there are methods that are needed to implement some functionality, but which depend on certain things being set up correctly, which you do not want to expose publicly, that are not part of the API, or which you plan on totally changing in the future, for example. The only way to make sure that nobody is using those is to have the language support marking those as private in some fashion, not just some convention that, on the whole, doesn't seem to be followed. @tknopp You said that for immutables, the fields were the interface. Please explain that to me, because although Julia conflates a number of different things into the idea of immutable, I'd never heard that, and don't really see that it could be true. (Julia conflates the abstract idea of something not being settable with the implementation detail of storing bitstypes directly in an immutable object, without boxing.) I can easily think of cases where you want a read-only object, where none of the properties are visible (all access via methods). That doesn't change just because Julia can dispatch on multiple arguments, instead of just the (hidden) self/this argument. @ninjin (Pontus) You said: Sure, it may break, but ultimately it is always up to the caller not to violate the API I totally disagree with that.
A language should *help* people be able to write correct code. I'm not saying that Julia should be like CLU, where you simply couldn't access the implementation from outside the cluster (module), but simply that the level of access should be controllable by the author of the module/package. String handling in Julia is a great case for adding the ability to hide the implementation details and only expose the methods you really want for the API. Currently there are uses of .data throughout Base and packages, and the fact that UTF16String and UTF32String currently require a trailing 0 when being constructed is also exposed. Many people have said that Julia's philosophy is one of consenting adults, but I don't see that at all. To have consenting adults, you have to have consent on both sides. Julia makes it such that it is impossible to prevent any stranger from walking up and fiddling with your private parts (and maybe giving you a virus in the form of corrupted data to boot!). Julian is right about Julia suffering from https://en.wikipedia.org/wiki/Object_orgy although I don't think that member functions are necessary at all to solve this (just a private keyword that keeps things visible only within the module and submodules). On Tuesday, July 7, 2015 at 7:15:23 AM UTC-4, Tobias Knopp wrote: This is what most Julia code looks like. The methods are the interface and thus what is exposed publicly. It is pretty natural to program this way when doing generic programming and taking multiple dispatch into account. I agree that this could be better documented somewhere, but the solution is not to bring more complexity into the language. This has also been discussed several times on the mailing list, so I would be very surprised if introducing warnings for field access would gain any acceptance among the core Julia developers. On Tuesday, 7 July 2015 at 10:54:05 UTC+2, Pontus Stenetorp wrote: On 7 July 2015 at 04:12, Ismael VC ismael...@gmail.com wrote: Couldn't we just get a warning and let people shoot themselves in the foot if that's what they want? Something like: Warning: Using private method/type in module Foo at foo.jl:n. I like that you are trying to find some middle ground. But if I am doing this intentionally, the warning will be a constant annoyance, so how should I silence it? Yet another command-line option, so that we approach gcc/clang with different levels of warnings? Yuck... Annotations (@silence?) like Java? Double yuck... Giving users this level of power is something that I am happy with. Last week it allowed me to temporarily add a function looking deep into a composite type from a library when debugging. Sure, it can be abused, but it can equally well be used properly when necessary. Sure, it may break, but ultimately it is always up to the caller not to violate the API. Sure, sometimes Base violates this idiom, but this is most likely due to historic reasons.
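For contrast, the only "hiding" current Julia offers is the export list, which controls what `using` brings into scope but does not prevent qualified access (a toy sketch; the module and names are made up):

```julia
module Wallet
export balance                  # the intended public API

type Account
    _coins::Int                 # underscore is convention, not enforcement
end
balance(a::Account) = a._coins

end # module

using Wallet
a = Wallet.Account(10)
balance(a)                      # the supported route
a._coins = -5                   # nothing stops this: no consent required
```

This is the "consenting adults" status quo being debated: the underscore and the export list signal intent, but any caller can still reach in.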
[julia-users] Re: Solving nonlinear equations quickly using FastAnonymous @anon and Julia 0.4
Just checked. So, Roots.fzero(f, guess) does work. However, Roots.fzero(f, j, guess) doesn't work, and neither does Roots.newton(f, j, guess). I looked at the Roots.jl source and I see ::Function annotations on the methods with the Jacobian, but not the regular one. On Tuesday, July 7, 2015 at 4:22:17 PM UTC-4, j verzani wrote: It isn't your first choice, but `Roots.fzero` can have `@anon` functions passed to it, unless I forgot to tag a new version after making that change on master not so long ago. On Tuesday, July 7, 2015 at 2:29:51 PM UTC-4, Andrew wrote: I'm writing this in case other people are trying to do the same thing I've done, and also to see if anyone has any suggestions. Recently I have been writing some code that requires solving lots (tens of thousands) of simple nonlinear equations. The application is economics: I am solving an intratemporal first-order condition for optimal labor supply, given the state and a savings decision. This requires solving the same equation many times, but with different parameters. As far as I know, the standard ways to do this are to either define a nested function, which by the lexical scoping rules inherits the parameters of the outer function, or use an anonymous function. Both these methods are slow right now because Julia can't inline those functions. However, the FastAnonymous package lets you define an anonymous function which behaves exactly like a function but isn't of type ::Function, and which is fast. Crucially for me, in Julia 0.4 you can modify the parameters of the function you get out of FastAnonymous. I rewrote some code I had which depended on solving a lot of nonlinear equations, and it's now 3 times as fast, running in 2s instead of 6s. Here I'll describe a simplified version of my setup and point out a few issues. 1. I store the anonymous function in a type that I will pass along to the function which needs to solve the nonlinear equation.
I use a parametric type here since the type of an anonymous function seems to vary with every instance. For example:

typeof(UF.fhoursFOC)
FastAnonymous.##Closure#11431{Ptr{Void} @0x7f2c2eb26e30,0x10e636ff02d85766,(:h,)}

To construct the type:

immutable CRRA_labor{T1, T2} <: LaborChoice  # <: means subtype of
    sigmac::Float64
    sigmal::Float64
    psi::Float64
    hoursmax::Float64
    state::State
    # Encodes info on how to solve itself
    fhoursFOC::T1
    fJACOBhoursFOC::T2
end

To set up the anonymous functions fhoursFOC and fJACOBhoursFOC (the Jacobian), I define a constructor:

function CRRA_labor(sigmac, sigmal, psi, hoursmax, state)
    fhoursFOC = @anon h -> hoursFOC(CRRA_labor(sigmac, sigmal, psi, hoursmax, state, 0., 0.), h, state)
    fJACOBhoursFOC = @anon jh -> JACOBhoursFOC(CRRA_labor(sigmac, sigmal, psi, hoursmax, state, 0., 0.), jh, state)
    CRRA_labor(sigmac, sigmal, psi, hoursmax, state, fhoursFOC, fJACOBhoursFOC)
end

This looks a bit complicated because the nonlinear equation I need to solve, hoursFOC, relies on the type CRRA_labor, as well as some aggregate and idiosyncratic state info, to set up the problem. To encode this information, I define a dummy instance of CRRA_labor, where I supply 0's in place of the anonymous functions. I tried to make a self-referential type here as described in the documentation, but I couldn't get it to work, so I went with the dummy instance instead. @anon sets up the anonymous function. This means that code like fhoursFOC(0.5) will return a value. 2. Now that I have my anonymous function taking only 1 variable, I can use the nonlinear equation solver. Unfortunately, the existing nonlinear equation solvers like Roots.fzero and NLsolve require the argument to be of type ::Function. Since anonymous functions work like functions but are actually some different type, they wouldn't accept my argument. Instead, I wrote my own Newton method, which is like 5 lines of code, where I don't restrict the argument type.
I think it would be very straightforward to make this a multivariate Newton method.

function myNewton(f, j, x)
    for n = 1:100
        fx, jx = f(x), j(x)
        abs(fx) < 1e-6 && return x
        d = fx/jx
        x = x - d
    end
    println("Too many iterations")
    return NaN
end

3. The useful thing here in 0.4 is that you can edit the parameters of the anonymous function. The parameters are encoded in a custom type state::State, and I update the state. Then I call my nonlinear equation solver:

UF.fhoursFOC.state, UF.fJACOBhoursFOC.state = state, state
f = UF.fhoursFOC
j = UF.fJACOBhoursFOC
hours = myNewton(f, j, hoursguess)

This runs much faster than my old version which used NLsolve, which itself ran faster than a version using Roots.fzero. Issues: 1. Since the type of the anonymous function isn't ::Function, I had to write my own solver. I'm pretty sure a 1-line edit to
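For what it's worth, the multivariate version alluded to above is nearly the same code: f returns a residual vector, j returns the Jacobian matrix, and the scalar division becomes a linear solve (a sketch only, not tested against the original setup):

```julia
function myNewtonN(f, j, x)
    for n = 1:100
        fx = f(x)                    # residual vector
        norm(fx) < 1e-6 && return x
        x = x - j(x) \ fx            # backslash: solve J*d = fx, then step
    end
    println("Too many iterations")
    return fill(NaN, length(x))
end
```

The argument types are again left unrestricted, so @anon closures pass through just as in the scalar version.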
Re: [julia-users] Re: Help in understanding Julia's ways
I've seen what happens both in the internal code of a language and in customer usage over 3 decades, and I know first-hand the issues that come up when you don't have the *option* of making things private. (Having the distinction between public and private in the language was something we had to add after years of problems.) Nothing about my proposal would *force* you to use it; it would not break *any* current Julia code. Also, I don't think it is at all good that people consider it normal that things break constantly, especially when there are simple ways that a lot of the pain could be avoided. On Tuesday, July 7, 2015 at 10:11:46 AM UTC-4, Tobias Knopp wrote: Ok, I have to say that I am not that sure about the immutable thingy as for the regular type. For me, an immutable is just like, say, an Int32, which also does not try to hide that it has 32 bits. Thus I do not see why an ARGB immutable that is bitwise represented by 4 packed bytes should hide its structure. But again, I am not 100% certain about this. The issues you describe, where packages use internals, are real, but shouldn't the consequence be to simply fix them when found? It's absolutely normal that packages break when the standard library (Base) moves on. I would say this is part of a regular development workflow and not any form of serious issue. The Julia package landscape is healthy and IMHO in large parts of high quality. On Tuesday, 7 July 2015 at 15:33:00 UTC+2, Scott Jones wrote: [quoted text elided]
Re: [julia-users] Re: Help in understanding Julia's ways
On Tuesday, July 7, 2015 at 10:11:46 AM UTC-4, Tobias Knopp wrote: Ok, I have to say that I am not that sure about the immutable thingy as for the regular type. For me, an immutable is just like, say, an Int32, which also does not try to hide that it has 32 bits. Thus I do not see why an ARGB immutable that is bitwise represented by 4 packed bytes should hide its structure. But again, I am not 100% certain about this. So, in your case, it is fine to make everything public. However, what happens when you realize that a byte isn't enough, and you want a 64-bit quantity packed a different way? If you'd had accessor methods, getRed, getBlue, getGreen, etc., then you could merrily update your type, and everything would still work without breakage.
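The upgrade scenario can be made concrete (names like getRed come from the discussion above; both packed layouts are invented for illustration):

```julia
# Version 1: four packed bytes, 8 bits per channel (A,R,G,B from the top).
immutable ARGB                  # Julia 0.4 syntax; `struct` in later versions
    packed::UInt32
end
getRed(c::ARGB) = UInt8((c.packed >> 16) & 0xff)

# Version 2: a 64-bit quantity, 16 bits per channel. Callers written
# against getRed keep working; callers that read c.packed directly break.
immutable ARGB16                # hypothetical widened type
    packed::UInt64
end
getRed(c::ARGB16) = UInt16((c.packed >> 32) & 0xffff)
```

Whether the fields of such an immutable are themselves "the interface", as Tobias argues, or an implementation detail behind accessors, as Scott argues, is exactly the disagreement in this thread.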