[julia-users] Reactive signals across processes?
Any idea on how to implement Reactive signals across processes? I know there is the concept of Channels for interacting across different parallel processes; I wonder if there exists an implementation that uses Signals instead.
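One way to approximate this, sketched below: keep the Signal process-local and ferry values between processes through a RemoteChannel, with a local pump task pushing arriving values into the signal. To keep the sketch self-contained the "signal" is modeled as a plain Ref plus a listener callback; with Reactive.jl you would `push!(signal, v)` instead. `chan`, `pump`, and `listener` are illustrative names, not an existing API.

```julia
using Distributed

# Minimal sketch of a cross-process "signal": values are ferried through
# a RemoteChannel (usable from any process) and a local pump task pushes
# them into the process-local signal. The signal is modeled here as a
# Ref plus a listener callback; with Reactive.jl the pump would call
# push!(signal, v) instead. All names are illustrative.
chan = RemoteChannel(() -> Channel{Int}(32))

signal = Ref(0)
listener(v) = nothing          # react to updates here

pump = @async while true
    v = take!(chan)            # blocks until a value arrives
    signal[] = v
    listener(v)
end

put!(chan, 42)                 # could be called from any worker process
sleep(0.2)                     # give the pump task a chance to run
```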
[julia-users] Re: How would you implement setindices! elegantly?
julia> A = rand(4,4)
4×4 Array{Float64,2}:
 0.427998   0.720987  0.375013  0.432887
 0.0333443  0.602459  0.946685  0.817995
 0.402635   0.571399  0.553542  0.0234215
 0.707829   0.339795  0.451387  0.358248

julia> ind = [1 1; 2 2; 3 3]
3×2 Array{Int64,2}:
 1  1
 2  2
 3  3

julia> A[ind]
3×2 Array{Float64,2}:
 0.427998   0.427998
 0.0333443  0.0333443
 0.402635   0.402635

On Wednesday, October 26, 2016 at 6:19:45 PM UTC+3, Cedric St-Jean wrote:
>
> A[indices] = Values
>
> ?
>
> On Wednesday, October 26, 2016 at 9:53:17 AM UTC-4, Tsur Herman wrote:
>>
>> What would you suggest is a fast and elegant way to achieve indexing into
>> an array using a set of indices?
>>
>> function setindices!(A, Values, Indices)
>>     assert(length(Values) == size(Indices,1))
>>     for i = 1:length(Values)
>>         setindex!(A, Values[i], (Indices[i,:]...)...)
>>     end
>> end
>>
>> I am currently using this function and I was wondering whether I missed
>> something.
[julia-users] How would you implement setindices! elegantly?
What would you suggest is a fast and elegant way to achieve indexing into an array using a set of indices?

function setindices!(A, Values, Indices)
    assert(length(Values) == size(Indices,1))
    for i = 1:length(Values)
        setindex!(A, Values[i], (Indices[i,:]...)...)
    end
end

I am currently using this function and I was wondering whether I missed something.
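For comparison, a sketch of one idiomatic alternative in current Julia (not from the thread): convert each row of the index matrix to a CartesianIndex. It assumes, as in the post, that Indices is an N×d matrix with one subscript tuple per row.

```julia
# Sketch: setindices! via CartesianIndex, one idiomatic option in
# modern Julia. Each row of `indices` holds one subscript tuple.
function setindices!(A, values, indices)
    @assert length(values) == size(indices, 1)
    for i in 1:length(values)
        # Tuple(...) turns the row vector into a subscript tuple
        A[CartesianIndex(Tuple(indices[i, :]))] = values[i]
    end
    return A
end

A = zeros(3, 3)
setindices!(A, [1.0, 2.0, 3.0], [1 1; 2 2; 3 3])
# the diagonal of A is now 1.0, 2.0, 3.0
```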
[julia-users] Re: Julia and the Tower of Babel
I noticed this also, and it is why I chose to "rip" some packages for part of their functionality. From what I observed, the problem is the "coolness" of the language and the highly creative level of the package writers. Just as the first post here states, these two seeming advantages, a cool language and super-creative package writers, can sometimes have a "Tower of Babel" effect. I encountered this with respect to image processing, geometry primitive manipulation, etc. The problem is: too many types! If something can be represented as an array with some convention, for example an M×N array where M is the descriptor size and N is the number of descriptors, then it is better to use and support that than to declare more specialized types. At least for fast-paced research and idea validation it is better. For production implementations and performance, specialized types optimized for speed will probably be required.
Re: [julia-users] Current file name
Thanks!

On Tuesday, September 27, 2016 at 10:58:47 PM UTC+3, Yichao Yu wrote:
>
> On Tue, Sep 27, 2016 at 3:56 PM, Tsur Herman <tsur@gmail.com> wrote:
> > is there a way to know at run time which file the code that is
> > currently executing comes from?
>
> @__FILE__ gives you the current file name.
> Base.source_path() gives you the current toplevel file.
[julia-users] Current file name
Is there a way to know at run time which file the currently executing code comes from?
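A short script can demonstrate the `@__FILE__` approach from the reply, plus `functionloc`, a complementary reflection helper not mentioned in the thread:

```julia
# @__FILE__ expands at parse time to the path of the file being parsed
# (at the REPL it reports the REPL prompt instead of a real file).
println(@__FILE__)

# For a specific function, functionloc returns the (file, line) of its
# definition; shown here as a complementary reflection tool.
f() = nothing
file, line = functionloc(f, Tuple{})
println(file, ":", line)
```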
Re: [julia-users] Generators vs Comprehensions, Type-stability?
> We could use type inference on the function t -> t^2 (which is buried in the generator) to determine a more specific eltype.

We could declare:

eltype(G::Base.Generator) = Base.code_typed(G.f, (eltype(G.iter),))[1].rettype

The element type of a Generator G would then be the inferred return type of G.f with arguments of type eltype(G.iter). More generally, the element type of a function F with argument types args could be taken to be Base.code_typed(F, args)[1].rettype.

On Friday, September 23, 2016 at 11:14:35 AM UTC+3, Christoph Ortner wrote:
>
> Why would type inference for sum(t^2 for t in r) be different from [t^2
> for t in r]?
>
> On Friday, 23 September 2016 07:42:00 UTC+1, Michele Zaffalon wrote:
>>
>> On Fri, Sep 23, 2016 at 2:23 AM, Steven G. Johnson wrote:
>>>
>>> We could use type inference on the function t -> t^2 (which is buried in
>>> the generator) to determine a more specific eltype.
>>>
>>
>> Does this not require evaluating the function on all inputs, thereby
>> losing the advantage of having a generator?
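The same inference query can be tried today through Base.return_types, which wraps the lookup the post performs by hand. This is internal reflection, not an API guarantee: the result depends on the compiler version, so treat it as a sketch.

```julia
# Hedged sketch: querying inference for a generator's element type.
# Base.return_types is internal reflection; its results depend on the
# compiler and are not an API guarantee.
g = (t^2 for t in rand(10))

# A Base.Generator stores the mapped function in g.f and the underlying
# iterator in g.iter, as in the eltype definition proposed above.
rt = Base.return_types(g.f, Tuple{eltype(g.iter)})
println(rt)    # inference typically reports Float64 here
```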
Re: [julia-users] Generators vs Comprehensions, Type-stability?
I can see a point in what you say: the eltype of a function should be the return type of that function, if it can be inferred, because an array is just a special kind of function with a special notation.

On Friday, 23 September 2016, Steven G. Johnson <stevenj@gmail.com> wrote:
>
> On Thursday, September 22, 2016 at 6:10:29 PM UTC-4, Tsur Herman wrote:
>>
>> The real problem is that eltype(t^2 for t in rand(10)) returns Any.
>>
>> That is not a problem: (t^2 for t in rand(10)) is a generator; its element
>> type is Any, which means a pointer to something complex.
>>
> It is a problem, because it means that the result type of sum cannot be
> inferred.
>
> We could use type inference on the function t -> t^2 (which is buried in
> the generator) to determine a more specific eltype.
Re: [julia-users] Generators vs Comprehensions, Type-stability?
> The real problem is that eltype(t^2 for t in rand(10)) returns Any.

That is not a problem: (t^2 for t in rand(10)) is a generator; its element type is Any, which means a pointer to something complex.

On Friday, September 23, 2016 at 12:50:18 AM UTC+3, Steven G. Johnson wrote:
>
> I don't think the empty case should be the problem. If it can't infer the
> type, sum just throws an error. So test1(r) actually always returns the
> same type for r::Array{Float64} in any case where it returns a value at all.
>
> The real problem is that eltype(t^2 for t in rand(10)) returns Any.
Re: [julia-users] Generators vs Comprehensions, Type-stability?
By the way, my test3 function is super fast:

@time test3(r)
  0.32 seconds (4 allocations: 160 bytes)

On Friday, September 23, 2016 at 12:48:50 AM UTC+3, Tsur Herman wrote:
>
> On my side both functions perform equally, although test2 had to be timed
> twice to get the same performance.
>
> julia> test2(x) = sum( [t^2 for t in x] )
>
> julia> @time test2(r)
>   0.017423 seconds (13.22 k allocations: 1.339 MB)
>
> julia> @time test2(r)
>   0.000332 seconds (9 allocations: 781.531 KB)
>
> I think the discrepancy comes from the JITing process, because if I time
> it without using the @time macro, it works from the first run.
>
> julia> test2(x) = sum( [t^2 for t in x] )
> WARNING: Method definition test2(Any) in module Main at REPL[68]:1
> overwritten at REPL[71]:1.
> test2 (generic function with 1 method)
>
> julia> tic(); for i = 1:1; test2(r); end; toc()/1
> elapsed time: 3.090764498 seconds
> 0.0003090764498
>
> About the memory footprint: test2 first constructs the inner vector and
> then calls sum.
>
>> since the type was not inferred the zero-element could not be created.
>
> The sum of an empty set or vector is undefined; it is not zero.
> You can rewrite it in a more explicit way:
>
> test3(r) = begin
>     total = Float64(0)
>     for t in r total += t; end
> end
>
> On Thursday, September 22, 2016 at 10:50:39 PM UTC+3, Patrick Kofod
> Mogensen wrote:
>>
>> I've seen the same, and the answer I got at the JuliaLang gitter channel
>> was that it could not be inferred because r could be of length 0, and in
>> that case, the return type could not be inferred. My Julia-fu is too weak
>> to then explain why the comprehension would be able to infer the return
>> type.
>>
>> On Thursday, September 22, 2016 at 9:27:37 PM UTC+2, Stefan Karpinski
>> wrote:
>>>
>>> I see the same, yet:
>>>
>>> julia> r = rand(10^5);
>>>
>>> julia> @time test1(r)
>>>   0.000246 seconds (7 allocations: 208 bytes)
>>> 33375.54531253989
>>>
>>> julia> @time test2(r)
>>>   0.001029 seconds (7 allocations: 781.500 KB)
>>> 33375.54531253966
>>>
>>> So test1 is efficient, despite the @code_warntype output. Not sure what's up.
>>>
>>> On Thu, Sep 22, 2016 at 2:21 PM, Christoph Ortner <christop...@gmail.com>
>>> wrote:
>>>
>>>> I hope that there is something I am missing, or that I am making a
>>>> mistake in the following example:
>>>>
>>>> r = rand(10)
>>>> test1(r) = sum( t^2 for t in r )
>>>> test2(r) = sum( [t^2 for t in r] )
>>>> @code_warntype test1(r)  # return type Any is inferred
>>>> @code_warntype test2(r)  # return type Float64 is inferred
>>>>
>>>> This caused a problem for me, beyond execution speed: I used a
>>>> generator to create the elements for a comprehension; since the type was
>>>> not inferred, the zero-element could not be created.
>>>>
>>>> Is this a known issue?
Re: [julia-users] Generators vs Comprehensions, Type-stability?
On my side both functions perform equally, although test2 had to be timed twice to get the same performance.

julia> test2(x) = sum( [t^2 for t in x] )

julia> @time test2(r)
  0.017423 seconds (13.22 k allocations: 1.339 MB)

julia> @time test2(r)
  0.000332 seconds (9 allocations: 781.531 KB)

I think the discrepancy comes from the JITing process, because if I time it without using the @time macro, it works from the first run.

julia> test2(x) = sum( [t^2 for t in x] )
WARNING: Method definition test2(Any) in module Main at REPL[68]:1
overwritten at REPL[71]:1.
test2 (generic function with 1 method)

julia> tic(); for i = 1:1; test2(r); end; toc()/1
elapsed time: 3.090764498 seconds
0.0003090764498

About the memory footprint: test2 first constructs the inner vector and then calls sum.

> since the type was not inferred the zero-element could not be created.

The sum of an empty set or vector is undefined; it is not zero.
You can rewrite it in a more explicit way:

test3(r) = begin
    total = Float64(0)
    for t in r total += t; end
end

On Thursday, September 22, 2016 at 10:50:39 PM UTC+3, Patrick Kofod Mogensen wrote:
>
> I've seen the same, and the answer I got at the JuliaLang gitter channel
> was that it could not be inferred because r could be of length 0, and in
> that case, the return type could not be inferred. My Julia-fu is too weak
> to then explain why the comprehension would be able to infer the return
> type.
>
> On Thursday, September 22, 2016 at 9:27:37 PM UTC+2, Stefan Karpinski
> wrote:
>>
>> I see the same, yet:
>>
>> julia> r = rand(10^5);
>>
>> julia> @time test1(r)
>>   0.000246 seconds (7 allocations: 208 bytes)
>> 33375.54531253989
>>
>> julia> @time test2(r)
>>   0.001029 seconds (7 allocations: 781.500 KB)
>> 33375.54531253966
>>
>> So test1 is efficient, despite the @code_warntype output. Not sure what's up.
>>
>> On Thu, Sep 22, 2016 at 2:21 PM, Christoph Ortner wrote:
>>
>>> I hope that there is something I am missing, or that I am making a
>>> mistake in the following example:
>>>
>>> r = rand(10)
>>> test1(r) = sum( t^2 for t in r )
>>> test2(r) = sum( [t^2 for t in r] )
>>> @code_warntype test1(r)  # return type Any is inferred
>>> @code_warntype test2(r)  # return type Float64 is inferred
>>>
>>> This caused a problem for me, beyond execution speed: I used a generator
>>> to create the elements for a comprehension; since the type was not inferred,
>>> the zero-element could not be created.
>>>
>>> Is this a known issue?
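As a side note on the test3 in this thread: as written it neither squares its elements nor returns the accumulator (the final `end` returns the `for` loop's value, which is `nothing`). A corrected sketch that matches what test1 and test2 compute:

```julia
# test3 from the thread, corrected: square each element and return the
# accumulator (the original version returned the `for` loop's value,
# which is `nothing`, and summed t rather than t^2).
function test3(r)
    total = 0.0
    for t in r
        total += t^2
    end
    return total
end

test3([1.0, 2.0, 3.0])    # 1 + 4 + 9 = 14
```

Writing the accumulator explicitly also settles the empty-input question from the thread: an empty `r` simply returns the initial 0.0.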
[julia-users] Julia lang design pattern for objects that manage their own data
I am pondering how to achieve the following functionality elegantly in Julia:

1) I read camera parameters from an ini file into an object of type Camera.
2) I want to monitor the ini file that was used to create the object for any changes, say every second, and update the object's internal parameters if a change occurred.

In C++, or any other OO language that supports simple threading, this is a no-brainer: just define an internal method of class Camera that runs in a different thread and does exactly that, i.e., polls for changes every second and updates accordingly. It is also very easy to make it thread-safe with an object lock.

What would be an elegant Julia equivalent? One way is to define a function poll_threaded() that accepts an object of type Camera and starts a task that monitors the filesystem for changes and updates when one occurs. The problem is that when the object is no longer referenced in the main application, the updating task still holds a reference to it, so effectively it will never be GC'ed.

Any ideas?
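One sketch of a GC-friendly poller, assuming modern Julia: hold the camera only through a WeakRef inside a Timer callback, and stop the timer once the object has been collected. `Camera`, `reload!`, and `watch` are illustrative names, not an existing API, and the reload is stubbed out.

```julia
# Sketch: a polling watcher that holds only a WeakRef to the camera, so
# the watcher does not keep the object alive for the GC.
# `Camera`, `reload!`, and `watch` are illustrative names.
mutable struct Camera
    inifile::String
    mtime::Float64
end

# Stub: a real implementation would re-parse the ini file here.
reload!(cam::Camera) = (cam.mtime = mtime(cam.inifile))

function watch(cam::Camera; interval = 1.0)
    wref = WeakRef(cam)
    return Timer(interval; interval = interval) do timer
        c = wref.value
        if c === nothing
            close(timer)                 # object was collected: stop polling
        elseif mtime(c.inifile) > c.mtime
            reload!(c)                   # ini file changed on disk
        end
    end
end
```

Closing the timer from inside its own callback is what releases the last strong reference the watcher machinery holds, so nothing outlives the Camera but the (now-dead) WeakRef.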
[julia-users] Debuging with Juno
Did anyone manage to get that working? I am getting an error: Base.active_repl not defined. However, debugging somewhat works in the Julia shell. I personally think that the ability to comfortably step into functions, including what are considered system packages, is crucial for the language to take off. So does anyone have any clue as to why this is happening?
[julia-users] Re: How would you use Julia in a real-world R&D setup?
Thank you for the time you took to answer. How do you go about debugging and inspecting? And how do you make sure that the changes you made get compiled, and that you are not accidentally running a previously imported version of a function?

On Thursday, September 15, 2016 at 10:11:21 PM UTC+3, Tsur Herman wrote:
>
> Hi, I am new here.
> I just started playing around with Julia; my background is in Matlab, C++,
> and GPU computing (writing kernels).
>
> I am trying to figure out what will be good practice for rapid prototyping.
> How would you use Julia and the Juno IDE for a research and development
> project that will end up having numerous files, many lines of code, and a
> basic GUI?
>
> Please share your experience, thanks.
> Tsur
[julia-users] How would you use Julia in a real-world R&D setup?
Hi, I am new here.
I just started playing around with Julia; my background is in Matlab, C++, and GPU computing (writing kernels).

I am trying to figure out what will be good practice for rapid prototyping. How would you use Julia and the Juno IDE for a research and development project that will end up having numerous files, many lines of code, and a basic GUI?

Please share your experience, thanks.
Tsur