Re: [julia-users] How to determine which functions to overload, or, who is at the bottom of the function chain?
This was a very helpful answer. Thank you very much for responding. Cheers, Colin On 16 October 2016 at 20:23, Milan Bouchet-Valat wrote: > On Saturday 15 October 2016 at 20:36 -0700, colintbow...@gmail.com > wrote: > > Hi all, > > > > Twice now I've thought I had overloaded the appropriate functions for > > a new type, only to observe apparent inconsistencies in the way the > > new type behaves. Of course, there were no inconsistencies. Instead, > > the observed behaviour stemmed from overloading a function that is > > not at the bottom of the function chain. The two examples where I > > stuffed up were: > > > > 1) overloading Base.< instead of overloading Base.isless, and > In this case, the help is quite explicit: > help?> < > search: < <= << <: .< .<= .<< > > <(x, y) > > Less-than comparison operator. New numeric types should implement this > function for two arguments of the new type. Because of the behavior of > floating-point NaN values, < implements a partial order. Types with a > canonical partial order should implement <, and types with a canonical > total > order should implement isless. > > > 2) overloading Base.string(x) instead of overloading Base.show(io, > > x). > This one is a bit trickier, since the printing code is complex, and not > completely stabilized yet. Though the help still gives some hints: > > help?> string > search: string String stringmime Cstring Cwstring RevString RepString > readstring > > string(xs...) > > Create a string from any values using the print function. > > So the more fundamental function to override is print(). The help for > print() says it falls back to show() if there's no print() method for a > given type. So if you don't have a special need for print(), override > show(). > > > My question is this: What is the community's best solution/resource > > for knowing which functions are at the bottom of the chain and thus > > are the ones that need to be overloaded for a new type? 
> In general, look at the help for a function. If there's no answer > (which is most likely a gap in the documentation and should be > reported), look for it in the manual. The latter can always be useful, > even if the help already gives a reply. > > But documentation is perfectible, so do not hesitate to ask questions > and suggest enhancements (ideally via pull requests when you have found > out how it works). > > > Regards > > > > Cheers and thanks in advance to all responders, > > > > Colin >
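To make the advice in this thread concrete, here is a minimal hypothetical sketch (the `Money` type is invented for illustration, Julia 0.4/0.5-era syntax): defining only `isless` and `show` for a new type is enough to get `<`, sorting, `print`, and `string` through the fallback chain.

```julia
# Hypothetical type, not from the thread: ordering and printing both come
# from the two "bottom of the chain" methods discussed above.
immutable Money
    cents::Int
end

# isless is the total-order hook: <, >, sort, maximum, etc. fall back to it.
Base.isless(a::Money, b::Money) = isless(a.cents, b.cents)

# show is the printing hook: print falls back to show, and string uses print.
Base.show(io::IO, m::Money) = print(io, m.cents, " cents")

Money(199) < Money(250)    # true, via the isless fallback
string(Money(199))         # "199 cents", via the print -> show fallback
```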
Re: [julia-users] Re: How to build a range of -Inf to Inf
This is interesting, thanks. I didn't realise you could re-assign j inside the index like that. A very neat notational trick. But I agree that looking forward the filter! option is probably best. I'll adjust my code accordingly, many thanks. As an aside, every time I post code to this list I get useful suggestions that make me a better Julia coder. I really appreciate it! Cheers, Colin On 4 June 2016 at 07:32, Steven G. Johnson wrote: > > > On Thursday, June 2, 2016 at 11:42:32 PM UTC-4, colint...@gmail.com wrote: >> >> function Base.filter!{T}(x::AbstractVector{T}, r::BasicInterval{T}) >> for n = length(x):-1:1 >> !in(x[n], r) && deleteat!(x, n) >> end >> return(x) >> end >> > > I'm pretty sure this implementation has O(n^2) complexity, because > deleteat! for an array has to actually move all of the elements to fill the > hole. > > To get an O(n) algorithm, you could do something like: > > j = 0 > for i = 1:length(x) > if x[i] in r > x[j += 1] = x[i] > end > end > return resize!(x, j) > > Or you could just do filter!(x -> x in r, x), which is fast in Julia 0.5 >
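Steven's O(n) suggestion, written out as a self-contained sketch against a plain predicate rather than the poster's `BasicInterval` type (which is not shown in full in this thread):

```julia
# In-place compaction: j trails i, surviving elements are copied left, and
# the vector is shrunk once at the end instead of calling deleteat! per element.
function compact!(x::AbstractVector, keep::Function)
    j = 0
    for i = 1:length(x)
        if keep(x[i])
            x[j += 1] = x[i]
        end
    end
    return resize!(x, j)
end

v = [1, 5, 2, 8, 3]
compact!(v, xi -> 2 <= xi <= 5)   # v is now [5, 2, 3]
```

This is the same single pass that `filter!(x -> x in r, x)` performs internally in Julia 0.5, which is why the built-in is the better long-term choice.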
Re: [julia-users] Re: How to build a range of -Inf to Inf
Yep. Agreed. Totally redundant :-) Cheers, Colin On 4 June 2016 at 05:11, DNF wrote: > Your in-method is a bit odd: > > Base.in{T}(x::T, r::BasicInterval{T}) = (r.start <= x <= r.stop) ? true : > false > > Why don't you just write > > Base.in{T}(x::T, r::BasicInterval{T}) = (r.start <= x <= r.stop) > ? > > The extra stuff is redundant. >
Re: [julia-users] Re: Deprecation warnings using julia on Atom
Hi Jeremy, This sounded promising, as at some point I did have IPython installed so I could use Jacob Quinn's Sublime-IJulia package. It is possible that something was left behind. However, unfortunately I haven't been able to find anything that looks suspicious (I'm on Ubuntu 14.04). For those who are interested, I was only able to locate the following files on my OS: sudo find /home/colin -name "*ipython*" -print /home/colin/.local/share/Trash/files/v0.3_OLD/IJulia/deps/ipython.jl /home/colin/.julia/v0.3/IJulia/deps/ipython.jl and sudo find /usr -name "*ipython*" -print /usr/share/app-install/icons/ipython.svg /usr/share/app-install/icons/ipython3.svg /usr/share/app-install/desktop/ipython3:ipython3.desktop /usr/share/app-install/desktop/ipython:ipython.desktop /usr/share/app-install/desktop/ipython-qtconsole:ipython-qtconsole.desktop /usr/share/app-install/desktop/ipython3-qtconsole:ipython3-qtconsole.desktop /usr/lib/calibre/calibre/utils/ipython.py Searches for "*Jupyter*" did not turn up anything. Cheers and thanks again for responding. Colin On 31 October 2015 at 21:29, Jeremy Cavanagh wrote: > Hi Colin, > > I was having the same problems while trying to get julia to work in atom > and was hoping that this thread would provide a solution. However, I was > also trying to get the hydrogen to work as well but kept getting an error > whose cause I could not figure out, so I posted to an issue: > > https://github.com/willwhitney/hydrogen/issues/127#issuecomment-152661805 > > After following this great advice not only does hydrogen run without > errors, but, the deprecation warnings that you and I were getting also > disappeared. I am assuming that you are working on OS X. > > Hope this helps. 
> > > On Wednesday, October 28, 2015 at 12:57:43 AM UTC+1, colint...@gmail.com > wrote: >> >> Hi all, >> >> I'm using Julia v0.4 with the Atom package, on Atom 1.0 with the packages >> ink, julia-client, and language-julia (and I'm really enjoying this as an >> IDE solution). >> >> I can toggle the Julia console in Atom, and enter code directly into it >> without any errors or warnings. However, as soon as I try to evaluate a >> line of code from the Atom editor, I get a large number of deprecation >> warnings, either of the form: >> >> WARNING: Base.Uint8 is deprecated, use UInt8 instead. >> likely near no file:422 >> >> or >> >> WARNING: Base.Uint8 is deprecated, use UInt8 instead. >> likely near no file:422 >> in skip at /home/colin/.julia/v0.4/LNR/src/LNR.jl:171 >> >> Has anyone else encountered this and is there a fix? I had a look through >> the LNR source, and there is nothing in it that should be triggering a >> deprecation warning, nor is there even a line 171 (it only goes up to about >> line 130). >> >> Note, I can just ignore the deprecation warnings, and continue on working >> without a problem, so this isn't an urgent issue. Just wondering if I've >> stuffed up the install process somehow. >> >> Cheers, >> >> Colin >> >
Re: [julia-users] Deprecation warnings using julia on Atom
I've not heard of lightbox. Do you mean LightTable? I used to use LightTable with Mike Innes' Juno package, but made the switch to Atom because my understanding is that from v0.4 onwards, Mike will be concentrating his efforts there. To be honest, I'm really happy with Atom and the Julia packages. I think Mike has already got it as friendly and feature-rich as Juno was (as long as you're on the Master branch of the packages, that is). And the deprecation warnings are not a deal-breaker - more a minor inconvenience. Cheers, Colin On 29 October 2015 at 03:45, endowdly wrote: > Have you tried using lightbox? > > I ran into similar problems with atom, which I prefer due to speed. I > could get lightbox to play well with the repl, however, and "using jewel". > > On Tuesday, October 27, 2015 at 10:03:15 PM UTC-4, colint...@gmail.com > wrote: >> >> Done. Precompilation occurred at the REPL, and I didn't have to do it >> within Atom. Verified packages are all on Master. Unfortunately, I'm still >> getting all of the original deprecation warnings when evaluating from the >> editor in Atom. >> >> I'm happy to pursue this further, but am equally happy to wait it out if >> you're running short on ideas or time. >> >> Cheers and thanks, >> >> Colin >> >> On Wednesday, 28 October 2015 12:24:38 UTC+11, Spencer Russell wrote: >>> >>> You’re running into another known issue: >>> https://github.com/JuliaLang/julia/issues/12941 >>> >>> Try opening a normal REPL in the terminal and run “using Atom” to >>> trigger the precompilation, then it shouldn’t need to happen when you run >>> from Atom. >>> >>> -s >>> >>> >>> On Oct 27, 2015, at 9:09 PM, colint...@gmail.com wrote: >>> >>> Ah... understood. Many thanks. >>> >>> I'm afraid I'm still not getting the desired result however. After >>> running checkout on "Atom", "CodeTools" and "JuliaParser" I run >>> Pkg.status() and can verify I'm on the master branch for all 3. 
So I fire >>> up Atom again, try to evaluate in the editor, and get the following error: >>> >>> INFO: Recompiling stale cache file >>> /home/colin/.julia/lib/v0.4/JuliaParser.ji >>> Julia has stopped: 1, null >>> >>> So I close down Atom, open it again, and try to evaluate in the editor >>> again. This time I get: >>> >>> INFO: Recompiling stale cache file >>> /home/colin/.julia/lib/v0.4/CodeTools.ji >>> Julia has stopped: 1, null >>> >>> Close down Atom one more time, re-open it and try again. Now I get: >>> >>> INFO: Recompiling stale cache file /home/colin/.julia/lib/v0.4/Atom.ji >>> >>> but everything is now working fine. Problem solved? Unfortunately not. I >>> restart Atom again and I'm back to all the deprecation warnings, even >>> though Pkg.status() indicates I'm still on the master branch for Atom, >>> CodeTools, and JuliaParser. >>> >>> Apologies for the long message. Also, if this is one of those things >>> that will resolve itself over the next couple of weeks as changes from >>> master are pushed to the more stable branches, then I'm happy to ignore the >>> warnings for the time being and not waste anyone's time any further with >>> what is essentially a minor inconvenience. >>> >>> Cheers, >>> >>> Colin >>> >>> >>> >>> >>> On Wednesday, 28 October 2015 11:37:21 UTC+11, Spencer Russell wrote: `Pkg.checkout(…)` operates on an already-installed package, so it must be run after `Pkg.add(…)`. -s On Oct 27, 2015, at 8:31 PM, colint...@gmail.com wrote: I suppose I could clone the master branch. Is that a bad idea? On Wednesday, 28 October 2015 11:30:43 UTC+11, colint...@gmail.com wrote: > > Thanks for responding. > > Pkg.checkout("Atom") gives me the error: > > ERROR: Atom is not a git repo > in checkout at pkg/entry.jl:203 > in anonymous at pkg/dir.jl:31 > in cd at file.jl:22 > in cd at pkg/dir.jl:31 > in checkout at pkg.jl:37 > > (I originally did try using Pkg.checkout as per the instructions, but > got this error, and so went with Pkg.add instead). 
> > Any thoughts or is this a bug? > > Cheers, > > Colin > > > On Wednesday, 28 October 2015 11:23:30 UTC+11, Jonathan Malmaud wrote: >> >> You want to be on the master versions: >> >> Pkg.checkout("Atom") >> Pkg.checkout("CodeTools") >> > >>>
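For later readers, the working order implied by Jonathan's and Spencer's replies is the following (Julia 0.4-era `Pkg` commands; package names are the ones mentioned in the thread):

```julia
# Pkg.checkout operates on an already-installed package, so add comes first.
Pkg.add("Atom")
Pkg.checkout("Atom")         # now succeeds: ~/.julia/v0.4/Atom is a git repo
Pkg.checkout("CodeTools")
Pkg.checkout("JuliaParser")
Pkg.status()                 # confirm the three packages report "master"
```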
Re: [julia-users] Converting a string to a custom type results in a function that is not type stable
If you're interested, the source is here: https://github.com/colintbowers/DependentBootstrap.jl I haven't tried to make it into a registered package yet as I'm still tinkering with it a fair bit. But I think I'm nearly there. It is fairly comprehensive for univariate bootstrapping, ie lots of methods and block length selection procedures, but I haven't really thought about extending it to multivariate data yet. Maybe later this year. Cheers, Colin On 24 June 2015 at 23:42, John Myles White johnmyleswh...@gmail.com wrote: Excited you're working on dependent data bootstraps. I implemented one just the other day since it could be useful for analyzing benchmark data. Would be great to have other methods to do out. -- John On Wednesday, June 24, 2015 at 5:31:52 AM UTC-4, Milan Bouchet-Valat wrote: On Wednesday 24 June 2015 at 01:18 -0700, colintbow...@gmail.com wrote: Hi all, I've got an issue I don't really like in one of my modules, and I was wondering the best thing (if anything) to do about it. The module is for dependent bootstraps, but the problem is more of a project design issue. I have a type for each bootstrap method, e.g. `StationaryBootstrap`, `MovingBlockBootstrap` e.t.c. and they are all sub-types of an abstract `BootstrapMethod`. Then I have functions that can be called over these different bootstrap method types and multiple dispatch will make sure the appropriate code is called, e.g `bootindices(::StationaryBootstrap)` or `bootindices(::MovingBlockBootstrap)`. This all works nicely. I now want to define some keyword wrapper type functions in the module for users who don't want to learn much about how the types within the module work. For example, my wrapper might let the user describe the bootstrap procedure they want with a string, eg `bootindices(...; bootstrapmethod::ASCIIString="stationary")`. The keyword wrapper is called, and I have a variable `bootstrapMethod` which is a string. 
I need to convert it into the appropriate bootstrap method type so I can then call the appropriate method via multiple dispatch. Currently I have one function that does this and looks something like this: function boot_string_to_type(x::ASCIIString) x == "stationary" && return(StationaryBootstrap()) x == "movingBlock" && return(MovingBlock()) ... end The problem is that this function is not type-stable. Should I be worried? Does anyone have a better way of dealing with this kind of issue? Maybe something involving symbols or expressions, or anonymous functions etc? Note, the situation can sometimes get quite a bit more complicated than this, with multiple key-word arguments, all of which need to be combined into the constructor for the relevant type. I think the most Julian way to do this is to have users pass a type instead of a string. They would write bootindices{T<:BootstrapMethod}(...; method::Type{T}=StationaryBootstrap) That's as simple for the user as passing a string (since autocompletion will work), you don't need to define boot_string_to_type(), and it's type-stable. This is the idiom used by fit() in StatsBase.jl (and GLM.jl) to choose which type of model should be estimated. Hope this helps PS: in the cases where you still want to pass a string as an argument, rather than a type, consider using symbols instead, as it is more efficient.
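A compact, self-contained sketch of the idiom Milan describes, with the parametric keyword signature simplified to a plain `DataType` argument (the type names are from the thread; the method bodies are invented placeholders):

```julia
abstract BootstrapMethod
immutable StationaryBootstrap  <: BootstrapMethod end
immutable MovingBlockBootstrap <: BootstrapMethod end

# Placeholder worker methods; multiple dispatch picks one per concrete type.
bootindices(bm::StationaryBootstrap,  n::Int) = "stationary indices, n=$n"
bootindices(bm::MovingBlockBootstrap, n::Int) = "moving-block indices, n=$n"

# Keyword wrapper: the user passes the *type*, so no string parsing is needed
# and autocompletion works on the type name.
bootindices(n::Int; method::DataType=StationaryBootstrap) = bootindices(method(), n)

bootindices(5)                                # defaults to StationaryBootstrap
bootindices(5, method=MovingBlockBootstrap)   # dispatches to the other method
```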
Re: [julia-users] Converting a string to a custom type results in a function that is not type stable
Thanks for responding. Yes, I think I will do it your way. I was initially hoping there would be a neat way to duplicate how R would do it, ie, with keyword arguments typically always strings or numbers since this is what many new users will be familiar with, but maybe in the end it would just be setting a bad example. Cheers, Colin On 24 June 2015 at 19:31, Milan Bouchet-Valat nalimi...@club.fr wrote: On Wednesday 24 June 2015 at 01:18 -0700, colintbow...@gmail.com wrote: Hi all, I've got an issue I don't really like in one of my modules, and I was wondering the best thing (if anything) to do about it. The module is for dependent bootstraps, but the problem is more of a project design issue. I have a type for each bootstrap method, e.g. `StationaryBootstrap`, `MovingBlockBootstrap` e.t.c. and they are all sub-types of an abstract `BootstrapMethod`. Then I have functions that can be called over these different bootstrap method types and multiple dispatch will make sure the appropriate code is called, e.g `bootindices(::StationaryBootstrap)` or `bootindices(::MovingBlockBootstrap)`. This all works nicely. I now want to define some keyword wrapper type functions in the module for users who don't want to learn much about how the types within the module work. For example, my wrapper might let the user describe the bootstrap procedure they want with a string, eg `bootindices(...; bootstrapmethod::ASCIIString="stationary")`. The keyword wrapper is called, and I have a variable `bootstrapMethod` which is a string. I need to convert it into the appropriate bootstrap method type so I can then call the appropriate method via multiple dispatch. Currently I have one function that does this and looks something like this: function boot_string_to_type(x::ASCIIString) x == "stationary" && return(StationaryBootstrap()) x == "movingBlock" && return(MovingBlock()) ... end The problem is that this function is not type-stable. Should I be worried? 
Does anyone have a better way of dealing with this kind of issue? Maybe something involving symbols or expressions, or anonymous functions etc? Note, the situation can sometimes get quite a bit more complicated than this, with multiple key-word arguments, all of which need to be combined into the constructor for the relevant type. I think the most Julian way to do this is to have users pass a type instead of a string. They would write bootindices{T<:BootstrapMethod}(...; method::Type{T}=StationaryBootstrap) That's as simple for the user as passing a string (since autocompletion will work), you don't need to define boot_string_to_type(), and it's type-stable. This is the idiom used by fit() in StatsBase.jl (and GLM.jl) to choose which type of model should be estimated. Hope this helps PS: in the cases where you still want to pass a string as an argument, rather than a type, consider using symbols instead, as it is more efficient.
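For the symbol variant mentioned in the PS, a hedged sketch (invented names; note the boundary function is still type-unstable, it only replaces string comparisons with cheaper symbol comparisons and confines the instability to one small function):

```julia
abstract BootstrapMethod
immutable StationaryBootstrap  <: BootstrapMethod end
immutable MovingBlockBootstrap <: BootstrapMethod end

# Still not type-stable (the return type depends on the *value* of s), but the
# instability lives only at this user-facing boundary; everything downstream
# dispatches on the concrete type.
function boot_symbol_to_type(s::Symbol)
    s == :stationary  && return StationaryBootstrap()
    s == :movingblock && return MovingBlockBootstrap()
    error("unknown bootstrap method: $s")
end

isa(boot_symbol_to_type(:stationary), StationaryBootstrap)   # true
```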
Re: [julia-users] Re: When are function arguments going to be inlined?
Looks good to me! Cheers, Colin On 24 June 2015 at 23:04, Andrew owen...@gmail.com wrote: Yup, I like that idea too. Multiple dispatch is quite useful here. This is my implementation. abstract UtilityFunction immutable CRRA <: UtilityFunction sigmac::Float64 sigmal::Float64 psi::Float64 end immutable LogUtility <: UtilityFunction end function u(UF::CRRA,consump,labor) sigmac = UF.sigmac sigmal = UF.sigmal psi = UF.psi (consump > 0 && labor < 1) && return consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal) return -Inf end function u(UF::LogUtility,consump,labor) consump > 0 && return log(consump) return -Inf end function test1(UF::UtilityFunction) for i = 1:100 u(UF,-1. + 1/i, .5) end end function test2(UF::UtilityFunction) for i = 1:100 u(UF,-1. + 1/i ,.5) end end UF1 = CRRA(4,2,1) UF2 = LogUtility() @time test1(UF1) @time test2(UF2) elapsed time: 0.005229738 seconds (80 bytes allocated) elapsed time: 0.004894504 seconds (80 bytes allocated) On Tuesday, June 23, 2015 at 8:28:28 PM UTC-4, Colin Bowers wrote: Yes, that is pretty much how I would do it, although, as I said in my previous post, I would set `UtilityFunction` to an abstract type, and then define my actual utility function immutable, say `MyCustomUtilityFunc`, as a subtype of `UtilityFunction`. That way you can easily add different types of utility functions later without having to change your existing code. 
By the way, just for the record, a fair test between the two approaches would be as follows: abstract UtilityFunction immutable MyCustomUtilityFunction <: UtilityFunction sigmac::Float64 sigmal::Float64 psi::Float64 end u4(sigmac, sigmal, psi, consump,labor) = consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal) u(UF::MyCustomUtilityFunction, consump, labor) = consump.^(1-UF.sigmac)/(1-UF.sigmac) + UF.psi*(1-labor).^(1-UF.sigmal)/(1-UF.sigmal) function test1(sigmac, sigmal, psi) for i = 1:100 u4(sigmac, sigmal, psi, 1.0 + 1/i, 0.5) end end function test2(UF::UtilityFunction) for i = 1:100 u(UF, 1.0 + 1/i, 0.5) end end UF = MyCustomUtilityFunction(4,2,1) test1(4.0, 2.0, 1.0) test2(UF) On my machine that returns: elapsed time: 0.090409383 seconds (80 bytes allocated) elapsed time: 0.091065473 seconds (80 bytes allocated) ie, no significant performance difference On 24 June 2015 at 00:56, Andrew owe...@gmail.com wrote: Thanks, this is all very useful. I think I am going to back away from using the @anon functions at the moment, so I'll postpone my idea to encapsulate the functions into a type. Instead, I will just pass a parameter type to an externally defined (not nested) function. I had thought this would be slow (see my question here https://groups.google.com/forum/#!topic/julia-users/6U-otLSx7B0 ), but I did a little testing. immutable UtilityFunction sigmac::Float64 sigmal::Float64 psi::Float64 end function u(UF::UtilityFunction,consump,labor) sigmac = UF.sigmac sigmal = UF.sigmal psi = UF.psi consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal) end function u4(consump,labor) consump.^(1-4)/(1-4) + 1*(1-labor).^(1-2)/(1-2) end function test1(UF) for i = 1:100 u4(1. + 1/i, .5) end end function test2(UF) for i = 1:100 u(UF,1. 
+ 1/i ,.5) end end UF = UtilityFunction(4,2,1) @time test1(UF) @time test2(UF) elapsed time: 0.068562617 seconds (80 bytes allocated) elapsed time: 0.139422608 seconds (80 bytes allocated) So, even versus the extreme case where I built the constants into the function, the slowdown is not huge. I assume @anon would have similar performance to the constants-built-in case, which is nice. However, I want to be able to share my Julia code with others who aren't very experienced with the language, so I'd be uncomfortable asking them to understand the workings of FastAnonymous. It's useful to know about in case I need the speedup in my own personal code though. On Tuesday, June 23, 2015 at 8:51:25 AM UTC-4, colint...@gmail.com wrote: Yes, this proves to be an issue for me sometimes too. I asked a StackOverflow question on this topic a few months ago and got a very interesting response, as well as some interesting links. See here: http://stackoverflow.com/questions/28356437/julia-compiler-does-not-appear-to-optimize-when-a-function-is-passed-a-function As a general rule, if the function you are passing round is very simple and gets called a lot, then you will really notice the performance overhead. In other cases where the function is more complicated, or is not called that often, the overhead will be barely measurable. If the number of functions that you want to pass around is not that large, one way around this is to use
Re: [julia-users] Re: Julia computing problem in a loop
My apologies for not responding sooner. I have been clearing a stack of referee reports off my desk. No problems. It's end of semester here in Australia so I've been buried under piles of marking anyway :-) Thus, I would be very interested in learning more about your work in coding the MCS in Julia. Please, keep me updated. The source is here: https://github.com/colintbowers/ForecastEval.jl It is under heavy development at the moment - and I'm not up to the MCS yet (working on Reality Check and SPA test at the moment). But I'll add another message to this thread once I think it is ready to be cloned. The fixed coefficient structural VAR version of the Gibbs sampler is running in Julia, but it is slow. If the source of this is available online I'll try and take a look at some point (although unlikely to be within the next few weeks). I am by no means an expert at writing fast Julia, but I like to think I'm starting to get a handle on it now (been using it full-time for close to a year). Also, I'd be interested in hearing how Mauro and my recommendations go in speeding up your code, once you get a chance to look at it again. I realised after my last post you can also save the memory allocation on my slice variable by using `sub`. On v0.3.x though, this is not guaranteed to improve performance for various reasons. But I think a recent thread here indicated that v0.4 may be getting close to an official release. Cheers, Colin On 25 June 2015 at 03:07, jamesmna...@gmail.com wrote: Dear Colin: Thanks for your comments. My apologies for not responding sooner. I have been clearing a stack of referee reports off my desk. Yes, I am fortunate to be a coauthor of Peter Hansen and Asgar Lunde. But at the moment, I have no plans to code the MCS into Julia. Peter and Asgar work mostly in Ox. Thus, I would be very interested in learning more about your work in coding the MCS in Julia. Please, keep me updated. 
My question was generated by my attempt to port MatLab code of Canova and Pérez Forero (2015, Estimating Overidentified, Non-Recursive, Time Varying Coefficients Structural VARs, Quantitative Economics, forthcoming) into Julia. Canova and Perez Forero propose a Metropolis within Gibbs sampler to estimate TVP-VARs with Leeper-Sims-Zha non-recursive identifications. The fixed coefficient structural VAR version of the Gibbs sampler is running in Julia, but it is slow. My thinking was to start with the fixed coefficient model as a way to learn Julia. Stay in touch. Best wishes. Jim On Wednesday, June 17, 2015 at 9:26:15 PM UTC-4, colint...@gmail.com wrote: Hi Jim, A couple of points: 1) Maybe I'm missing something, but you appear to be calculating the same inverse twice on every iteration of your loop. That is, inv(Om_e[(j-1)*nvar+1:j*nvar,:]) gets called twice. 2) As Mauro points out, memory allocation is currently triggered when slicing into 2d arrays on v0.3.x. You can read more about this at the following StackOverflow question: http://stackoverflow.com/questions/28271308/avoid-memory-allocation-when-indexing-an-array-in-julia. Since you are indexing with ranges, my understanding is that in v0.4 you should be able to avoid the allocation. In the meantime, you could try performing the slice once and assign it to a new variable on each iteration, and then use that variable in your matrix calls. 3) I'll strongly second Mauro's suggestion that you pass in to your function everything that is not explicitly defined as a global constant. This should provide a significant performance improvement. 
For more reading on this, check out the first item in the Performance Tips section of the official docs ( http://julia.readthedocs.org/en/latest/manual/performance-tips/) So taking all these things together, my version of your function would look something like this: function bb_update(bbj, bbcovj, capZ, nvar, Om_e, yyy) for j = 1:size(yyy, 2) currentInverse = inv(Om_e[(j-1)*nvar+1:j*nvar,:]) currentZSlice = capZ[(j-1)*nvar+1:j*nvar,:] bbcovj += currentZSlice' * currentInverse * currentZSlice bbj += currentZSlice' * currentInverse * yyy[:,j] end return (bbj, bbcovj) end A final question: are you planning on implementing the Model Confidence Set in Julia at any time soon (I'm making the possibly incorrect assumption that you're the same James Nason from Hansen, Lunde, Nason (2011))? Bit of a co-incidence, but I was hoping to implement the Model Confidence Set in Julia sometime in the next few weeks as part of a forecast evaluation package. If you're interested, I can point you to the github source once it is done. Cheers, Colin On Wednesday, 17 June 2015 14:11:49 UTC+10, james...@gmail.com wrote: Hi All: I am a novice using Julia. As a way to learn Julia, my project is to convert MatLab code that estimates Bayesian vector autoregressions. The estimator uses Gibbs
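The `sub` variant Colin alludes to in his follow-up would look something like the sketch below (an illustration only; as he notes, on v0.3 the SubArray machinery is not guaranteed to be faster):

```julia
# Same structure as the bb_update above, but the repeated index range is
# computed once and capZ is sliced via sub (a view) instead of a copying
# getindex. Variable names follow the thread's bb_update.
function bb_update_sub(bbj, bbcovj, capZ, nvar, Om_e, yyy)
    for j = 1:size(yyy, 2)
        rows = (j-1)*nvar+1 : j*nvar
        currentInverse = inv(Om_e[rows, :])   # still copies; inv allocates regardless
        Zj = sub(capZ, rows, :)               # view into capZ, no copy
        bbcovj += Zj' * currentInverse * Zj
        bbj += Zj' * currentInverse * yyy[:, j]
    end
    return (bbj, bbcovj)
end
```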
Re: [julia-users] Re: When are function arguments going to be inlined?
Yes, that is pretty much how I would do it, although, as I said in my previous post, I would set `UtilityFunction` to an abstract type, and then define my actual utility function immutable, say `MyCustomUtilityFunc`, as a subtype of `UtilityFunction`. That way you can easily add different types of utility functions later without having to change your existing code. By the way, just for the record, a fair test between the two approaches would be as follows: abstract UtilityFunction immutable MyCustomUtilityFunction <: UtilityFunction sigmac::Float64 sigmal::Float64 psi::Float64 end u4(sigmac, sigmal, psi, consump,labor) = consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal) u(UF::MyCustomUtilityFunction, consump, labor) = consump.^(1-UF.sigmac)/(1-UF.sigmac) + UF.psi*(1-labor).^(1-UF.sigmal)/(1-UF.sigmal) function test1(sigmac, sigmal, psi) for i = 1:100 u4(sigmac, sigmal, psi, 1.0 + 1/i, 0.5) end end function test2(UF::UtilityFunction) for i = 1:100 u(UF, 1.0 + 1/i, 0.5) end end UF = MyCustomUtilityFunction(4,2,1) test1(4.0, 2.0, 1.0) test2(UF) On my machine that returns: elapsed time: 0.090409383 seconds (80 bytes allocated) elapsed time: 0.091065473 seconds (80 bytes allocated) ie, no significant performance difference On 24 June 2015 at 00:56, Andrew owen...@gmail.com wrote: Thanks, this is all very useful. I think I am going to back away from using the @anon functions at the moment, so I'll postpone my idea to encapsulate the functions into a type. Instead, I will just pass a parameter type to an externally defined (not nested) function. I had thought this would be slow (see my question here https://groups.google.com/forum/#!topic/julia-users/6U-otLSx7B0 ), but I did a little testing. 
immutable UtilityFunction sigmac::Float64 sigmal::Float64 psi::Float64 end function u(UF::UtilityFunction,consump,labor) sigmac = UF.sigmac sigmal = UF.sigmal psi = UF.psi consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal) end function u4(consump,labor) consump.^(1-4)/(1-4) + 1*(1-labor).^(1-2)/(1-2) end function test1(UF) for i = 1:100 u4(1. + 1/i, .5) end end function test2(UF) for i = 1:100 u(UF,1. + 1/i ,.5) end end UF = UtilityFunction(4,2,1) @time test1(UF) @time test2(UF) elapsed time: 0.068562617 seconds (80 bytes allocated) elapsed time: 0.139422608 seconds (80 bytes allocated) So, even versus the extreme case where I built the constants into the function, the slowdown is not huge. I assume @anon would have similar performance to the constants-built-in case, which is nice. However, I want to be able to share my Julia code with others who aren't very experienced with the language, so I'd be uncomfortable asking them to understand the workings of FastAnonymous. It's useful to know about in case I need the speedup in my own personal code though. On Tuesday, June 23, 2015 at 8:51:25 AM UTC-4, colint...@gmail.com wrote: Yes, this proves to be an issue for me sometimes too. I asked a StackOverflow question on this topic a few months ago and got a very interesting response, as well as some interesting links. See here: http://stackoverflow.com/questions/28356437/julia-compiler-does-not-appear-to-optimize-when-a-function-is-passed-a-function As a general rule, if the function you are passing round is very simple and gets called a lot, then you will really notice the performance overhead. In other cases where the function is more complicated, or is not called that often, the overhead will be barely measurable. 
If the number of functions that you want to pass around is not that large, one way around this is to use types and multiple dispatch instead of functions, eg abstract UtilityFunctions type QuadraticUtility <: UtilityFunctions a::Float64 b::Float64 c::Float64 end evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c Now your function would be something like: function solveModel(f::UtilityFunctions, ...) and you would call evaluate at the appropriate place in the function body and multiple dispatch will take care of the rest. There is no performance overhead with this approach. Of course, if you want to be able to just pass in any arbitrary function that a user might think up, then this approach is not tenable. On Tuesday, 23 June 2015 01:07:25 UTC+10, Andrew wrote: I'm trying to write some abstract Julia code to solve a variety of economics models. Julia provides powerful abstraction tools which I think makes it very well-suited to this; however, I've read in several places that Julia doesn't yet know how to inline functions passed as arguments, hence code like function SolveModel(Utility::Function, ProductionTechnology::Function, ...) ... will be slow. I performed this very simple test. function ftest1() u(x) = log(x)
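Made self-contained, the dispatch idiom sketched in that reply looks like this (`QuadraticUtility` and `evaluate` are from the message; `LogUtil` and the one-line `solveModel` are invented here to show that the solver need not change when a new utility type is added):

```julia
abstract UtilityFunctions

type QuadraticUtility <: UtilityFunctions
    a::Float64
    b::Float64
    c::Float64
end
evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c

# A second, hypothetical utility added later: solveModel below is untouched.
type LogUtil <: UtilityFunctions end
evaluate(x::Number, f::LogUtil) = log(x)

# The solver accepts the abstract type; each concrete call site specializes,
# so there is no function-argument indirection to pay for.
solveModel(f::UtilityFunctions, x::Number) = evaluate(x, f)

solveModel(QuadraticUtility(1.0, 0.0, 2.0), 3.0)   # 11.0
solveModel(LogUtil(), 1.0)                         # 0.0
```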
Re: [julia-users] Re: Index into an array with an IntSet
Cool, works for me. Many thanks. On 16 June 2015 at 10:27, David Gold david.gol...@gmail.com wrote: `collect(myIntSet)` should also do it, I believe. On Monday, June 15, 2015 at 8:20:15 PM UTC-4, colint...@gmail.com wrote: Ah, I understand. Thanks for responding and pointing me to the appropriate pull request. So currently if we want to index with an IntSet, the best thing to do is probably just convert the IntSet to a Vector{Int} using something like: [ i for i in myIntSet ] yes? Cheers, Colin On Monday, 15 June 2015 13:15:55 UTC+10, Matt Bauman wrote: No, this isn't implemented in 0.4, either. It is something I've thought about, but IntSet's current semantics aren't quite right for the job. See: https://github.com/JuliaLang/julia/pull/10065#issuecomment-93853097 It may be worth splitting that PR out into the IndexSet package for this purpose.
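Both suggested conversions side by side, for reference (an IntSet iterates in sorted order, so either form yields a sorted Vector{Int} that can then be used for indexing):

```julia
s = IntSet([3, 1, 4])

v1 = [i for i in s]     # comprehension, as suggested in the thread
v2 = collect(s)         # David's suggestion; same result

v1 == v2 == [1, 3, 4]   # true; x[v1] then works where x[s] does not
```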