Hi Ferran,
First of all, it is so much easier for people to help you if you post the
code that isn't working.
Best,
Patrick
On Monday, November 14, 2016 at 9:35:03 AM UTC+1, Ferran Mazzanti wrote:
>
> Hi again,
>
> thanks Scott. That doesn't work on my Ubuntu machine. Looks
My take: make the move or don't.
Stuff like forwarding posts only creates confusion, and people start
answering things that aren't seen by the person asking, because of the
one-way-street-thing.
On Wednesday, November 9, 2016 at 2:24:18 AM UTC+1, Valentin Churavy wrote:
>
> `julia-dev` has now
Does that work for you? I have to write
A .= (*).(A,B)
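For reference, a minimal sketch of why the rewrite helps (Julia 0.5-era; from 0.6 on, `.*` fuses too, so `A .= A .* B` works directly):

```julia
A = [1.0 2.0; 3.0 4.0]
B = [2.0 2.0; 2.0 2.0]

# Writing * as a broadcast call lets it fuse with .= into a single
# in-place loop, so no temporary array is allocated:
A .= (*).(A, B)
```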
On Wednesday, November 2, 2016 at 3:51:54 AM UTC+1, Chris Rackauckas wrote:
>
> It's the other way around. .* won't fuse because it's still an operator.
> .= will. If you want .* to fuse, you can instead do:
>
> A .= *.(A,B)
>
> since this
ce-107-julia.html
>
> Are you having difficulty with access?
>
>
> Brock Palen
> 1 (989) 277-6075
> bro...@mlds-networks.com
> www.mlds-networks.com
> Websites, Linux, Hosting, Joomla, Consulting
>
>
>
> > On Oct 10, 2016, at 6:34 AM, Patrick
Is the mentioned blogpost live? :)
On Saturday, October 8, 2016 at 4:46:02 PM UTC+2, Stefan Karpinski wrote:
>
> . Looking forward to hearing the episode! Thanks, Brock.
>
> On Wed, Oct 5, 2016 at 6:45 PM, Brock Palen > wrote:
>
>> Thank you everyone for the support
As the error says, they both export a function called rle, so it is not
possible to know which one you're trying to call if you don't qualify
them. Qualifying means writing "package name dot" and then the function, as
seen below:
module A
export f
f(x::Int64) = x
end
module B
export f
f(x::Int64) = 2x  # (truncated in the archive; any definition will do)
end
A.f(1)  # qualified call, so there is no ambiguity
I still don't quite get why a) inference between the generator and the
comprehension is different, and b) why inference went down the drain when I
added the type annotation for the return value in my example above... Sorry
if the answer is in this discussion somewhere!
On Friday, September 23,
And worst of all, no Julia-speedometer at
http://nirajkadu.me/index.php/about/ either!
On Thursday, September 22, 2016 at 9:23:39 PM UTC+2, Stefan Karpinski wrote:
>
> Yikes... recycled static IP address :|
>
> On Thu, Sep 22, 2016 at 1:02 PM, mmh
> wrote:
>
>>
There might be a perfectly valid explanation for this, but this also
surprises me.
r = rand(10)
f(x) = x^2
test1(r) = sum( f(t) for t in r )
test2(r) = sum( [f(t) for t in r] )
@code_warntype test1(r) # return type Any is inferred
@code_warntype test2(r) # return type Float64 is inferred
I've seen the same, and the answer I got at the JuliaLang gitter channel
was that it could not be inferred because r could be of length 0, and in
that case, the return type could not be inferred. My Julia-fu is too weak
to then explain why the comprehension would be able to infer the return
How does this sync with the "original website"? I mean, what if something
changes on the original website?
On Thursday, September 22, 2016 at 8:35:38 AM UTC+2, Ismael Venegas
Castelló wrote:
>
> I forgot, you guys can see the staging domain here:
>
>
>- http://julialanges.github.io
>
>
Wonderful, congrats to everyone, and good luck towards v1.0!
On Tuesday, September 20, 2016 at 11:08:44 AM UTC+2, Tony Kelman wrote:
>
> At long last, we can announce the final release of Julia 0.5.0! See the
> release notes at
> https://github.com/JuliaLang/julia/blob/release-0.5/NEWS.md for
This surprised me as well, where did you find this syntax?
On Monday, September 12, 2016 at 1:59:33 PM UTC+2, DNF wrote:
>
> I haven't looked very closely at your code, but a brief look reveals that
> you are defining your functions in a very unusual way. Two examples:
>
> function
s hitting something like
> this https://github.com/JuliaLang/julia/issues/15276
> Smaller repro
> function c!{T}(::T,P)
> if length(P)>2
> maxV = one(T)
> d = x->maxV
> end
> P
> end
>
>
>
> On Friday, September 9, 2016 at 1:02:29 PM UTC+3
So, I am kind of confused here. In my code, a maxV = maximum(V) is labeled
as Core.Box in @code_warntype, but if I remove a line after the line where
maxV is calculated, it is correctly labelled as eltype(V). Can anyone
explain what happens/what I am doing wrong here? This is not the whole
Hmm... No 4K resolution?
Anyway, great package!
On Monday, September 5, 2016 at 12:36:30 AM UTC+2, Kristoffer Carlsson
wrote:
>
> Hello everyone,
>
> I would like to announce my REPL enhancing package PimpMyREPL. The
> package aims to be for the Julia REPL what Pimp My Ride was for cars :)
>
Since this isn't Github I cannot :+1:, but great stuff!
On Friday, August 19, 2016 at 6:56:38 AM UTC+2, Viral Shah wrote:
>
> I have uploaded Julia-0.5 binaries for Power8 here. These are built with
> the latest openblas (that passes all julia tests) and hence there is no
> need to use ATLAS.
>
At Github :)
https://github.com/stevengj/PyCall.jl/issues/new for PyCall
or
https://github.com/stevengj/PyPlot.jl/issues/new for PyPlot
On Thursday, August 18, 2016 at 9:42:49 PM UTC+2, Ferran Mazzanti wrote:
>
> Yes of course... how do I do that?
> Thanks,
> Ferran.
>
> On Thursday, August
1
>
> On Tuesday, August 16, 2016 at 4:52:09 PM UTC+2, Patrick Kofod Mogensen
> wrote:
>>
>> julia> using Compat
>>
>> julia> abstract Optimizer
>>
>> julia> immutable NelderMead <: Optimizer
>> end
>
julia> using Compat
julia> abstract Optimizer
julia> immutable NelderMead <: Optimizer
end
julia> immutable OptimizationState{T <: Optimizer}
iteration::Int
value::Float64
g_norm::Float64
Hello,
We recently made some changes to Optim.jl that unfortunately broke a line
in DSGE.jl. I was actually unaware that OptimizationTrace was exported, so
I didn't think about deprecating it. After Tony Kelman pointed this out to
me, I tried to make a fix. Basically we have a typealias
t; Hello,
>>> in the README it says, "for more information, see the documentation".
>>> Is the documentation online available? I cannot find it.
>>>
>>> Uwe
>>>
>>> On Saturday, August 13, 2016 at 10:07:57 PM UTC+2, Patrick Kofod
>>> Mog
The vast majority of Julia packages have links to the documentation tied to
the badges. I could make it even more visible by giving the table a
"Resource links" header, or I could make "see [the documentation]" a link.
I don't think it should be necessary to have a Documentation-section.
On
We are happy to announce that Optim v0.6 has been released.
Since v0.5, we've seen some great additions and changes to the package. Much
of this has happened with the help of new and old contributors, and
therefore
this version tag should really be dedicated to them. We've had nine
contributors
I get what you're saying, but master might not be ready to be tagged in all
packages, so it might not be as simple as "just tag it already".
Congrats, and good looking benchmarks you're sporting :)
On Tuesday, July 26, 2016 at 11:07:44 PM UTC+2, Bill Hart wrote:
>
> Hi all,
>
> We are pleased to release version 0.5 of Nemo, our computer algebra
> package written in Julia.
>
> Instructions on how to get Nemo are on our website:
>
>
Vector{T} means a vector where the elements are of type T. If T = Vector{S}
then every element in the original vector is itself a vector with elements
of type S. In your case, S = Float64 and T = Vector{S} = Vector{Float64}. I
think it's a pretty good idea to make sure you understand this, as
On Thursday, July 14, 2016 at 10:14:38 PM UTC+2, Tim Holy wrote:
>
> *Until we get "diagonal dispatch,"* I think the only way to do this is to
> expand
> the number of arguments in the function:
>
Triangular dispatch, right? Diagonal dispatch is exactly what we have :)
>
This may not be the advice you are looking for, but do you have a specific
problem you want to solve? I started programming by reading texts online
about C - and it killed my enthusiasm. People are different, but I didn't
"get" programming before I had a problem I actually wanted solved!
On
I think Michael might be aware of this, but was hoping that the formula
interface was as flexible as in R, which it is not. I believe there used to
be support for more transformations in ~, but it was removed.
On Saturday, June 25, 2016 at 1:12:27 PM UTC+2, Douglas Bates wrote:
>
> On Friday,
We need more information, but Marco might be using nightlies on JuliaBox. I
get "kernel died" at least, no matter what code I try to run.
On Friday, June 24, 2016 at 7:30:48 PM UTC+2, cdm wrote:
>
>
> what system are you on (Mac, Linux, Win ...)?
>
> have you run Pkg.update() lately ?
>
>
> more
https://github.com/JuliaLang/julia/pull/17089 should fix it!
On Friday, June 24, 2016 at 9:38:04 AM UTC+2, Lutfullah Tomak wrote:
>
> While experimenting this, I don't know if it is intentional but [[M] [M];]
> makes a sparse matrix of matrices. :)
>
> On Friday, June 24, 2016 at 5:39:51 AM
Not intended, thanks for noticing!
On Friday, June 24, 2016 at 9:38:04 AM UTC+2, Lutfullah Tomak wrote:
>
> While experimenting this, I don't know if it is intentional but [[M] [M];]
> makes a sparse matrix of matrices. :)
>
> On Friday, June 24, 2016 at 5:39:51 AM UTC+3, Dan wrote:
>>
>> [[M]
But that is exactly what Forgy's code does
expression && do_something
is the same as
if expression
do_something
end
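A runnable sketch of that equivalence:

```julia
x = 2

# Short-circuit form: the right-hand side only runs if the condition holds.
x > 1 && println("big")

# Equivalent if-block:
if x > 1
    println("big")
end
```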
On Wednesday, June 8, 2016 at 10:44:18 AM UTC+2, Anonymous wrote:
>
> I think maybe I didn't write exactly what I meant to. The expression in
> my original post should have
The contributors are happy to announce that Optim.jl version 0.5 has now
been released.
Notice that this version is breaking in a few different ways, so please
update your code as necessary, and don't hesitate to ask over at
https://github.com/JuliaOpt/Optim.jl/issues or join Kristoffer
File an issue :)
On Tuesday, May 31, 2016 at 6:59:24 PM UTC+2, Boylan, Ross wrote:
>
> I believe Calculus.hessian does finite differences. I had not expected it
> to compute both h[i, j] and h[j, i], since they should be the same. That
> is, AFAIK they are the same mathematically;
Same as v0.4, or same as before you changed the code?
On Sunday, May 8, 2016 at 8:55:00 PM UTC+2, feza wrote:
>
> roughly the same speed.
>
> On Sunday, May 8, 2016 at 2:44:19 PM UTC-4, Patrick Kofod Mogensen wrote:
>>
>> out of curiosity, what about v0.5?
>
>
out of curiosity, what about v0.5?
For what it's worth, it runs in about 3-4 seconds on my computer on the latest
v0.4.
CPU : Intel(R) Core(TM) i7-4600U CPU @ 2.10GHz
On Sunday, May 8, 2016 at 10:33:14 AM UTC+2, Patrick Kofod Mogensen wrote:
>
> As for the v0.5 performance (which is horrible), I think it's the boxing
As for the v0.5 performance (which is horrible), I think it's the boxing
issue with closure https://github.com/JuliaLang/julia/issues/15276 . Right?
On Sunday, May 8, 2016 at 10:29:59 AM UTC+2, STAR0SS wrote:
>
> You are using a lot of vectorized operations and Julia isn't as good as
> matlab
What Kristoffer and Miles said, but specifically all algorithm constructors
accept a linesearch! keyword, and in the case of ConjugateGradient we have
(see
https://github.com/JuliaOpt/Optim.jl/blob/1cf3fbe053edeb8a388ff0d03af75386ad2e3457/src/cg.jl#L106
)
immutable ConjugateGradient{T} <:
The contributors are happy to announce that Optim.jl version 0.4.5 has now
been released.
This is the last patch before the next minor version bump to v0.5.0. The
next release will remove deprecations introduced in this patch, and will
also include breaking changes to the API. This patch
Can you try to show a small script that reproduces the error?
On Tuesday, May 3, 2016 at 6:16:43 AM UTC+2, new to Julia wrote:
>
> Thanks for your reply.
> A=sparse(nrow,ncol,vals_final);
> nrow, ncol, vals_final are all 1-dimensional vectors.
>
> On Monday, May 2, 2016 at 11:10:12 PM UTC-5,
> if overload is the right word in the previous sentence.
>
> Interestingly things work fine if I first define my function and then load
> the package.
>
> I will further look into this.
>
> On Monday, April 18, 2016 at 10:15:02 AM UTC+2, Patrick Kofod Mogensen wrote:
>
Say I have something like
type MyType{T}
a::T
b::Vector{T}
c::Vector{Vector{T}}
d::Vector{Matrix{T}}
end
MyType(3, [3, 3], Vector{Int64}[[3,3], [4,4]], Matrix{Int64}[[3 1],[1 3]])
MyType(3, [3, 3], Vector[[3,3], [4,4]], Matrix[[3 1],[1 3]])
The first call to the MyType
What do you mean by slow and a lot of memory here?
julia> using Benchmarks
julia> @benchmark gini(ab,cd,ef)
Benchmark Results
Time per evaluation: 3.68 ms [3.54 ms, 3.82 ms]
Proportion of time in GC: 0.12% [0.00%, 0.84%]
Memory allocated:
+1 for !
On Saturday, April 9, 2016 at 8:09:23 AM UTC+2, Christof Stocker wrote:
>
> I also prefer the ! so I know it modifies an existing plot. If I don't use
> a ! then I expect to create a new one
>
> On 2016-04-09 02:52, Daniel Carrera wrote:
>
> Ok. Thanks for the explanation.
>
> Cheers,
> s
> 11-element Array{AbstractString,1}:
> "LightGraphs"
> "QDXML"
> "LispSyntax"
> "RobustShortestPath"
> "QuantEcon"
> "BayesNets"
> "Metis"
If you're willing to use Plots, here's the syntax (might be very close to
Pyplot, not sure). It uses the pyplot backend
using Plots
function rosenbrock(x::Vector)
return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
end
default(size=(600,600), fc=:heat)
x, y = -1.5:0.1:1.5, -1.5:0.1:1.5
z =
Very cool that you added user-defined functions (and AD). Congrats on the
new version.
On Saturday, February 27, 2016 at 11:14:16 PM UTC+1, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.12.
>
> This release features a complete rewrite of JuMP's automatic
>
maud wrote:
>
> That site is just a tech demo so we can see what it would be like to
> switch to Discourse; it wouldn't be in actual use until a decision was made
> to switch over to it.
>
> On Sat, Feb 20, 2016 at 3:10 PM Patrick Kofod Mogensen <
> patrick@gmail.com > wr
Isn't it dead? Latest posts seem to be from October last year, or am I looking
at the wrong forum? Obviously, people don't mind this format.
Is it possible to create a colormap as described in the title using
Colors.jl? I couldn't seem to find anything on Colormaps and
transparency/opacity.
Best,
Patrick
ll cases. At least in
> plotting/rendering using RGB is prevalent.
>
> On Friday, February 12, 2016 at 10:48:06 AM UTC+1, Patrick Kofod Mogensen
> wrote:
>>
>> Is it possible to create a colormap as described in the title using
>> Colors.jl? I couldn't seem to find anything on Co
That is because it is not registered, you have to use
Pkg.clone("https://github.com/johnmyleswhite/Benchmarks.jl")
On Thursday, February 4, 2016 at 11:56:32 AM UTC+1, Lytu wrote:
>
> The problem is that it is not possible for me to add Benchmarks.jl package
> when i do Pkg.add("Benchmarks"), i
How did you verify that? Or are you guessing? Did you @show the iteration
counter's increment? If not, how do you know it starts high?
On Sunday, January 24, 2016 at 7:45:48 PM UTC+1, grande...@gmail.com wrote:
>
> thanks but that s not the issue. for some reasons. the number of
> iterations is
What do you study?
On Sunday, January 17, 2016 at 8:47:05 PM UTC+1, noufal n wrote:
>
> I'm a student and i wish to study and contribute to julia community. As a
> part my professional degree i like to do a project in julia. Need
> suggestions
>
I don't have a computer nearby, but it is not surprising that calling npreg
takes a lot of the total time, since it is basically most of your script.
Which lines in npreg are the slow ones? Or maybe I didn't see where you put
the @time's.
Did you try to use the profiler? Where is the time spent?
I get it, thanks for pointing that out.
On Monday, December 14, 2015 at 11:38:14 AM UTC+1, FQ wrote:
>
> The code does exit at some point, returning a 2049x2049 complex matrix.
> note that delta phi is reassigned inside the loop at some point.
>
> On December 14, 2015 at 10:31,
Are you sure this version works? I get Δϕ == 1.0 in all iterations, so I
can't see how it would ever exit the while loop...
On Monday, December 14, 2015 at 4:35:43 AM UTC+1, 博陈 wrote:
>
> This is my first julia code, I am happy it did the right thing, but
> compared with the Matlab code that
+1 Can reproduce.
On Thursday, December 10, 2015 at 11:04:57 AM UTC+1, Tomas Lycken wrote:
>
> I don't know if this is an issue with JuliaBox or with IJulia (that's why
> I'm posting here, rather than as an issue on either project) but I can't
> get a working 0.4.2 kernel running in JuliaBox -
Fellow economist here, great stuff! I'm curious to see what choices were
made, and how it compares to other DSGE toolboxes and tools out there.
Is it going to be registered in METADATA? If so, would a name like DSGE be
"allowed"?
On Thursday, December 3, 2015 at 3:05:57 PM UTC+1, Spencer Lyon
Of course you were right that I needed to make the eltypes a bit more
flexible, so ForwardDiff can do its magic.
With that fixed, it's still not working though. It runs for ~10 seconds,
and then it just says:
julia> j = ForwardDiff.jacobian(phi, P)
Killed
I tried to search for "Killed" in
I know help is much easier to provide with a MWE, but it must be the
explicit BLAS-calls, I'm sure. I will re-write the function using Julia
functions only. If it doesn't work, I'll have to try to reduce my mapping
to a simpler version I can show here. Thanks for taking your time so far.
On
So I have a map from a set of strategies of players to the best response of
each player given the original strategies (a discrete probability
distribution for each player). I want to find the jacobian of this mapping,
because I need to calculate the spectral radius of it. I am using finite
).
>
> On Saturday, November 28, 2015 at 5:41:55 PM UTC+1, Patrick Kofod Mogensen
> wrote:
>>
>> So I have a map from a set of strategies of players to the best response
>> of each player given the original strategies (a discrete probability
>> distribution for each p
I know this is not a solution to the problem, but what do you mean by "lost
code" ? Couldn't you just copy the text in the browser to a local editor?
On Monday, November 23, 2015 at 6:09:45 AM UTC+1, Thomas Moore wrote:
>
> I've been really enjoying using JuliaBox for some simulations related to
Well, try running
isa(zeros(2,4,3), AbstractMatrix)
The problem here is that an (Abstract)Matrix is a 2-dimensional array.
On Monday, November 16, 2015 at 12:54:15 PM UTC+1, Evan wrote:
>
> For a 2d array the following works:
>
> type mytype{T}
> var :: AbstractMatrix{T}
> end
>
> julia> t
get is not the array, but the reference to the array,
correct? So when I did a duM2 = copy(duM), all that is copied is the
references, not the arrays at duM[1] and duM[2].
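In case it helps others, a minimal sketch of the difference, using a plain
vector of vectors in place of the original `duM`:

```julia
duM = [zeros(2), ones(2)]   # stand-in for the original Matrix{Matrix{Float64}}

shallow = copy(duM)         # copies only the outer array; inner arrays are shared
deep    = deepcopy(duM)     # recursively copies the inner arrays as well

shallow[1][1] = 99.0
duM[1][1]   # 99.0 -- the shallow copy shares the inner arrays
deep[1][1]  # 0.0  -- deepcopy made independent copies
```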
On Friday, November 6, 2015 at 7:59:08 AM UTC+1, Patrick Kofod Mogensen
wrote:
>
> So I have a Matrix{Matrix{F
the "highlight"... to remove it I think either 0 width or 0
> alpha will work. You should be able to pass RGBA colors, so that includes
> opacity.
>
> On Thu, Nov 5, 2015 at 10:40 AM, Patrick Kofod Mogensen <
> patrick@gmail.com > wrote:
>
>> I've tried loo
So I have a Matrix{Matrix{Float64}}, and I want to access the first Matrix,
and multiply it by a parameter vector.
nX = 175
X = 0:1:nX-1
duM = Matrix{Float64}[[zeros(nX) -0.001*X],
[-ones(nX) zeros(nX)]];
Say I have a parameter Vector b = [9; 2], and want to perform uM[:,
I've tried looking through the documentation, and I feel like maybe I am
just not understanding the types in Color.jl well enough, but say I have a
scatter plot, how do I
a) Remove the border
b) Make the points opaque?
Looking at http://gadflyjl.org/geom_point.html I am not quite sure why the
Great read.
On Tuesday, November 3, 2015 at 6:43:24 AM UTC+1, Jason Merrill wrote:
>
> I decided to write a blog post based on the Kakuro puzzle solver problem
> from this thread:
>
> http://squishythinking.com/2015/11/02/optimizing-kakuro-in-julia/
>
> On Wednesday, October 21, 2015 at 7:08:35
I know this does _not_ answer your question, but are you really sure you
want to do this? Can't you just push your variables to an array, and access
them as x[1], x[2], ... ?
On Tuesday, November 3, 2015 at 2:40:34 PM UTC+1, cormu...@mac.com wrote:
>
> I can't work out the syntax for creating
It is currently intended, but as far as I understand, singleton dimensions
should be dropped automatically in v0.5.
I'll get in the "Thank you!" line. I'm still learning every day, but I use
Julia for pretty much everything (Economics Ph.D. student here). So yeah,
thanks a lot - and a special thanks to Andreas Noack for transmitting the
Julia-bug before leaving UCPH.
On Monday, October 26, 2015 at 4:30:26
I'm not aware of any packages in metadata, but you can perhaps get
inspiration from
https://github.com/andreasnoack/Civecm.jl (Cointegrated VAR models)
https://github.com/tomaskrehlik/VARmodels.jl (VAR models)
Neither are from repositories registered in METADATA.jl, so don't expect
too much
I think most people simply tag a final v0.3 supported version, and then
develop for v0.4 from then. Alternatively you can use the Compat package.
On Friday, October 16, 2015 at 10:40:10 PM UTC-4, feza wrote:
>
> On a related note. What is the recommended procedure for dealing with
>
Such exciting news! v0.3 -> v0.4 contains so many great fixes and additions
to the language. Looking forward to following the work on v0.5!
Thanks to all devs for making this happen!
On Friday, October 9, 2015 at 7:20:32 AM UTC-4, Tony Kelman wrote:
>
> At long last, we can announce the final
//gist.github.com/SimonDanisch/c614016d9f9551f8f64c#file-jl
>
> When in doubt, @allocated, @time and @code_llvm and @code_native are your
> friends to find out what's going on under the hood.
>
>
> On Wednesday, October 7, 2015 at 10:25:00 PM UTC+2, Patrick Kofod Mogensen wrote:
Oh, let me answer my own question 2. I did not realize I had not given
types to phat and vector in type ML. Sanity restored! I still can't find
the answer to question 1 though.
On Wednesday, October 7, 2015 at 4:25:00 PM UTC-4, Patrick Kofod Mogensen
wrote:
>
> So I asked a questio
dthedocs.org/en/latest/manual/arrays/#indexing> ?
>
> On Wednesday, October 7, 2015 at 10:25:00 PM UTC+2, Patrick Kofod Mogensen wrote:
>>
>> So I asked a question over at
>> https://github.com/lindahua/Devectorize.jl/issues/48#issuecomment-146307811
>> and it seems that I h
So I asked a question over at
https://github.com/lindahua/Devectorize.jl/issues/48#issuecomment-146307811
and it seems that I have got something completely wrong.
It seems that the following
index = rand(8000)
phat = zeros(8000)
phat = 1./(1+exp(-index))
binds the output of the calculation
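In current syntax (the original predates `.=`), the distinction is between
rebinding the name and filling the existing array:

```julia
index = rand(8000)
phat  = zeros(8000)

phat = 1 ./ (1 .+ exp.(-index))   # rebinds phat to a freshly allocated array;
                                  # the zeros(8000) buffer is simply discarded

phat .= 1 ./ (1 .+ exp.(-index))  # fills the existing array in place instead
```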
That was supposed to be "A * B only allocates..." right?
On Tuesday, October 6, 2015 at 1:52:18 PM UTC-4, Steven G. Johnson wrote:
>
>
>
> On Tuesday, October 6, 2015 at 12:29:04 PM UTC-4, Christoph Ortner wrote:
>>
>> a *= b is equivalent to a = a * b, which allocates a temporary variable I
>>
Well, I guess your table pretty much shows it, right? It seems that it
allocates a lot of temporary memory to carry out the calculations.
On Tuesday, October 6, 2015 at 10:28:29 AM UTC-4, Lionel du Peloux wrote:
>
> Dear all,
>
> I'm looking for the fastest way to do element-wise vector
I have not tested it, but I really should. Why? I still work with a lot of
people who are MATLAB-users, and are not going to transition. This could
potentially help Julia <-> MATLAB collaborators a lot!
On Friday, October 2, 2015 at 7:45:49 PM UTC-4, Tracy Wadleigh wrote:
>
> I'm pleased to
Say I have an array with some results of a computation, and I have to do
this computation many times. I want to use the memory I've already
allocated, but I want to start from zero. Is the best way to zero an array:
A = rand(3000,3000)
A[:] = 0.
Or is there a better way?
Patrick
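For what it's worth, `fill!` is the idiomatic in-place way:

```julia
A = rand(3000, 3000)

# Reuses A's memory; no new array is allocated.
fill!(A, 0.0)

# A[:] = 0.0 does the same thing; fill! just states the intent directly.
```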
>> On Monday, October 5, 2015 at 10:21:08 AM UTC-4, Patrick Kofod Mogensen
>> wrote:
>>>
>>> Say I have an array with some results of a computation, and I have to do
>>> this computation many times. I want to use the memory I've already
>>> all
r... part.
On Friday, September 18, 2015 at 8:00:08 AM UTC-4, Steven G. Johnson wrote:
>
>
>
> On Thursday, September 17, 2015 at 11:38:29 PM UTC-4, Patrick Kofod
> Mogensen wrote:
>>
>> Yes, I did indeed just notice I had made a mistake with the ;.
>>
>> My
Gotcha, thought you said that [A, B] worked now.
On Wednesday, September 30, 2015 at 8:12:31 PM UTC-4, Steven G. Johnson
wrote:
>
> As I said, [A; B] concatenates vertically into a single matrix. For an
> array of arrays, do typeof(A)[A, B]. In a future Julia version you will be
> able to do
As Stefan - I love it. Imo, stuff like this really enhances the user
experience and productivity.
On Tuesday, September 29, 2015 at 3:24:13 PM UTC-4, Tim Holy wrote:
>
> Simon Danisch and I are pleased to introduce FileIO.jl
> (https://github.com/JuliaIO/FileIO.jl), a package for centralizing
Is it because you use it many times? Just define a clinspace that collects
the linspace.
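Something like this (`clinspace` is just a hypothetical name; on Julia 1.0+
`linspace` was replaced by `range`, hence the `range` call below):

```julia
# A linspace that returns a plain Vector, matlab-style:
clinspace(a, b, n) = collect(range(a, stop = b, length = n))

x = clinspace(0.0, 1.0, 5)   # 5-element Vector{Float64}
```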
On Tuesday, September 29, 2015 at 6:44:43 PM UTC-4, feza wrote:
>
> In matlab x = linspace(0,1,n) creates a vector of floats of length n. In
> julia it seems like the only way to do this is to use x =
No:
julia> logspace(0,3,5)
5-element Array{Float64,1}:
1.0
5.62341
31.6228
177.828
1000.0
On Tuesday, September 29, 2015 at 8:50:47 PM UTC-4, Luke Stagner wrote:
>
> Thats interesting. Does logspace also return a range?
>
> On Tuesday, September 29, 2015 at 5:43:28 PM
Well, at least we have learned that people are looking for a good IDE for
Julia :)
On Friday, September 18, 2015 at 3:45:41 PM UTC-4, Daniel Carrera wrote:
>
> Hello,
>
> Just for fun, does anyone want to help me model the distribution of posts
> per thread in the julia-users list?
>
> The
I know the title is extremely general, so let me explain myself.
I am an economist, and I often work with discrete choice models. Some agent
can take an action, a, indexed by j = 1 ... J. When solving the model,
there are often a lot of features which depend on the action - a transition
e sure I'm not jumping all over the place) or
what does "locality" mean in this context? Sorry if this is basic stuff.
On Thursday, September 17, 2015 at 11:25:45 PM UTC-4, Steven G. Johnson
wrote:
>
>
>
> On Thursday, September 17, 2015 at 11:11:57 PM UTC-4, Patrick Kofod
trices, then a
> vector of matrices sounds fine. Are there computations that require the
> full stacked matrix?
>
> On Thu, Sep 17, 2015 at 11:11 PM, Patrick Kofod Mogensen <
> patrick@gmail.com > wrote:
>
>> I know the title is extremely general, so let me explain my
all temporaries, writing out the loops is the best way at the
> moment.
>
> On Mon, Sep 14, 2015 at 9:36 PM, Patrick Kofod Mogensen <
> patrick@gmail.com > wrote:
>
>> Thank you both. Maybe I was somewhat unclear still, but it should be
>>
>>
>> for
Super exciting work! And I must say: great with a video along with the ANN.
(Good idea with the playlist as well, Viral)
On Tuesday, September 15, 2015 at 11:29:37 AM UTC-4, Tim Holy wrote:
>
> Since yesterday I've added a couple of new features:
> - A "Save As" button to the toolbar so you can