I emailed John Myles White a few months back about merging. One of his
concerns was that OnlineStats looks more ambitious, but he wanted to work
together. I was focused on the implementation progress to show off for my
oral prelim (I'm a PhD student in statistics), so nothing ever came of it.
I'd love to have a discussion about any redesign/merge. It sounds to me
like OnlineStats is the more natural destination for any merge... But maybe
John should weigh in?
On Apr 26, 2015, at 8:51 AM, Josh Day <emailj...@gmail.com> wrote:
I've been working on https://github.com/joshday/OnlineStats.jl. The
src/README shows the implementation progress. It's partially a playground
for my research (on online algorithms for statistics).
Please take a look and let me know what you think, but my regression stuff
is currently in
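To give a concrete flavor of the "online algorithms for statistics" mentioned above, here is a minimal single-pass mean in the spirit of OnlineStats.jl. This is a sketch only, not the package's actual API; the type name `OnlineMean` and the `fit!` signature are made up for illustration.

```julia
# Minimal sketch of an online (single-pass) statistic: each new
# observation updates the estimate in O(1) memory, so the full data
# never needs to be held at once.
mutable struct OnlineMean
    n::Int        # number of observations seen so far
    μ::Float64    # running mean
end
OnlineMean() = OnlineMean(0, 0.0)

function fit!(o::OnlineMean, x::Real)
    o.n += 1
    o.μ += (x - o.μ) / o.n    # incremental mean update
    return o
end

o = OnlineMean()
for x in (1.0, 2.0, 3.0)
    fit!(o, x)
end
o.μ    # 2.0
```

The same update pattern extends to variances, quantile estimates, and regression coefficients, which is what makes a single shared interface attractive for a merge.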
> …keyword-arguments
>
> --Tim
>
> On Thursday, January 07, 2016 11:02:03 AM Josh Day wrote:
> > Suppose I have a function that takes several arguments of different
> > types and each has a default value. What is the best way to specify all
> > possible methods…

…you could turn this into a macro fairly easily.

> On Thu, Jan 7, 2016 at 2:02 PM, Josh Day <emailj...@gmail.com> wrote:
>
>> Suppose I have a function that takes several arguments of different types
>> and each has a default value. What is the best…
Suppose I have a function that takes several arguments of different types
and each has a default value. What is the best way to specify all possible
methods where a user can specify an argument without entering the defaults
that come before it? I don't want to force a user to remember the…
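For reference, keyword arguments sidestep this problem: a caller can set any argument by name without restating the defaults declared before it. A minimal sketch (the function name `fitmodel` and its defaults are made up for illustration):

```julia
# Keyword arguments: each can be overridden by name, in any order,
# without supplying the ones declared before it.
function fitmodel(x::Vector{Float64};
                  weight::Float64 = 1.0,
                  maxit::Int = 100,
                  tol::Float64 = 1e-6)
    return (weight, maxit, tol)    # stand-in body: just echo the settings
end

fitmodel(zeros(3); tol = 1e-8)    # (1.0, 100, 1.0e-8): only `tol` changed
```

The macro suggestion in the reply would instead generate all the positional methods; keywords are the idiomatic choice when you don't need to dispatch on those arguments (keyword arguments do not participate in dispatch).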
Sounds great, I'm in. I'd be especially interested in talking about
LearnBase.
I'm working on https://github.com/joshday/SparseRegression.jl for penalized
regression problems. I'm still optimizing the code, but a test set of that
size is not a problem.
julia> n, p = 1000, 262144; x = randn(n, p); y = x*randn(p) + randn(n);
julia> @time o = SparseReg(x, y, …
I think a lot of what you're looking for already exists. It's just that
things like "run a regression according to variable names" wouldn't belong
in base Julia. If you haven't already, I'd take a look at StatsBase.jl,
DataFrames.jl, and GLM.jl.
Yes, I just saw it and searched for it, which led me to this post.
How about releasing it as a Julia package? You can handle your (Julia)
dependencies with the REQUIRE file.
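For example, a REQUIRE file at the top level of the package lists a supported Julia version on the first line and one dependency per line after that, with optional version lower bounds (the package names below are placeholders):

```
julia 0.4
DataFrames
Distributions 0.8.0
```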
I'm working on a reproducible PhD thesis, and that's the route I'm taking. All my Julia code and TeX files will be in the repository, and the whole thing can be built from scratch.
I believe that's iTerm being used
with https://github.com/Keno/TerminalExtensions.jl. Depending on the
complexity of your plots, https://github.com/Evizero/UnicodePlots.jl may be
sufficient for you.
On Saturday, April 2, 2016 at 6:45:22 AM UTC-4, Oliver Schulz wrote:
>
> Hi,
>
> I'm looking…
Hi all. Suppose I have a large matrix with entries {0, 1} and I'd like to
keep storage small by using a BitMatrix. Are there any tricks to squeeze
better performance out of BitMatrix multiplication? I'm also curious about
the performance difference between Matrix{Bool} and Matrix{Int8}.
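Not an authoritative answer, but one thing worth checking: multiplying two BitMatrix values goes through the generic fallback kernel, while widening to Float64 first takes the BLAS path, which is often faster for large sizes despite the conversion cost. A sketch (sizes are arbitrary):

```julia
using Random

Random.seed!(1)
A, B = bitrand(200, 200), bitrand(200, 200)

C_generic = A * B                      # generic matmul on BitMatrix
C_blas    = Float64.(A) * Float64.(B)  # widen first, then the BLAS path

# The two agree exactly: every entry is a small integer count,
# representable without error in Float64.
C_generic == C_blas
```

On the `Matrix{Int8}` question, note one hazard: matrix multiplication of `Int8` matrices accumulates in `Int8`, so any inner product above 127 silently overflows; `Matrix{Bool}` promotes to `Int` and avoids that.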
There might be some related issues already reported, but I didn't see one that was quite what I'm seeing. I can clone my repos outside of Julia normally, but Pkg.clone in Julia results in repeatedly getting asked for my username/password. I don't get an error if I provide an incorrect…
I completely forgot about 2FA. That's the culprit. Thanks for the pointer
and sorry for the noise.
Sure thing: https://github.com/JuliaLang/julia/issues/18908
After much confusion, I discovered that BLAS.syr! gives incorrect results
when using a view. Is this a bug, or is it not recommended to use views
with BLAS functions?
julia> a1 = zeros(2,2); a2 = zeros(2, 2); x = randn(5, 2);
julia> BLAS.syr!('U', 1.0, view(x, 1, :), a1)
2×2 …
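If the view is the culprit (the row view above has a non-unit stride), one workaround is to materialize it into a contiguous Vector before handing it to BLAS. A sketch of that workaround only; it makes no claim about which Julia versions are affected:

```julia
using LinearAlgebra, Random

Random.seed!(1)
x = randn(5, 2)
a = zeros(2, 2)
v = collect(view(x, 1, :))    # contiguous copy of row 1 (unit stride)

BLAS.syr!('U', 1.0, v, a)     # a += v * v', upper triangle only

# Check the upper triangle against the explicit outer product:
a[1, 1] ≈ x[1, 1]^2 && a[1, 2] ≈ x[1, 1] * x[1, 2]
```

Since `'U'` was requested, only the upper triangle of `a` is written; the strictly lower part stays zero.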