Please note that this is really a TOY nn project, actually a direct
translation of gigasquid's nn hello-world. It is ridiculous to compare it
with a library that delegates nn work to cuDNN. And it is a really old
version of affe, which ddosic has improved in the meantime but has not yet
released.
Small addition to this post:
There is a tiny library (a toy project) by ddosic, who built a neural
network with Neanderthal. It might be interesting as a benchmark of what
speed Neanderthal can reach (although it might not currently be a good
reflection of Neanderthal), versus larger packages with
This is amazing!! Thanks so much for releasing it. Very excited to dig in.
On Wed, Oct 12, 2016 at 7:44 PM, wrote:
> Hi,
> We've made cortex public:
>
> https://github.com/thinktopic/cortex
>
> Fork away, and we hope that this contributes to a growing ML community in
> Clojure.
Hi,
We've made cortex public:
https://github.com/thinktopic/cortex
Fork away, and we hope that this contributes to a growing ML community in
Clojure. Thoughts, ideas, feedback are welcome!
Cheers,
Jeff
On Saturday, October 8, 2016 at 6:00:21 PM UTC-6, je...@thinktopic.com wrote:
Hey,
I'm glad this came up. We were initially thinking we'd wait until the
API and design had stabilized before releasing Cortex, but there is enough
of value now that I think Kovas is right. We should release it.
Cortex has been a collaboration between ThinkTopic and Mike Anderson, and
it'
>
>
> If people build on the 'wrong' API, that's a good problem to have. The
> field is so in flux anyway. The problem can also be mitigated through
> minimalism in what is released in the beginning.
>
> This.
--
You received this message because you are subscribed to the Google
Groups "Clojure" group.
Hi Mike,
Thanks for the update.
> Opening the source is not entirely my decision, this is a collaboration
> with the Thinktopic folks (Jeff Rose et al.). I'm personally in favour of
> being pretty open about this stuff but I do think that it would be a
> mistake if people build too much stuff
Hi Kovas,
> One question:
>
> Is it possible to feed Neanderthal's matrix representation (the underlying
> bytes) into one of these other libraries, to obtain
> computations Neanderthal doesn't support?
>
There are two parts to that question, I think: 1) How can you make
Neanderthal work with
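To make the "underlying bytes" idea concrete, here is a plain-Java sketch (my own illustration, not Neanderthal's actual API): a dense column-major matrix held in a direct ByteBuffer can be viewed as a DoubleBuffer and handed to any library that accepts off-heap memory, with no copying, as long as both sides agree on element type, layout, and strides.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.DoubleBuffer;

public class SharedMatrix {
    public static void main(String[] args) {
        int m = 2, n = 3; // a 2x3 matrix, column-major, 8 bytes per double
        ByteBuffer raw = ByteBuffer.allocateDirect(m * n * 8)
                                   .order(ByteOrder.nativeOrder());
        DoubleBuffer mat = raw.asDoubleBuffer();
        // Fill entry (i, j) with the value i + 10*j using column-major indexing
        for (int j = 0; j < n; j++)
            for (int i = 0; i < m; i++)
                mat.put(j * m + i, i + 10.0 * j);
        // Another library given the same direct buffer sees the same memory:
        // no copy is needed, only agreement on layout and stride.
        System.out.println(mat.get(1 * m + 0)); // entry (0, 1) -> 10.0
    }
}
```

The second library would typically want the buffer rather than a `double[]`, precisely because a direct buffer has a stable native address.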
For those interested: here's some work on using deeplearning4j in Clojure
(no abstractions added, simply a port of the nd4j and deeplearning4j APIs
to 'clojuresque' functions and multimethods.)
https://github.com/engagor/dl4clj
If anybody wants to contribute, they're welcome of course!
On Fri
I think deeplearning4j is a contender for deep learning in Clojure. I have
not used it, but I repeatedly see the sponsored link on the Clojure
subreddit. Since nobody mentioned it, I thought of mentioning it.
On Fri, Oct 7, 2016 at 7:40 AM, kovas boguta wrote:
> On Thu, Oct 6, 2016 at 9:26 PM, Mikera
On Thu, Oct 6, 2016 at 9:26 PM, Mikera wrote:
>
> I'm hoping to work with Dragan to get core.matrix integration working with
> Neanderthal, now that Windows support is finally arriving. This would get
> you a few big advantages:
>
Yes, I can see how my problem relates to the core.matrix vision.
On Thu, Oct 6, 2016 at 9:20 PM, Mikera wrote:
> Hi Dragan,
>
> We have things working quite well (including stuff like cuDNN integration
> for convolution networks on the GPU). We also have all of the standard
> stuff (many different layer types, dropout, noise function, regularisation
> etc.). H
On Friday, 7 October 2016 08:25:31 UTC+8, kovasb wrote:
>
> On Thu, Oct 6, 2016 at 4:46 PM, Dragan Djuric wrote:
>
>
>> s more harm than good. I prefer to give users one Ford model T, than let
>> them choose between 20 different horse carriages. And, if they can even
>> choose the color, provided that their choice is black :)
Hi Dragan,
We have things working quite well (including stuff like cuDNN integration
for convolution networks on the GPU). We also have all of the standard
stuff (many different layer types, dropout, noise function, regularisation
etc.). However I think it still needs a bunch of work before we
On Thu, Oct 6, 2016 at 4:46 PM, Dragan Djuric wrote:
> s more harm than good. I prefer to give users one Ford model T, than let
> them choose between 20 different horse carriages. And, if they can even
> choose the color, provided that their choice is black :)
>
Thanks for the comments, whi
Just a small addition: I looked at BidMat's code, and even at the JNI/C
level they are doing some critical things that work on a small scale but
behave unexpectedly when the JVM needs to rearrange memory, and may also
trigger copying.
On Thursday, October 6, 2016 at 10:46:04 PM UTC+2, Dragan Djuric wrote:
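The kind of trouble described here can be illustrated with a small Java sketch of mine (not BidMat's actual code): a direct buffer lives outside the GC-managed heap, so native code can hold its address safely, while a heap-backed buffer wraps an ordinary array that the collector may relocate, which is why JNI access to it can silently fall back to copying.

```java
import java.nio.ByteBuffer;

public class OffHeapVsHeap {
    public static void main(String[] args) {
        // Off-heap: allocated outside the GC-managed heap, so the garbage
        // collector never moves it; native code may hold its address safely.
        ByteBuffer offHeap = ByteBuffer.allocateDirect(1 << 20);
        System.out.println(offHeap.isDirect()); // true

        // On-heap: backed by an ordinary byte[] that the GC may relocate.
        // JNI functions like GetDoubleArrayElements are allowed to hand
        // native code a *copy* of such an array - exactly the silent
        // copying (and occasional breakage) described above.
        ByteBuffer onHeap = ByteBuffer.wrap(new byte[1 << 20]);
        System.out.println(onHeap.isDirect()); // false
    }
}
```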
Hi Kovas,
> By the way, I'd love to see matrix/tensor benchmarks of Neanderthal and
> Vectorz vs ND4J, MXNet's NDArray, and BidMat.. :)
>
I don't have exact numbers, but will try to give you a few pointers to help
you if you decide to investigate this further:
0. Neanderthal's scope is matri
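For anyone who does run such benchmarks, a useful sanity baseline is a naive pure-Java triple-loop multiply (a sketch of mine, not a serious harness; JIT warm-up and memory layout matter a lot): the BLAS-backed libraries discussed here should beat it by a wide margin on large matrices.

```java
public class NaiveGemm {
    // C += A * B for n x n row-major matrices: the O(n^3) baseline that any
    // BLAS-backed library (MKL, cuBLAS, ...) should outperform decisively.
    static void gemm(double[] a, double[] b, double[] c, int n) {
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++) {
                double aik = a[i * n + k];
                for (int j = 0; j < n; j++)
                    c[i * n + j] += aik * b[k * n + j];
            }
    }

    public static void main(String[] args) {
        int n = 512;
        double[] a = new double[n * n], b = new double[n * n], c = new double[n * n];
        java.util.Arrays.fill(a, 1.0);
        java.util.Arrays.fill(b, 2.0);
        long t0 = System.nanoTime();
        gemm(a, b, c, n);
        // c[0] should equal n * 1.0 * 2.0; the elapsed time is the baseline
        // to compare against an optimized backend.
        System.out.printf("%d x %d multiply: %.1f ms, c[0]=%.1f%n",
                n, n, (System.nanoTime() - t0) / 1e6, c[0]);
    }
}
```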
+1 to Dragan's inquiry.
FWIW, was reviewing the state of affairs the other day:
- MXNet currently has the best JVM interop story, among DL frameworks that
have competitive perf.
- DL4J has improved a lot recently but still looks like it has a ways to go
in terms of perf.
Right now I'm more inter
Hey Mike,
A friend asked me if I know of any good (usable) deep learning libraries
for Clojure. I remembered you had some earlier neural networks library that
was at least OK for experimenting, but seems abandoned for your current
work in a similar domain. A bit of digging led me to this post.
On Tue, May 31, 2016 at 9:36 AM, atucker wrote:
> Given that the TensorFlow website invites people to build interfaces from
> other languages using SWIG, I guess they feel that access to the C++
> component is the major thing. So while I agree with Christian about
> reinventing the wheel, it may
On Tue, May 31, 2016 at 7:51 AM, Christian Weilbach <
whitesp...@polyc0l0r.net> wrote:
>
> Almost all of the development in deep learning is done in Python, so
> having to reproduce this work on a different runtime (and language)
> seems non-Clojure-like for me (compared to being hosted on the JVM
On Tue, May 31, 2016 at 1:17 AM, Mikera
wrote:
> I've been working with a number of collaborators on a deep learning
> library for Clojure.
>
> Some key features:
> - An abstract API for key machine learning functionality
> - Ability to declare graphs / stacks of operations (somewhat analogous to
Given that the TensorFlow website invites people to build interfaces from
other languages using SWIG, I guess they feel that access to the C++
component is the major thing. So while I agree with Christian about
reinventing the wheel, it may be that to interface at that level would
involve rein
> - Ability to declare graphs / stacks of operations (somewhat analogous to
> tensorflow)
>
I'd be interested to know more as I've been working with factor graphs in
Clojure with core.matrix, and it sounds related -- have you done anything
like message-passing on graphs ?
On 31.05.2016 07:17, Mikera wrote:
> I've been working with a number of collaborators on a deep
> learning library for Clojure.
>
> Some key features:
> - An abstract API for key machine learning functionality
> - Ability to declare graphs / stacks of operations
I've been working with a number of collaborators on a deep learning library
for Clojure.
Some key features:
- An abstract API for key machine learning functionality
- Ability to declare graphs / stacks of operations (somewhat analogous to
tensorflow)
- Support for multiple underlying implementations