Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-20 Thread Dragan Djuric
Please note that this is really a TOY nn project, actually a direct 
translation of gigasquid's nn hello-world. It is ridiculous to compare it 
with a library that delegates nn work to cuDNN. And it is a really old 
version of affe, which ddosic has improved in the meantime but has not yet 
released.

What could be fair, for example, is to compare affe with another toy 
library built with core.matrix, or to compare cuDNN bindings built with 
help from neanderthal's CUDA engine (not yet available) with Cortex, or to 
compare Cortex with something that delegates to one of the experimental 
OpenCL-based "real" nn libraries...

On Thursday, October 20, 2016 at 3:13:40 PM UTC+2, Boris V. Schmid wrote:
>
> Small addition to this post:
>
> There is a tiny library (toy project) of ddosic, who built a neural 
> network with neanderthal. It might be interesting as a benchmark of what 
> speed neanderthal can reach (although it might not currently be a good 
> reflection of neanderthal), versus larger packages with more overhead.
>
> https://github.com/ddosic/affe
>
> On Monday, May 30, 2016 at 8:34:41 PM UTC+2, kovasb wrote:
>>
>> Anyone seriously working on deep learning with Clojure?
>>
>> I'm working with Torch at the day job, and have done work integrating 
>> Tensorflow into Clojure, so I'm fairly familiar with the challenges of what 
>> needs to be done. A bit too much to bite off on my own in my spare time. 
>>
>> So is anyone out there familiar enough with these tools to have a 
>> sensible conversation of what could be done in Clojure?
>>
>> The main question on my mind is: what level of abstraction would be 
>> useful?
>>
>> All the existing tools have several layers of abstraction. In Tensorflow, 
>> at the bottom there's the DAG of operations, and above that a high-level 
>> library of python constructs to build the DAG (and now of course libraries 
>> going higher still). In Torch, it's more complicated: there's the excellent 
>> tensor library at the bottom; the NN modules that are widely used; and 
>> various non-orthogonal libraries and modules stacked on top of those. 
>>
>> One could try to integrate at the bottom layer, and then re-invent the 
>> layers above that in Clojure. Or one could try to integrate at the higher 
>> layers, which is more complicated, but gives more leverage from the 
>> existing ecosystem. 
>>
>> Any thoughts?



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-20 Thread Boris V. Schmid
Small addition to this post:

There is a tiny library (toy project) of ddosic, who built a neural network 
with neanderthal. It might be interesting as a benchmark of what speed 
neanderthal can reach (although it might not currently be a good reflection 
of neanderthal), versus larger packages with more overhead.

https://github.com/ddosic/affe

On Monday, May 30, 2016 at 8:34:41 PM UTC+2, kovasb wrote:
>
> Anyone seriously working on deep learning with Clojure?
>
> I'm working with Torch at the day job, and have done work integrating 
> Tensorflow into Clojure, so I'm fairly familiar with the challenges of what 
> needs to be done. A bit too much to bite off on my own in my spare time. 
>
> So is anyone out there familiar enough with these tools to have a sensible 
> conversation of what could be done in Clojure?
>
> The main question on my mind is: what level of abstraction would be useful?
>
> All the existing tools have several layers of abstraction. In Tensorflow, 
> at the bottom there's the DAG of operations, and above that a high-level 
> library of python constructs to build the DAG (and now of course libraries 
> going higher still). In Torch, it's more complicated: there's the excellent 
> tensor library at the bottom; the NN modules that are widely used; and 
> various non-orthogonal libraries and modules stacked on top of those. 
>
> One could try to integrate at the bottom layer, and then re-invent the 
> layers above that in Clojure. Or one could try to integrate at the higher 
> layers, which is more complicated, but gives more leverage from the 
> existing ecosystem. 
>
> Any thoughts?



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-12 Thread kovas boguta
This is amazing!! Thanks so much for releasing it. Very excited to dig in.


On Wed, Oct 12, 2016 at 7:44 PM,  wrote:

> Hi,
>   We've made cortex public:
>
>  https://github.com/thinktopic/cortex
>
> Fork away, and we hope that this contributes to a growing ML community in
> Clojure.  Thoughts, ideas, feedback are welcome!
>
> Cheers,
> Jeff
>
>
> On Saturday, October 8, 2016 at 6:00:21 PM UTC-6, je...@thinktopic.com
> wrote:
>>
>> Hey,
>>   I'm glad this came up.  We were initially thinking we'd wait until the
>> API and design had stabilized before releasing Cortex, but there is enough
>> of value now that I think Kovas is right.  We should release it.
>>
>> Cortex has been a collaboration between ThinkTopic and Mike Anderson, and
>> its design is somewhat similar to Torch.  (Neural network layers that
>> implement protocols for forward computation and backward propagation of
>> gradients, along with a set of optimizers.)  Although we are already using
>> Cortex models in production, it's definitely still a library in flux.  We
>> had arrived at a pretty good set of base abstractions in pure Clojure using
>> core.matrix, but then when we decided to add GPU support we realized we had
>> to refactor things a bit.  The cuDNN and cuBLAS libraries from Nvidia
>> provide a lot of useful functionality, but they also come with their own
>> abstractions for matrix computation, neural network layers, optimizers,
>> etc.  To take advantage of the GPU effectively we also need to be able to
>> sequence and interleave tasks using streams, which requires a variety of
>> design changes unless we want to maintain separate implementations of
>> almost everything for both the CPU and the GPU.  You can build and run
>> networks now, but we are still exploring how this should all come together.
>>
>> So, as long as you keep in mind that things will change, it would be
>> great to have you all join the conversation and help experiment with
>> abstractions and apis.
>>
>> Give us a couple days to do some housekeeping, and then we'll open
>> source it.
>>
>> -Jeff
>>
>>
>> On Thursday, October 6, 2016 at 8:08:41 PM UTC-6, kovasb wrote:
>>>
>>> On Thu, Oct 6, 2016 at 9:20 PM, Mikera  wrote:
>>>
 Hi Dragan,

 We have things working quite well (including stuff like cuDNN
 integration for convolution networks on the GPU). We also have all of the
 standard stuff (many different layer types, dropout, noise function,
 regularisation etc.). However I think it still needs a bunch of work before
 we stabilise on the core API.

>>>
>>>
 Things I'm particularly keen to have nailed down before we
 go public:

>>>
>>> FWIW it sounds like you've achieved a huge amount already.
>>>
>>> There are many people in the Clojure community who can come up with a
>>> DAG abstraction. There are very, very few who have the skill and time to
>>> assess and integrate the various native libs necessary to achieve the
>>> fundamental operations in a clojure-friendly way.
>>>
>>> If people build on the 'wrong' API, that's a good problem to have. The
>>> field is so in flux anyway. The problem can also be mitigated through
>>> minimalism in what is released in the beginning.
>>>
>>> In any case, looking forward to hopefully seeing this stuff one day.



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-12 Thread jeff
Hi,
  We've made cortex public:

 https://github.com/thinktopic/cortex

Fork away, and we hope that this contributes to a growing ML community in 
Clojure.  Thoughts, ideas, feedback are welcome!

Cheers,
Jeff

On Saturday, October 8, 2016 at 6:00:21 PM UTC-6, je...@thinktopic.com 
wrote:
>
> Hey,
>   I'm glad this came up.  We were initially thinking we'd wait until the 
> API and design had stabilized before releasing Cortex, but there is enough 
> of value now that I think Kovas is right.  We should release it.
>
> Cortex has been a collaboration between ThinkTopic and Mike Anderson, and 
> its design is somewhat similar to Torch.  (Neural network layers that 
> implement protocols for forward computation and backward propagation of 
> gradients, along with a set of optimizers.)  Although we are already using 
> Cortex models in production, it's definitely still a library in flux.  We 
> had arrived at a pretty good set of base abstractions in pure Clojure using 
> core.matrix, but then when we decided to add GPU support we realized we had 
> to refactor things a bit.  The cuDNN and cuBLAS libraries from Nvidia 
> provide a lot of useful functionality, but they also come with their own 
> abstractions for matrix computation, neural network layers, optimizers, 
> etc.  To take advantage of the GPU effectively we also need to be able to 
> sequence and interleave tasks using streams, which requires a variety of 
> design changes unless we want to maintain separate implementations of 
> almost everything for both the CPU and the GPU.  You can build and run 
> networks now, but we are still exploring how this should all come together.
>
> So, as long as you keep in mind that things will change, it would be great 
> to have you all join the conversation and help experiment with abstractions 
> and apis.  
>
> Give us a couple days to do some housekeeping, and then we'll open source 
> it.
>
> -Jeff
>
>
> On Thursday, October 6, 2016 at 8:08:41 PM UTC-6, kovasb wrote:
>>
>> On Thu, Oct 6, 2016 at 9:20 PM, Mikera  wrote:
>>
>>> Hi Dragan,
>>>
>>> We have things working quite well (including stuff like cuDNN 
>>> integration for convolution networks on the GPU). We also have all of the 
>>> standard stuff (many different layer types, dropout, noise function, 
>>> regularisation etc.). However I think it still needs a bunch of work before 
>>> we stabilise on the core API.
>>>
>>  
>>
>>> Things I'm particularly keen to have nailed down before we 
>>> go public:
>>>
>>
>> FWIW it sounds like you've achieved a huge amount already. 
>>
>> There are many people in the Clojure community who can come up with a DAG 
>> abstraction. There are very, very few who have the skill and time to assess 
>> and integrate the various native libs necessary to achieve the fundamental 
>> operations in a clojure-friendly way. 
>>
>> If people build on the 'wrong' API, that's a good problem to have. The 
>> field is so in flux anyway. The problem can also be mitigated through 
>> minimalism in what is released in the beginning. 
>>
>> In any case, looking forward to hopefully seeing this stuff one day. 



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-08 Thread jeff
Hey,
  I'm glad this came up.  We were initially thinking we'd wait until the 
API and design had stabilized before releasing Cortex, but there is enough 
of value now that I think Kovas is right.  We should release it.

Cortex has been a collaboration between ThinkTopic and Mike Anderson, and 
its design is somewhat similar to Torch.  (Neural network layers that 
implement protocols for forward computation and backward propagation of 
gradients, along with a set of optimizers.)  Although we are already using 
Cortex models in production, it's definitely still a library in flux.  We 
had arrived at a pretty good set of base abstractions in pure Clojure using 
core.matrix, but then when we decided to add GPU support we realized we had 
to refactor things a bit.  The cuDNN and cuBLAS libraries from Nvidia 
provide a lot of useful functionality, but they also come with their own 
abstractions for matrix computation, neural network layers, optimizers, 
etc.  To take advantage of the GPU effectively we also need to be able to 
sequence and interleave tasks using streams, which requires a variety of 
design changes unless we want to maintain separate implementations of 
almost everything for both the CPU and the GPU.  You can build and run 
networks now, but we are still exploring how this should all come together.
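
To give a rough picture of the protocol-based design described above, here 
is a minimal sketch (hypothetical names, not Cortex's actual API):

;; Hypothetical protocols; a network is then just a sequence of PLayer
;; implementations, with a POptimizer applied after each backward pass.
(defprotocol PLayer
  (forward [layer input]
    "Compute the layer's output for the given input.")
  (backward [layer input output-gradient]
    "Compute gradients w.r.t. the input and the layer's parameters."))

(defprotocol POptimizer
  (update-params [optimizer params gradients]
    "Return updated parameters, given the current gradients."))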

So, as long as you keep in mind that things will change, it would be great 
to have you all join the conversation and help experiment with abstractions 
and apis.  

Give us a couple days to do some housekeeping, and then we'll open source 
it.

-Jeff


On Thursday, October 6, 2016 at 8:08:41 PM UTC-6, kovasb wrote:
>
> On Thu, Oct 6, 2016 at 9:20 PM, Mikera  > wrote:
>
>> Hi Dragan,
>>
>> We have things working quite well (including stuff like cuDNN integration 
>> for convolution networks on the GPU). We also have all of the standard 
>> stuff (many different layer types, dropout, noise function, regularisation 
>> etc.). However I think it still needs a bunch of work before we stabilise 
>> on the core API.
>>
>  
>
>> Things I'm particularly keen to have nailed down before we 
>> go public:
>>
>
> FWIW it sounds like you've achieved a huge amount already. 
>
> There are many people in the Clojure community who can come up with a DAG 
> abstraction. There are very, very few who have the skill and time to assess 
> and integrate the various native libs necessary to achieve the fundamental 
> operations in a clojure-friendly way. 
>
> If people build on the 'wrong' API, that's a good problem to have. The 
> field is so in flux anyway. The problem can also be mitigated through 
> minimalism in what is released in the beginning. 
>
> In any case, looking forward to hopefully seeing this stuff one day. 



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-07 Thread Dragan Djuric

> If people build on the 'wrong' API, that's a good problem to have. The 
> field is so in flux anyway. The problem can also be mitigated through 
> minimalism in what is released in the beginning. 

This.



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-07 Thread Dragan Djuric
Hi Mike,

Thanks for the update.


> Opening the source is not entirely my decision, this is a collaboration 
> with the Thinktopic folks (Jeff Rose et al.). I'm personally in favour of 
> being pretty open about this stuff but I do think that it would be a 
> mistake if people build too much stuff on top of the wrong API which is 
> likely to happen if we release prematurely.
>
I believe the number of users who would try this is small anyway, and 
they will use it exclusively for evaluation and hobby projects at first, so, 
as Kovas said, having people build too much on it is something we can only 
hope for! :) I hope the Thinktopic folks would support opening it...
 

> Things I'm particularly keen to have nailed down before we go 
> public:
> 1. A tensorflow-like DAG model that allows arbitrary operations to be 
> composed into (possibly nested) graphs
> 2. Enough abstraction that different backends work (ClojureScript, 
> Pure-JVM, Native, GPU etc.). core.matrix provides most of this, but there 
> are still some deep-learning specific operations that need tuning and can 
> be quite backend-specific, e.g. cuDNN has some specific ways of dealing 
> with mini-batches which we need to get right. I'd love to try Neanderthal 
> with this too if we can get the core.matrix integration working.
> 3. Decent, stable APIs for the algorithms that you typically want to run 
> for mini-batch training, cross-validation etc.
> 4. Pluggable gradient optimisation methods (we currently have stuff like 
> ADADELTA, SGD, ADAM etc. but would like to make sure this is sufficiently 
> general to support any optimisation method)
>
> I'll have a talk with the Thinktopic folks and see if we can come up with 
> a timeline for a open source release. In the meantime, if anyone is 
> *really* interested then we may be able to arrange collaboration on a 
> private basis.
>

It's not urgent, so any rough estimate would be great! 



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-07 Thread Dragan Djuric
Hi Kovas,


> One question:
>
> Is it possible to feed Neanderthal's matrix representation (the underlying 
> bytes) into one of these other libraries, to obtain 
> computations Neanderthal doesn't support? 
>

There are two parts to that question, I think: 1) How can you make 
Neanderthal work with any other library that you need to interoperate with? 
2) Does Neanderthal have the operation that you need, and what to do if it 
doesn't?
I'll start with 2:

2) Currently, Neanderthal supports all BLAS 1, 2 and 3 operations for 
vectors and dense matrices on CPU & GPU. The ultimate goal is to support 
all other standard matrix formats (TR, sparse, etc.) AND LAPACK, which has 
extensive support for linear algebra. The good news is that what is there 
works really, really well, because I concentrated my efforts on solving the 
hardest problem first (and succeeded!). Now it is a matter of putting in the 
grunt work to repeat what's done for the existing things to cover more of 
the CBLAS and LAPACK APIs, and even to do the integration with CUDA in a 
similar way to how I did the OpenCL integration. I could have even done it 
by now, but I preferred to work on other things, one of those being a 
bayesian data analysis library, Bayadera, that puts what Neanderthal offers 
to great use :) I have also seen that the Yieldbot people forked Neanderthal 
and implemented some part of LAPACK, but did not release anything nor issue 
a PR. So, if the methods you need fall into the scope of matrices and linear 
algebra (BLAS + LAPACK), there is a good chance it will be supported, 
either by you or some other user providing it, or by bugging me often enough 
that I realize it is urgent that I add it :)
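
To make the BLAS levels concrete, a minimal sketch using Neanderthal's CPU 
(native) namespace - the API as of this writing, so names may shift:

(require '[uncomplicate.neanderthal.core :refer [dot axpy mv mm]]
         '[uncomplicate.neanderthal.native :refer [dv dge]])

(let [x (dv [1 2 3])              ; double-precision vector
      y (dv [4 5 6])
      a (dge 2 3 [1 2 3 4 5 6])   ; 2x3 dense matrix, column-major
      b (dge 3 2 [1 2 3 4 5 6])]  ; 3x2 dense matrix
  {:blas1-dot  (dot x y)          ; BLAS 1: vector-vector
   :blas1-axpy (axpy 2.0 x y)     ; BLAS 1: 2x + y
   :blas2-mv   (mv a x)           ; BLAS 2: matrix-vector
   :blas3-mm   (mm a b)})         ; BLAS 3: matrix-matrix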

1) There are at least two parts to the interoperation story - the API for 
operations (like mm vs mmult or whatever), and that is the very easy part. 
The hard part is the multitude of matrix formats and those formats' 
representations in memory. This is what makes or breaks your performance, 
and not by a few percent but by a few orders of magnitude. The sad part is 
that almost all focus is always on the easy part, completely ignoring the 
hard part or just thinking that it will magically solve itself. So, suppose 
that you have data laid out in memory in format A. That format may or 
may not be suitable for operation X, and if it is not, it is often a bad 
idea to shoehorn it in for convenience, instead of thinking harder about 
data flow and transitioning the data to format B to be used appropriately. 
That means that even inside the same library, you often need to do data 
transformations to best suit what you want to do with the data. Long story 
short, you need to do data transformations anyway, so having Neanderthal 
and ND4J support the core.matrix mmult operation won't help you a bit here. 
You'll have to transform data from the one to the other. If you are lucky, 
they use the same underlying format, so the transformation is easy or even 
automatic, or can be, but the point is that someone needs to create 
explicit transformations to ensure the optimal path instead of relying on a 
generic interoperability layer (at least for now).
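
As a small illustration of the explicit-transformation point (a sketch; 
transfer! is Neanderthal's copying operation between structures):

(require '[uncomplicate.neanderthal.core :refer [transfer!]]
         '[uncomplicate.neanderthal.native :refer [dge]])

;; Data arrives as a plain Clojure sequence (format A)...
(def raw-data [1.0 2.0 3.0 4.0 5.0 6.0])

;; ...and is explicitly copied into a dense column-major 2x3 matrix
;; (format B) before the heavy computation, rather than shoehorned in.
(def m (transfer! raw-data (dge 2 3)))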
 

> My situation: Neanderthal covers some of the algorithms I'd like to 
> implement. It looks easier to get started with and understand than the 
> alternatives. But down the line I'll likely want to do convolution, 
> sigmoid, the typical Neural Network layers. Note, I don't care about 
> 'tensors' exactly; the bookkeeping to simulate a tensor can be automated, 
> but the fundamental operations cannot. So there is a question of whether to 
> just absorb the learning curve and shortcomings of these more general 
> libraries rather than to risk switching horses midstream. 
>

I think it is important to note that the operations you mentioned are not 
in the scope of a matrix library, but in the scope of a neural networks 
library. Neanderthal is simply not a library that should have those 
operations, nor can it have all operations for all ML techniques (which are 
countless :)
 
On the other hand, what I created Neanderthal for is exactly to be a 
building block for such libraries. The focus is on: if you need to build a 
NN library, Neanderthal should (ideally) give you standard matrix methods 
for computations and data shuffling, Clojure should enable you to create a 
great interface layer, and (if needed) ClojureCL should help you write 
custom optimized low-level algorithms for the GPU and CPU. 

> I imagine I'm not alone in this... if there was a straightforward story for 
> how to interop Neanderthal when necessary with some more general library 
> that would be powerful. Unfortunately I'm not sufficiently expert to 
> evaluate which of the choices would be most pragmatic and how to pull it 
> off. 
>

Today's state (in Clojure, Java, and probably elsewhere) is, IMO: if you 
need a ready-made solution for NNs/DL, you have to pick one library that 
has the stuff that you need, and go with what they recommend 

Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-07 Thread Joachim De Beule
For those interested: here's some work on using deeplearning4j in Clojure 
(no abstractions added, simply a port of the nd4j and deeplearning4j APIs 
to 'clojuresque' functions and multimethods.)

https://github.com/engagor/dl4clj

If anybody wants to contribute they're welcome of course!
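
For context, this is roughly what the underlying ND4J API looks like from 
raw Clojure/Java interop (a sketch, not dl4clj's own wrappers, assuming 
ND4J is on the classpath):

(import '[org.nd4j.linalg.factory Nd4j])

;; Create two 2x2 matrices and multiply them via plain interop; a
;; 'clojuresque' port turns method calls like these into ordinary functions.
(let [a (Nd4j/create (double-array [1 2 3 4]) (int-array [2 2]))
      b (Nd4j/create (double-array [5 6 7 8]) (int-array [2 2]))]
  (.mmul a b))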

On Friday, October 7, 2016 at 06:22:48 UTC+2, Sunil Nandihalli wrote:
>
> I think deeplearning4j is a contender for deep learning in Clojure. I have 
> not used it, but I repeatedly see the sponsored link on the Clojure 
> subreddit. Since nobody mentioned it, I thought of mentioning it.
>
> On Fri, Oct 7, 2016 at 7:40 AM, kovas boguta  > wrote:
>
>> On Thu, Oct 6, 2016 at 9:26 PM, Mikera > > wrote:
>>
>>>
>>> I'm hoping to work with Dragan to get core.matrix integration working 
>>> with Neanderthal, now that Windows support is finally arriving. This would 
>>> get you a few big advantages:
>>>
>>
>> Yes, I can see how my problem relates to the core.matrix vision. The only 
>> missing piece is the actual operations I want (nn layers) :)


Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread Sunil S Nandihalli
I think deeplearning4j is a contender for deep learning in Clojure. I have
not used it, but I repeatedly see the sponsored link on the Clojure
subreddit. Since nobody mentioned it, I thought of mentioning it.

On Fri, Oct 7, 2016 at 7:40 AM, kovas boguta  wrote:

> On Thu, Oct 6, 2016 at 9:26 PM, Mikera 
> wrote:
>
>>
>> I'm hoping to work with Dragan to get core.matrix integration working
>> with Neanderthal, now that Windows support is finally arriving. This would
>> get you a few big advantages:
>>
>
> Yes, I can see how my problem relates to the core.matrix vision. The only
> missing piece is the actual operations I want (nn layers) :)


Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread kovas boguta
On Thu, Oct 6, 2016 at 9:26 PM, Mikera  wrote:

>
> I'm hoping to work with Dragan to get core.matrix integration working with
> Neanderthal, now that Windows support is finally arriving. This would get
> you a few big advantages:
>

Yes, I can see how my problem relates to the core.matrix vision. The only
missing piece is the actual operations I want (nn layers) :)



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread kovas boguta
On Thu, Oct 6, 2016 at 9:20 PM, Mikera  wrote:

> Hi Dragan,
>
> We have things working quite well (including stuff like cuDNN integration
> for convolution networks on the GPU). We also have all of the standard
> stuff (many different layer types, dropout, noise function, regularisation
> etc.). However I think it still needs a bunch of work before we stabilise
> on the core API.
>


> Things I'm particularly keen to have nailed down before we go
> public:
>

FWIW it sounds like you've achieved a huge amount already.

There are many people in the Clojure community who can come up with a DAG
abstraction. There are very, very few who have the skill and time to assess
and integrate the various native libs necessary to achieve the fundamental
operations in a clojure-friendly way.

If people build on the 'wrong' API, that's a good problem to have. The field
is so in flux anyway. The problem can also be mitigated through minimalism
in what is released in the beginning.

In any case, looking forward to hopefully seeing this stuff one day.



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread Mikera
On Friday, 7 October 2016 08:25:31 UTC+8, kovasb wrote:
>
> On Thu, Oct 6, 2016 at 4:46 PM, Dragan Djuric  > wrote:
>  
>
>> s more harm than good. I prefer to give users one Ford Model T rather than 
>> let them choose between 20 different horse carriages. And they can even 
>> choose the color, provided that their choice is black :)
>>
>
> Thanks for the comments, which I largely agree with. 
>
> I understand your design philosophy and agree it's a useful point in the 
> space to have. 
>
> One question:
>
> Is it possible to feed Neanderthal's matrix representation (the underlying 
> bytes) into one of these other libraries, to obtain 
> computations Neanderthal doesn't support? 
>
> My situation: Neanderthal covers some of the algorithms I'd like to 
> implement. It looks easier to get started with and understand than the 
> alternatives. But down the line I'll likely want to do convolution, 
> sigmoid, the typical Neural Network layers. Note, I don't care about 
> 'tensors' exactly; the bookkeeping to simulate a tensor can be automated, 
> but the fundamental operations cannot. So there is a question of whether to 
> just absorb the learning curve and shortcomings of these more general 
> libraries rather than to risk switching horses midstream. 
>
> I imagine I'm not alone in this... if there was a straightforward story for 
> how to interop Neanderthal when necessary with some more general library 
> that would be powerful. Unfortunately I'm not sufficiently expert to 
> evaluate which of the choices would be most pragmatic and how to pull it 
> off. 
>

I'm hoping to work with Dragan to get core.matrix integration working with 
Neanderthal, now that Windows support is finally arriving. This would get 
you a few big advantages:
1. Neanderthal could be used as a drop-in replacement for other core.matrix 
implementations if it fits your use case
2. You would get extra core.matrix features (additional operations, higher 
dimensional array operations, type conversion, broadcasting etc. 
essentially for free)
3. We minimise the risk of fragmentation in the Clojure numerical ecosystem 
:-)
4. We could use Neanderthal as an optional backend for libraries like 
Incanter or our new NN libraries without any major re-engineering

I'm certainly willing to contribute a fully compliant core.matrix 
implementation for Neanderthal (will take me a couple of days as long as 
the Windows build works smoothly). This might not optimise all of the 
possible operations, but should get high performance in most of the common 
use cases.
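
As a sketch of what "drop-in replacement" means in core.matrix terms (the 
:neanderthal implementation keyword below is hypothetical, since the 
integration doesn't exist yet):

(require '[clojure.core.matrix :as m])

;; Code written against the core.matrix API...
(defn xtx [x] (m/mmul (m/transpose x) x))

;; ...runs unchanged on any compliant implementation:
(m/set-current-implementation :vectorz)        ; existing pure-JVM backend
(xtx (m/matrix [[1 2] [3 4] [5 6]]))

;; (m/set-current-implementation :neanderthal) ; hypothetical future backend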

Hope you will all support this effort!
 



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread Mikera
Hi Dragan,

We have things working quite well (including stuff like cuDNN integration 
for convolution networks on the GPU). We also have all of the standard 
stuff (many different layer types, dropout, noise function, regularisation 
etc.). However I think it still needs a bunch of work before we stabilise 
on the core API.

Opening the source is not entirely my decision; this is a collaboration 
with the Thinktopic folks (Jeff Rose et al.). I'm personally in favour of 
being pretty open about this stuff, but I do think that it would be a 
mistake if people build too much stuff on top of the wrong API, which is 
likely to happen if we release prematurely.

Things I'm particularly keen to have nailed down before we go 
public:
1. A tensorflow-like DAG model that allows arbitrary operations to be 
composed into (possibly nested) graphs
2. Enough abstraction that different backends work (ClojureScript, 
Pure-JVM, Native, GPU etc.). core.matrix provides most of this, but there 
are still some deep-learning specific operations that need tuning and can 
be quite backend-specific, e.g. cuDNN has some specific ways of dealing 
with mini-batches which we need to get right. I'd love to try Neanderthal 
with this too if we can get the core.matrix integration working.
3. Decent, stable APIs for the algorithms that you typically want to run 
for mini-batch training, cross-validation etc.
4. Pluggable gradient optimisation methods (we currently have stuff like 
ADADELTA, SGD, ADAM etc. but would like to make sure this is sufficiently 
general to support any optimisation method - see the sketch below)
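
A minimal sketch of what "pluggable" could look like (hypothetical names, 
not our actual API; core.matrix is used for the arithmetic):

(require '[clojure.core.matrix :as m])

;; Any gradient method implements one function from (params, gradient)
;; to updated params.
(defprotocol PGradientOptimiser
  (step [this params gradient] "Return updated parameters."))

(defrecord SGD [learning-rate]
  PGradientOptimiser
  (step [this params gradient]
    (m/sub params (m/scale gradient learning-rate))))

;; usage: one gradient-descent step on a parameter vector
(step (->SGD 0.01) (m/array [1.0 2.0]) (m/array [0.5 -0.5]))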

I'll have a talk with the Thinktopic folks and see if we can come up with a 
timeline for an open source release. In the meantime, if anyone is *really* 
interested then we may be able to arrange collaboration on a private basis.




On Thursday, 6 October 2016 20:44:11 UTC+8, Dragan Djuric wrote:
>
> Hey Mike,
>
> A friend asked me if I know of any good (usable) deep learning libraries 
> for Clojure. I remembered you had some earlier neural networks library that 
> was at least OK for experimenting, but seems abandoned in favour of your 
> current work in a similar domain. A bit of digging led me to this post.
>
> I understand that this library may not be completely ready yet, but I 
> wondered whether you were now able to give a better estimate of where it 
> stands in comparison with other DL offerings, like what the deeplearning4j 
> guys are doing, or even with the established non-Java libraries such as 
> Theano, Torch, Caffe, and TensorFlow. What is the chance of you releasing 
> it even if it is not 100% ready? 
>
> I get the reluctance to commit to a certain API, but I don't think 
> everyone will rush to commit their code to the API you release anyway, and 
> the open development will certainly help both the (potential) users and 
> your team (by returning free testing & feedback).
>
>
> On Tuesday, May 31, 2016 at 7:17:35 AM UTC+2, Mikera wrote:
>>
>> I've been working with a number of collaborators on a deep learning 
>> library for Clojure. 
>>
>> Some key features:
>> - An abstract API for key machine learning functionality
>> - Ability to declare graphs / stacks of operations (somewhat analogous to 
>> tensorflow)
>> - Support for multiple underlying implementations (ClojureScript, JVM, 
>> CPU, GPU)
>> - Integration with core.matrix for N-dimensional data processing
>>
>> We intend to release as open source. We haven't released yet because we 
>> want to get the API right first but it is looking very promising.
>>
>> On Tuesday, 31 May 2016 02:34:41 UTC+8, kovasb wrote:
>>>
>>> Anyone seriously working on deep learning with Clojure?
>>>
>>> I'm working with Torch at the day job, and have done work integrating 
>>> Tensorflow into Clojure, so I'm fairly familiar with the challenges of what 
>>> needs to be done. A bit too much to bite off on my own in my spare time. 
>>>
>>> So is anyone out there familiar enough with these tools to have a 
>>> sensible conversation of what could be done in Clojure?
>>>
>>> The main question on my mind is: what level of abstraction would be 
>>> useful?
>>>
>>> All the existing tools have several layers of abstraction. In 
>>> Tensorflow, at the bottom there's the DAG of operations, and above that a 
>>> high-level library of python constructs to build the DAG (and now of course 
>>> libraries going higher still). In Torch, it's more complicated: there's the 
>>> excellent tensor library at the bottom; the NN modules that are widely 
>>> used; and various non-orthogonal libraries and modules stacked on top of 
>>> those. 
>>>
>>> One could try to integrate at the bottom layer, and then re-invent the 
>>> layers above that in Clojure. Or one could try to integrate at the higher 
>>> layers, which is more complicated, but gives more leverage from the 
>>> existing ecosystem. 
>>>
>>> Any thoughts?


Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread kovas boguta
On Thu, Oct 6, 2016 at 4:46 PM, Dragan Djuric  wrote:


> s more harm than good. I prefer to give users one Ford Model T rather than
> let them choose between 20 different horse carriages. And they can even
> choose the color, provided that their choice is black :)
>

Thanks for the comments, which I largely agree with.

I understand your design philosophy and agree it's a useful point in the
space to have.

One question:

Is it possible to feed Neanderthal's matrix representation (the underlying
bytes) into one of these other libraries, to obtain
computations Neanderthal doesn't support?

My situation: Neanderthal covers some of the algorithms I'd like to
implement. It looks easier to get started with and understand than the
alternatives. But down the line I'll likely want to do convolution,
sigmoid, the typical Neural Network layers. Note, I don't care about
'tensors' exactly; the bookkeeping to simulate a tensor can be automated,
but the fundamental operations cannot. So there is a question of whether to
just absorb the learning curve and shortcomings of these more general
libraries rather than to risk switching horses midstream.

I imagine I'm not alone in this... if there was a straightforward story for
how to interop Neanderthal when necessary with some more general library
that would be powerful. Unfortunately I'm not sufficiently expert to
evaluate which of the choices would be most pragmatic and how to pull it
off.
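
For the record, the layer operations themselves are small to state on top
of a generic matrix API - a sketch using core.matrix (making these fast on
a GPU backend is the hard part, which is the real point here):

(require '[clojure.core.matrix :as m])

(defn sigmoid [z]
  (m/emap #(/ 1.0 (+ 1.0 (Math/exp (- %)))) z))

;; forward pass of a fully-connected layer: activation(Wx + b)
(defn fc-forward [weights bias input]
  (sigmoid (m/add (m/mmul weights input) bias)))

(fc-forward (m/matrix [[0.1 0.2] [0.3 0.4]])
            (m/array [0.0 0.0])
            (m/array [1.0 2.0]))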



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread Dragan Djuric
Just a small addition: I looked at BIDMat's code, and even at the JNI/C 
level they are doing some critical things that work on a small scale but 
bite unexpectedly when the JVM needs to rearrange memory, and may also 
trigger copying.

On Thursday, October 6, 2016 at 10:46:04 PM UTC+2, Dragan Djuric wrote:
>
> Hi Kovas,
>
>
>> By the way, I'd love to see matrix/tensor benchmarks of Neanderthal and 
>> Vectorz vs ND4J, MXNet's NDArray, and BIDMat...  :)
>>
>
> I don't have exact numbers, but will try to give you a few pointers to 
> help you if you decide to investigate this further:
>
> 0. Neanderthal's scope is matrices and linear algebra. NNs and other stuff 
> is something that could be built on top of it (assuming that the features 
> needed are implemented, which may or may not be true yet), but certainly 
> not in Neanderthal.
>
> 1. Neanderthal is a 100% Clojure solution. One of the main goals, other 
> than the speed, of course, is that it is simple and straightforward, with 
> no overhead. That means that you always know what backend you are using, 
> and you get exactly the speed of that backend. If it works, you are sure it 
> works at the full speed, with no slow fallback. Theoretically, of course, 
> there is always some overhead of FFI, but in Neanderthal it is so minuscule 
> that you can ignore it for all uses that come to my mind. So, basically, 
> Neanderthal is as fast as ATLAS on CPU and CLBlast on GPU (both offer 
> state-of-the-art speed) or any (not yet existing) pure java engine that I 
> might plug in in the future if necessary.
>
> 2. All those other libraries, besides not targeting Clojure at all except 
> for general "you can call Java from Clojure", are trying to be everything 
> for everyone. That has its strengths, because you are, generally, able to 
> accommodate more use cases. On the other hand, it complicates things too 
> much, and can lead to overblown beasts. For example, it might seem good to 
> support MKL, ATLAS, OpenBLAS, netlib BLAS, and some imaginary 
> fallback solution, like ND4J does (or tries to do), but what's the point of 
> it when today they have more or less the same performance (MKL being a bit 
> faster - in percentages only - but requires $$), and supporting all that 
> stuff makes the code AND the installation much, much more complex. BLAS is 
> so mature that I think it is better to choose one solution and offer it out 
> of the box. Technically, neanderthal can support all other native BLAS 
> libraries too, but I intentionally restricted that option because I think 
> fiddling with it does more harm than good. I prefer to give users one Ford 
> Model T rather than let them choose between 20 different horse carriages. 
> And they can even choose the color, provided that their choice is black :)
>
> 3. ND4J is, in my opinion, a typical overblown solution. deeplearning4j 
> guys got the investment dollars, and have to rush to the market with 
> business-friendly solution, which usually favors having a checklist of 
> features regardless of whether those features make sense for the little 
> guy. I hope they succeed in the business sense, but the code I'm seeing 
> from them does not seem promising to me regarding Java getting a great 
> DL/NN library.
>
> 4. NDArray is actually, as the name suggests, an nd-array library, and not 
> a matrix library. Why is this important?  Vectors and matrices are 
> something that has been very well researched through decades. The ins and 
> outs of algorithm/architecture fit are known and implemented in BLAS 
> libraries, so you are sure that you get the full performance. N-dimensional 
> arrays (sometimes referred to as tensors, although that name is not 
> accurate IMO), not so much. So, it is easy for an operation that looks 
> convenient to not lead to good performance. I do not say it is bad, because 
> if you need that operation, it is better to have something than nothing, 
> but for now I decided to not support 3+ dimensions. This is something that 
> might belong to Neanderthal or on top of it. A long term goal, to be sure. 
> Another aspect of that story is knowledge: most books that I read from the 
> fields of ML/AI give all formulas as vectors and matrices. Basically, 
> matrices are at least 95% (if not more) of what potential users need or 
> even understand!
>
> 5. BIDMat seems to have a much larger scope. For example, on their 
> benchmark page, I see benchmarks for machine learning algorithms, but 
> nothing matrix-y.
>
> The speed comparison boils down to this: both Neanderthal and those 
> libraries use (or can be linked to use) the same native BLAS libraries. I 
> took great care to make sure Neanderthal does not incur any copying or 
> calling overhead. From what I saw glancing at the code of other libraries, 
> they didn't. They might support that if you set up everything well among 
> lots of options, or they don't if you do not know how to ensure this. So I 
> doubt any of those could be noticeably faster, and they can be much slower 
> if you slip somewhere.

Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread Dragan Djuric
Hi Kovas,


> By the way, I'd love to see matrix/tensor benchmarks of Neanderthal and 
> Vectorz vs ND4J, MXNet's NDArray, and BIDMat...  :)
>

I don't have exact numbers, but will try to give you a few pointers to help 
you if you decide to investigate this further:

0. Neanderthal's scope is matrices and linear algebra. NNs and other stuff 
is something that could be built on top of it (assuming that the features 
needed are implemented, which may or may not be true yet), but certainly 
not in Neanderthal.

1. Neanderthal is a 100% Clojure solution. One of the main goals, other 
than the speed, of course, is that it is simple and straightforward, with 
no overhead. That means that you always know what backend you are using, 
and you get exactly the speed of that backend. If it works, you are sure it 
works at the full speed, with no slow fallback. Theoretically, of course, 
there is always some overhead of FFI, but in Neanderthal it is so minuscule 
that you can ignore it for all uses that come to my mind. So, basically, 
Neanderthal is as fast as ATLAS on CPU and CLBlast on GPU (both offer 
state-of-the-art speed) or any (not yet existing) pure java engine that I 
might plug in in the future if necessary.

2. All those other libraries, besides not targeting Clojure at all except 
for general "you can call Java from Clojure", are trying to be everything 
for everyone. That has its strengths, because you are, generally, able to 
accommodate more use cases. On the other hand, it complicates things too 
much, and can lead to overblown beasts. For example, it might seem good to 
support MKL, ATLAS, OpenBLAS, netlib BLAS, and some imaginary 
fallback solution, like ND4J does (or tries to do), but what's the point of 
it when today they have more or less the same performance (MKL being a bit 
faster - in percentages only - but requires $$), and supporting all that 
stuff makes the code AND the installation much, much more complex. BLAS is 
so mature that I think it is better to choose one solution and offer it out 
of the box. Technically, neanderthal can support all other native BLAS 
libraries too, but I intentionally restricted that option because I think 
fiddling with it does more harm than good. I prefer to give users one Ford 
Model T rather than let them choose between 20 different horse carriages. 
And they can even choose the color, provided that their choice is black :)

3. ND4J is, in my opinion, a typical overblown solution. deeplearning4j 
guys got the investment dollars, and have to rush to the market with 
business-friendly solution, which usually favors having a checklist of 
features regardless of whether those features make sense for the little 
guy. I hope they succeed in the business sense, but the code I'm seeing 
from them does not seem promising to me regarding Java getting a great 
DL/NN library.

4. NDArray is actually, as the name suggests, an nd-array library, and not a 
matrix library. Why is this important?  Vectors and matrices are something 
that has been very well researched through decades. The ins and outs of 
algorithm/architecture fit are known and implemented in BLAS libraries, so 
you are sure that you get the full performance. N-dimensional arrays 
(sometimes referred to as tensors, although that name is not accurate IMO), 
not so much. So, it is easy for an operation that looks convenient to not 
lead to good performance. I do not say it is bad, because if you need 
that operation, it is better to have something than nothing, but for now I 
decided to not support 3+ dimensions. This is something that might belong 
to Neanderthal or on top of it. A long term goal, to be sure. Another 
aspect of that story is knowledge: most books that I read from the fields 
of ML/AI give all formulas as vectors and matrices. Basically, matrices are 
at least 95% (if not more) of what potential users need or even understand!

5. BIDMat seems to have a much larger scope. For example, on their benchmark 
page, I see benchmarks for machine learning algorithms, but nothing 
matrix-y.

The speed comparison boils down to this: both Neanderthal and those 
libraries use (or can be linked to use) the same native BLAS libraries. I 
took great care to make sure Neanderthal does not incur any copying or 
calling overhead. From what I saw glancing at the code of other libraries, 
they didn't. They might support that if you set up everything well among 
lots of options, or they don't if you do not know how to ensure this. So I 
doubt any of those could be noticeably faster, and they can be much slower 
if you slip somewhere.
I would also love to see straightforward numbers, but I was unable to find 
anything like that for those libraries. BIDMat, for example, gives 
benchmarks of K-Means on the MNIST dataset - I do not know how this can be 
used to discern how fast it is with matrices, other than that it is, 
generally, fast at K-Means.


Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread kovas boguta
+1 to Dragan's inquiry.

FWIW, was reviewing the state of affairs the other day:

- MXNet currently has the best JVM interop story, among DL frameworks that
have competitive perf.
- DL4J has improved a lot recently but still looks like it has a ways to go
in terms of perf.

Right now I'm more interested in word2vec type things, which don't require
a deep net, so I might give Neanderthal a shot.

By the way, I'd love to see matrix/tensor benchmarks of Neanderthal and
Vectorz vs ND4J, MXNet's NDArray, and BIDMat...  :)



On Thu, Oct 6, 2016 at 8:44 AM, Dragan Djuric  wrote:

> Hey Mike,
>
> A friend asked me if I know of any good (usable) deep learning libraries
> for Clojure. I remembered you had some earlier neural networks library that
> was at least OK for experimenting, but seems abandoned in favour of your
> current work in a similar domain. A bit of digging led me to this post.
>
> I understand that this library may not be completely ready yet, but I
> wondered whether you were now able to give a better estimate of where it
> stands in comparison with other DL offerings, like what the deeplearning4j
> guys are doing, or even with the established non-Java libraries such as
> Theano, Torch, Caffe, and TensorFlow. What is the chance of you releasing
> it even if it is not 100% ready?
>
> I get the reluctance to commit to a certain API, but I don't think
> everyone will rush to commit their code to the API you release anyway, and
> the open development will certainly help both the (potential) users and
> your team (by returning free testing & feedback).
>
>
> On Tuesday, May 31, 2016 at 7:17:35 AM UTC+2, Mikera wrote:
>>
>> I've been working with a number of collaborators on a deep learning
>> library for Clojure.
>>
>> Some key features:
>> - An abstract API for key machine learning functionality
>> - Ability to declare graphs / stacks of operations (somewhat analogous to
>> tensorflow)
>> - Support for multiple underlying implementations (ClojureScript, JVM,
>> CPU, GPU)
>> - Integration with core.matrix for N-dimensional data processing
>>
>> We intend to release as open source. We haven't released yet because we
>> want to get the API right first but it is looking very promising.
>>
>> On Tuesday, 31 May 2016 02:34:41 UTC+8, kovasb wrote:
>>>
>>> Anyone seriously working on deep learning with Clojure?
>>>
>>> I'm working with Torch at the day job, and have done work integrating
>>> Tensorflow into Clojure, so I'm fairly familiar with the challenges of what
>>> needs to be done. A bit too much to bite off on my own in my spare time.
>>>
>>> So is anyone out there familiar enough with these tools to have a
>>> sensible conversation of what could be done in Clojure?
>>>
>>> The main question on my mind is: what level of abstraction would be
>>> useful?
>>>
>>> All the existing tools have several layers of abstraction. In
>>> Tensorflow, at the bottom theres the DAG of operations, and above that a
>>> high-level library of python constructs to build the DAG (and now of course
>>> libraries going higher still). In Torch, its more complicated: there's the
>>> excellent tensor library at the bottom; the NN modules that are widely
>>> used; and various non-orthogonal libraries and modules stack on top of
>>> those.
>>>
>>> One could try to integrate at the bottom layer, and then re-invent the
>>> layers above that in Clojure. Or one could try to integrate at the higher
>>> layers, which is more complicated, but gives more leverage from the
>>> existing ecosystem.
>>>
>>> Any thoughts?
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> --
> You received this message because you are subscribed to the Google
> Groups "Clojure" group.
> To post to this group, send email to clojure@googlegroups.com
> Note that posts from new members are moderated - please be patient with
> your first post.
> To unsubscribe from this group, send email to
> clojure+unsubscr...@googlegroups.com
> For more options, visit this group at
> http://groups.google.com/group/clojure?hl=en
> ---
> You received this message because you are subscribed to the Google Groups
> "Clojure" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to clojure+unsubscr...@googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.
>

-- 
You received this message because you are subscribed to the Google
Groups "Clojure" group.
To post to this group, send email to clojure@googlegroups.com
Note that posts from new members are moderated - please be patient with your 
first post.
To unsubscribe from this group, send email to
clojure+unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/clojure?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"Clojure" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to clojure+unsubscr...@googlegroups.com.
For more 

Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-10-06 Thread Dragan Djuric
Hey Mike,

A friend asked me if I know of any good (usable) deep learning libraries 
for Clojure. I remembered you had an earlier neural networks library that 
was at least OK for experimenting, but it seems abandoned for your current 
work in a similar domain. A bit of digging led me to this post.

I understand that this library may not be completely ready yet, but I 
wondered whether you are now able to give a better estimate of where it 
stands in comparison with other DL offerings, like what the deeplearning4j 
guys are doing, or even with the established non-Java libraries such as 
Theano, Torch, Caffe, and TensorFlow. What is the chance of you releasing 
it even if it is not 100% ready?

I get the reluctance to commit to a certain API, but I don't think everyone 
will rush to commit their code to the API you release anyway, and open 
development will certainly help both the (potential) users and your team 
(by returning free testing & feedback).


On Tuesday, May 31, 2016 at 7:17:35 AM UTC+2, Mikera wrote:
>
> I've been working with a number of collaborators on a deep learning 
> library for Clojure. 
>
> Some key features:
> - An abstract API for key machine learning functionality
> - Ability to declare graphs / stacks of operations (somewhat analogous to 
> tensorflow)
> - Support for multiple underlying implementations (ClojureScript, JVM, 
> CPU, GPU)
> - Integration with core.matrix for N-dimensional data processing
>
> We intend to release as open source. We haven't released yet because we 
> want to get the API right first but it is looking very promising.


Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-31 Thread kovas boguta
On Tue, May 31, 2016 at 9:36 AM, atucker  wrote:

> Given that the TensorFlow website invites people to build interfaces from
> other languages using SWIG, I guess they feel that access to the C++
> component is the major thing.  So while I agree with Christian about
> reinventing the wheel, it may be that to interface at that level would
> involve reinventing only a relatively few spokes.
>

I looked into this, and unfortunately Tensorflow is not as language-agnostic
as it might seem.

If you just want predictions from a trained model, you can connect with
SWIG.

However, for training you are effectively required to use Python, because
it is the Python layer that installs a bunch of backprop definitions into
the graph. It should be possible to replicate that step (and then call into
it from Clojure).

The difficulty of doing so is hard for me to assess, so I'm hoping someone
can chime in on this.
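
For the prediction-only case, here is a sketch of what calling a frozen 
graph from Clojure can look like. This assumes the org.tensorflow Java API 
(which only became official after this thread) and hypothetical node names 
"input" and "output":

(import '[org.tensorflow Graph Session Tensor]
        '[java.nio.file Files Paths])

;; Load a frozen GraphDef and run one forward pass; no Python in the loop.
(let [graph-def (Files/readAllBytes (Paths/get "model.pb" (into-array String [])))]
  (with-open [g (Graph.)]
    (.importGraphDef g graph-def)
    (with-open [session (Session. g)
                input   (Tensor/create (float-array [1.0 2.0 3.0]))]
      (let [output (first (-> (.runner session)
                              (.feed "input" input)   ;; hypothetical node name
                              (.fetch "output")       ;; hypothetical node name
                              (.run)))]
        ;; copy the (assumed) 3-element result out; a real program should
        ;; also close the returned Tensor
        (vec (.copyTo output (float-array 3)))))))

Training is where the Python-registered gradients come in; none of this 
sketch helps with that part.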



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-31 Thread kovas boguta
On Tue, May 31, 2016 at 7:51 AM, Christian Weilbach <
whitesp...@polyc0l0r.net> wrote:

>
> Almost all of the development in deep learning is done in Python, so
> having to reproduce this work on a different runtime (and language)
> seems non-Clojure-like for me (compared to being hosted on the JVM and
> leveraging this ecosystem). deeplearning4j is already attempting it
> from the Java side with the argument of distributed scaling, but
> again, there is a ton of work done for the Python toolkits including
> massive scaling (e.g. with tensorflow) and most research is there, so
> my question would be what the actual goals for a Clojure stack would be?


Good question. Basic answer is that we are in the early stages of
discovering the right abstractions for these systems; Clojure provides a
strong platform for exploring the basic issues. The biggies for me are:

1. Model composition. Most of these systems have some flavor of
compilation, often with a heavy dose of mutability in the construction of
the graph. I'd argue it's easier and simpler to explore the design space
starting from immutable data representations (a sketch follows after point
2). Given the speed of evolution in model architectures, this seems pretty
important.

2. Introspection & interactivity. With ClojureScript (and even just
Clojure) we have excellent options for creating interactive UIs on top of
the models. Yes, there's IPython, but that's not really equivalent to the
power provided by ClojureScript.
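
To make point 1 concrete, a minimal sketch (all names invented) of a model
as immutable data:

;; A network described as plain Clojure data.
(def mlp
  [{:op :dense :in 784 :out 128}
   {:op :relu}
   {:op :dense :in 128 :out 10}
   {:op :softmax}])

;; Architecture transformations are pure functions over values;
;; there is no mutable graph-builder state to thread around.
(defn insert-dropout [model p]
  (vec (mapcat (fn [layer]
                 (if (= :relu (:op layer))
                   [layer {:op :dropout :p p}]
                   [layer]))
               model)))

(insert-dropout mlp 0.5)
;; => the same model with dropout after every relu; mlp itself is untouched.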

Now, I'm all in favor of not reinventing the wheel; that's why I'm wondering
what the best available foundation would be. JyNI is quite interesting and
I hope it takes off. But just having a wrapper on a high-level Python lib
probably isn't worth the hassle, unless you get some fundamentally new
leverage.

Poking around, I actually discovered something closer to what I want:
https://github.com/dmlc/mxnet

It's pretty much the only DL platform intended to be consumed as a library,
from any language. And there are already sensible JVM bindings to it.



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-31 Thread kovas boguta
On Tue, May 31, 2016 at 1:17 AM, Mikera 
wrote:

> I've been working with a number of collaborators on a deep learning
> library for Clojure.
>
> Some key features:
> - An abstract API for key machine learning functionality
> - Ability to declare graphs / stacks of operations (somewhat analogous to
> tensorflow)
> - Support for multiple underlying implementations (ClojureScript, JVM,
> CPU, GPU)
> - Integration with core.matrix for N-dimensional data processing
>
> We intend to release as open source. We haven't released yet because we
> want to get the API right first but it is looking very promising.
>

Looking forward to seeing this.



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-31 Thread atucker
Given that the TensorFlow website invites people to build interfaces from 
other languages using SWIG, I guess they feel that access to the C++ 
component is the major thing.  So while I agree with Christian about 
reinventing the wheel, it may be that interfacing at that level would 
involve reinventing only a relatively small number of spokes.
  I did an MSc in neural networks some years ago (before it was 
fashionable) and could do with brushing up on the latest.  I might be able 
to find some time for a straightforward task such as translating 
TensorFlow's Python into Clojure.
  A



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-31 Thread Bobby Bobble


> - Ability to declare graphs / stacks of operations (somewhat analogous to 
> tensorflow)
>

I'd be interested to know more, as I've been working with factor graphs in 
Clojure with core.matrix and it sounds related -- have you done anything 
like message-passing on graphs?
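
For readers wondering what that means concretely, here is one flavor of it:
a sum-product message update on a discrete pairwise factor, sketched with
core.matrix (not anyone's actual library code):

(require '[clojure.core.matrix :as m])

;; Message from a pairwise factor f(x,y) to variable x:
;;   m_{f->x}(x) = sum_y f(x,y) * m_{y->f}(y), then normalize.
(defn factor->var-msg [factor incoming]
  (let [raw (m/mmul factor incoming)]
    (m/div raw (m/esum raw))))

(factor->var-msg (m/matrix [[0.9 0.1]
                            [0.2 0.8]])
                 (m/matrix [0.5 0.5]))
;; => a normalized message over the two states of x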



Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-31 Thread Christian Weilbach

On 31.05.2016 07:17, Mikera wrote:
> I've been working with a number of collaborators on a deep
> learning library for Clojure.
>
> Some key features:
> - An abstract API for key machine learning functionality
> - Ability to declare graphs / stacks of operations (somewhat analogous to tensorflow)
> - Support for multiple underlying implementations (ClojureScript, JVM, CPU, GPU)
> - Integration with core.matrix for N-dimensional data processing
>
> We intend to release as open source. We haven't released yet because we
> want to get the API right first but it is looking very promising.

Almost all of the development in deep learning is done in Python, so
having to reproduce this work on a different runtime (and language)
seems non-Clojure-like to me (compared to being hosted on the JVM and
leveraging this ecosystem). deeplearning4j is already attempting it
from the Java side with the argument of distributed scaling, but
again, there is a ton of work done for the Python toolkits, including
massive scaling (e.g. with tensorflow), and most research is there, so
my question would be: what would the actual goals for a Clojure stack be?

Machine learning systems can usually be implemented very well as a
"microservice", because they have no state (assuming the model is
trained and constant) and just need to return an output given a sample
(they act like a pure function). This has worked out fine for me in
Clojure so far. There are many other important machine learning
concepts and papers beyond deep learning which can be used the same
way. The network IO can be problematic in some cases ofc., e.g. small
models + a low-latency requirement, but I think the leverage is much
higher and will increase unless the ML community switches away from
Python to the JVM (highly unlikely atm.). I would personally rather
focus on ML papers and improving the algorithm on paper and in Python
than reimplement a ton of work in Clojure just to reach parity in my
favourite language.
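
The stateless-service point is easy to make concrete. A minimal sketch,
assuming Ring's jetty adapter and org.clojure/data.json, with a toy
logistic model standing in for a real trained one:

(require '[ring.adapter.jetty :refer [run-jetty]]
         '[clojure.data.json :as json])

;; A "trained model" reduced to a pure function of constant weights.
(def weights [0.3 -1.2 0.7])

(defn predict [xs]
  (let [z (reduce + (map * weights xs))]
    (/ 1.0 (+ 1.0 (Math/exp (- z))))))  ;; logistic output in (0, 1)

;; The whole service is one stateless handler around that pure function.
(defn handler [request]
  (let [xs (json/read-str (slurp (:body request)))]
    {:status  200
     :headers {"Content-Type" "application/json"}
     :body    (json/write-str {:prediction (predict xs)})}))

;; (run-jetty handler {:port 3000})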

Another point is performance. In many areas it doesn't matter too
much whether you are 5 or 10% faster as long as you can scale. In
machine learning it is often critical, though, as those 5-10% are
already hours, and scaling can be hard model-wise. I tried to get
performance on par with Theano out of Clatrix (both on CPU) a year ago,
and Theano was 2-4x faster on my machine. Even if a Clojure stack
addressed this, a lot of work would be necessary to constantly optimize
the stack for different GPU architectures and compute graphs. Theano
has atm. 40 contributors or so, not to speak of TensorFlow being
backed by Google & DeepMind now.

I think a much better approach would be to bring native Python
bindings to the JVM. There is http://www.jyni.org/ and they seem to be
getting closer to numpy support, but there are only two people on it
atm. (I think). While this is a lower-level effort and has little to do
with Clojure, it would allow all Python libraries to be used natively
from Clojure (together with core.matrix backends for numpy, etc.).
Jython is considered about as fast as CPython, so one could expect
Python-like performance with a small wrapper library, because most of
the performance-critical code is already in the native libraries. From
there, a Clojure-native core.matrix-based stack could emancipate itself
gradually, much like the development of the Clojure ecosystem itself.

What points would speak against an approach like this?


Christian




Re: Clojure with Tensorflow, Torch etc (call for participation, brainstorming etc)

2016-05-30 Thread Mikera
I've been working with a number of collaborators on a deep learning library 
for Clojure. 

Some key features:
- An abstract API for key machine learning functionality
- Ability to declare graphs / stacks of operations (somewhat analogous to 
tensorflow)
- Support for multiple underlying implementations (ClojureScript, JVM, CPU, 
GPU)
- Integration with core.matrix for N-dimensional data processing

We intend to release as open source. We haven't released yet because we 
want to get the API right first but it is looking very promising.
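
Since the library isn't public yet, the following is not its actual API;
purely as a sketch (all names invented, using core.matrix) of the shape a
backend-agnostic layer protocol might take. A GPU or ClojureScript backend
would supply its own records satisfying the same protocol:

(require '[clojure.core.matrix :as m])

(defprotocol PLayer
  (forward [layer input] "Compute this layer's output for input."))

(defrecord Dense [w b]
  PLayer
  (forward [_ input] (m/add (m/mmul w input) b)))

(defrecord Relu []
  PLayer
  (forward [_ input] (m/emap #(max 0.0 %) input)))

;; Running a stack of layers is a pure fold over immutable data.
(defn run-net [layers input]
  (reduce (fn [x layer] (forward layer x)) input layers))

(run-net [(->Dense (m/matrix [[1 0] [0 1]]) (m/matrix [0.1 0.2]))
          (->Relu)]
         (m/matrix [1.0 -2.0]))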
