Check out the examples 
<https://github.com/dmlc/MXNet.jl/tree/master/examples> and the documentation 
<http://mxnetjl.readthedocs.org/en/latest/?badge=latest>.

Relation to Mocha.jl: currently I will maintain both packages, but I will 
treat MXNet.jl as the successor to Mocha.jl once the project becomes more 
mature. dmlc/mxnet is a collaboration among authors of multiple different 
deep learning libraries, and we provide support for Python, R, and Julia 
(and maybe more). MXNet.jl introduces an external dependency, but the 
default CPU-only dependency is very easy to build automatically, basically 
acting as the CPU backend of Mocha.jl. For GPUs, the built-in multi-GPU 
support of MXNet.jl is definitely attractive compared to Mocha.jl.

v0.0.3 (2015.10.27)
   
   - Model prediction API.
   - Model checkpoint loading and saving.
   - IJulia Notebook example of using a pre-trained ImageNet model as a 
   classifier.
   - Symbol saving and loading.
   - NDArray saving and loading.
   - Optimizer gradient clipping.
   - Model training callback APIs, default checkpoint and speedometer 
   callbacks.
   - Julia Array / NDArray data iterator.
   - Sphinx documentation system and documents for dynamically imported 
   libmxnet APIs.
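As a minimal sketch of the new NDArray saving and loading listed above (file name and exact return shape are assumptions based on the current API, not guaranteed):

```julia
using MXNet

# Create a 2x3 NDArray filled with ones (CPU context by default).
a = mx.ones(2, 3)

# Save it to disk; "weights.nd" is an illustrative file name.
mx.save("weights.nd", a)

# Load it back. Depending on how the data was saved, mx.load may
# return a vector or a dict of NDArrays.
loaded = mx.load("weights.nd", mx.NDArray)
```

Symbols support analogous `mx.save` / `mx.load` round-trips, which is what makes the model checkpointing above possible.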


Enjoy,
pluskid
