All,

If J is to stay relevant in today's programming world, providing J
code for the most common machine learning and deep learning algorithms, such
as gradient descent, neural networks, word2vec, etc., would likely attract
some attention. Many of the basic ML algorithms are already published in J,
but they are scattered across various locations on the J website and
elsewhere. Collecting them together in one place would help show J's
relevance to the hot field of ML research.

Here's a list of some of the most basic ML algorithms:
Linear Regression
Linear Regression w/ Gradient Descent
Logistic Function
Logistic Regression
Linear Discriminant Analysis
Gini Coefficient
Classification and Regression Trees
Naive Bayes
Gaussian
Gaussian Naive Bayes
Nearest Neighbors
Vector Quantization
Support Vector Machines
Bagged Decision Trees
Adaptive Boosting
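
To give a flavor of how compactly these algorithms can be expressed in an
array language, here is a minimal sketch of the second item on the list
(linear regression fit by gradient descent) using Python/NumPy. The data,
learning rate, and iteration count are illustrative choices, not taken from
any of the books mentioned below:

```python
import numpy as np

# Hypothetical data: y = 3x + 2 plus a little noise (illustrative only)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, 50)

# Gradient descent on mean-squared error for slope w and intercept b
w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    err = (w * x + b) - y            # residuals for current fit
    w -= lr * 2.0 * np.mean(err * x) # partial derivative of MSE w.r.t. w
    b -= lr * 2.0 * np.mean(err)     # partial derivative of MSE w.r.t. b

print(w, b)  # should land close to 3.0 and 2.0
```

A J rendering of the same update loop would be shorter still, since the
elementwise products and means are primitive operations.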

Jason Brownlee, Ph.D. maintains a website focused on ML called "Machine
Learning Mastery": https://machinelearningmastery.com/
On his website, Brownlee sells several books that he has written on various
aspects of ML, Deep Learning, and Natural Language Processing.
In one book, billed as a gentle step-by-step introduction to 10 top machine
learning algorithms
<https://machinelearningmastery.com/master-machine-learning-algorithms/>
(for the ML beginner), he provides Excel spreadsheets for all the basic ML
algorithms I mentioned above.

Here are two more of Brownlee's books, where he shows how R and Python
(with NumPy) can be used for ML algorithms.

Machine Learning Mastery with R
<https://machinelearningmastery.com/machine-learning-with-r/> (for the ML
intermediate)

Deep Learning with Python
<https://machinelearningmastery.com/deep-learning-with-python/> (for the
Deep Learning aficionado)

IMO, J's implementations of these algorithms would be clearer and more
concise, with much less reliance on external routines. Making a J adjunct
workbook to Brownlee's books, though a huge task, would be a showcase for
why a true matrix language is the optimal way to describe these algorithms.

Also, attached below is an email I just received containing a topical
discussion of operations on sparse matrices using Python's NumPy add-on,
as well as an interesting article about math operations on different-sized
arrays, a technique called "broadcasting". Jason sends these emails out as
a weekly ML newsletter.

Skip Cave
Cave Consulting LLC

<<<>>>

---------- Forwarded message ----------
From: Jason @ ML Mastery <[email protected]>
Date: Thu, Mar 15, 2018 at 1:11 PM
Subject: Broadcasting, Sparsity and Deep Learning
To: [email protected]

Hi, this week we have two important tutorials and an overview of linear
algebra for deep learning.
Broadcasting is a handy shortcut to performing arithmetic operations on
arrays with differing sizes. Discover how broadcasting works in this
tutorial:
>> A Gentle Introduction to Broadcasting with NumPy Arrays
<http://t.dripemail2.com/c/eyJhY2NvdW50X2lkIjoiOTU1NjU4OCIsImRlbGl2ZXJ5X2lkIjoiMjI5MzQ1MDYyOSIsInVybCI6Imh0dHBzOi8vbWFjaGluZWxlYXJuaW5nbWFzdGVyeS5jb20vYnJvYWRjYXN0aW5nLXdpdGgtbnVtcHktYXJyYXlzLz9fX3M9dWIxYnBpaG9la3Fic3BmdnF6cnMifQ>
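
As a quick illustration of the broadcasting idea the tutorial covers (this
toy example is mine, not from the tutorial): NumPy stretches size-1 axes so
that arrays of different shapes can combine without explicit replication.

```python
import numpy as np

# A (3, 1) column combined with a length-4 row yields a (3, 4) result:
col = np.array([[0], [10], [20]])   # shape (3, 1)
row = np.array([1, 2, 3, 4])        # shape (4,), treated as (1, 4)
grid = col + row                    # both stretch to (3, 4) element-wise
print(grid.shape)   # (3, 4)
print(grid[2, 3])   # 20 + 4 = 24
```

J users will recognize this as close kin to rank-based verb application,
though NumPy's rules key off trailing axes of length 1.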

Sparse vectors and matrices are an important and under-discussed area of
applied machine learning. Discover sparsity and how to work with sparse
data in this tutorial:
>> A Gentle Introduction to Sparse Matrices for Machine Learning
<http://t.dripemail2.com/c/eyJhY2NvdW50X2lkIjoiOTU1NjU4OCIsImRlbGl2ZXJ5X2lkIjoiMjI5MzQ1MDYyOSIsInVybCI6Imh0dHBzOi8vbWFjaGluZWxlYXJuaW5nbWFzdGVyeS5jb20vc3BhcnNlLW1hdHJpY2VzLWZvci1tYWNoaW5lLWxlYXJuaW5nP19fcz11YjFicGlob2VrcWJzcGZ2cXpycyJ9>
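
For a concrete taste of the sparse-matrix topic (my own minimal sketch,
assuming SciPy is available alongside NumPy): a compressed sparse row (CSR)
matrix stores only the nonzero entries, yet still supports ordinary linear
algebra such as matrix-vector products.

```python
import numpy as np
from scipy.sparse import csr_matrix

# A mostly-zero matrix; CSR storage keeps only the 4 nonzero entries
dense = np.array([[1, 0, 0, 2],
                  [0, 0, 3, 0],
                  [0, 4, 0, 0]])
sparse = csr_matrix(dense)
print(sparse.nnz)        # 4 nonzeros stored instead of 12 cells

v = np.ones(4)
print(sparse.dot(v))     # row sums here: [3. 3. 4.]
```

J's sparse array facility ($. ) plays a similar role, with the usual verbs
applying to sparse operands directly.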

Linear algebra is a required tool for understanding precise descriptions of
deep learning methods. Discover the linear algebra topics required for deep
learning in this post:
>> Linear Algebra for Deep Learning
<http://t.dripemail2.com/c/eyJhY2NvdW50X2lkIjoiOTU1NjU4OCIsImRlbGl2ZXJ5X2lkIjoiMjI5MzQ1MDYyOSIsInVybCI6Imh0dHBzOi8vbWFjaGluZWxlYXJuaW5nbWFzdGVyeS5jb20vbGluZWFyLWFsZ2VicmEtZm9yLWRlZXAtbGVhcm5pbmc_X19zPXViMWJwaWhvZWtxYnNwZnZxenJzIn0>
I'll speak to you soon.

Jason

<<<>>>
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
