Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread Henry Rich
le. I assume fixing d. and D. are not priorities for new J releases.
Regards,
Jon

On Tue, 2/28/17, Henry Rich wrote:
 Subject: Re: [Jprogramming] Fast derivative of Softmax function
 To: programm...@jsoftware.com
 Date: Tuesday, February 28, 2017,

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Jon Hough' via Programming
for new J releases.
Regards,
Jon

On Tue, 2/28/17, Henry Rich wrote:
 Subject: Re: [Jprogramming] Fast derivative of Softmax function
 To: programm...@jsoftware.com
 Date: Tuesday, February 28, 2017, 8:47 AM

The Bugs page is festooned with anomalies in

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Jon Hough' via Programming
rote:
 Subject: Re: [Jprogramming] Fast derivative of Softmax function
 To: programm...@jsoftware.com
 Date: Tuesday, February 28, 2017, 1:05 AM

It's not clear what you had in mind for the dyadic function "softmax". Anyway, is this what you require?
   I =: =/~@:i

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread Henry Rich
rx=. rx , smx * (1 - smx)
>> else. rx=. rx , (j softmax y) * (0 - smx) end.
>> end. end.
>> rx
>> )
>>
>> - Original Message -
>> From: 'Jon Hough' via Programming
>> To: Programming Forum
>> S

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Jon Hough' via Programming
. On Tue, 2/28/17, Raul Miller wrote:
 Subject: Re: [Jprogramming] Fast derivative of Softmax function
 To: "Programming forum"
 Date: Tuesday, February 28, 2017, 12:24 AM

I was about to point out the same thing.
   a =: 0.5 0.6 0.23 0.66
   sm =: (] % +/)@:^ N

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Jon Hough' via Programming
Yes, thanks. This seems to be what I need!

On Mon, 2/27/17, Louis de Forcrand wrote:
 Subject: Re: [Jprogramming] Fast derivative of Softmax function
 To: programm...@jsoftware.com
 Date: Monday, February 27, 2017, 10:05 PM

You probably know

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread Raul Miller
That looks right (and is about 10x faster than my attempts). -- Raul

On Mon, Feb 27, 2017 at 11:05 AM, 'Mike Day' via Programming wrote:
> It's not clear what you had in mind for the dyadic function "softmax".
>
> Anyway, is this what you require?
>
>    I =: =/~@:i.@#   NB. Identity matrix

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Mike Day' via Programming
It's not clear what you had in mind for the dyadic function "softmax". Anyway, is this what you require?
   I =: =/~@:i.@#   NB. Identity matrix (Kronecker delta)
   pdsm =: (*"1 (-~I))@:sm   NB. Partial derivative (matrix) of sm
   pdsm a
 0.186192 _0.0676431 _0.0467234 _0.0718259
_0.06764
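Mike's `pdsm` computes the standard softmax Jacobian, J[i,j] = s[i]*(delta_ij - s[j]), i.e. diag(s) - outer(s, s). A minimal NumPy cross-check of that identity against finite differences (the function names here are mine, not from the thread; the input `a` is the one posted):

```python
import numpy as np

def softmax(x):
    """Plain softmax, matching the J verb sm =: (] % +/)@:^ (no max-shift)."""
    e = np.exp(x)
    return e / e.sum()

def softmax_jacobian(x):
    """Jacobian of softmax: J[i,j] = s[i]*(delta_ij - s[j]) = diag(s) - outer(s, s)."""
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)

a = np.array([0.5, 0.6, 0.23, 0.66])
J = softmax_jacobian(a)   # J[0,0] is approx 0.186192, as in pdsm a above

# Cross-check column j against central finite differences ds/dx_j.
eps = 1e-6
num = np.stack([(softmax(a + eps * d) - softmax(a - eps * d)) / (2 * eps)
                for d in np.eye(4)], axis=1)
assert np.allclose(J, num, atol=1e-6)
```

Note that each column of the Jacobian sums to zero, since softmax outputs always sum to 1.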

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread Raul Miller
ntested.
>>
>> dsoftmax=: 4 : 0
>> rx=. ''
>> for_i. x do.
>> smx=. i softmax y
>> for_j. i.#vals do.
>> if. j_index. = i_index. do. rx=. rx , smx * (1 - smx)
>> else. rx=. rx , (j softmax y) * (0 - smx) end.
>> end. end.
>> rx
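The quoted explicit double loop builds the same matrix entry by entry: s_i*(1 - s_i) on the diagonal, -s_i*s_j off it. A Python rendering of that loop, under the assumption (suggested by the J code) that `i softmax y` selects the i-th component of the softmax; `dsoftmax_loop` is my name, not the thread's:

```python
import numpy as np

def softmax(x):
    """Plain softmax, as in the thread: exp(x) normalized to sum to 1."""
    e = np.exp(x)
    return e / e.sum()

def dsoftmax_loop(x):
    """Entry-by-entry loop mirroring the quoted J dsoftmax:
    diagonal s_i*(1 - s_i), off-diagonal -s_j*s_i."""
    s = softmax(x)
    n = len(s)
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            out[i, j] = s[i] * (1 - s[i]) if i == j else -s[j] * s[i]
    return out
```

The loop agrees with the vectorized form diag(s) - outer(s, s), which is why the array-oriented `pdsm` in this thread is so much faster.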

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread Louis de Forcrand
=. i softmax y
> for_j. i.#vals do.
> if. j_index. = i_index. do. rx=. rx , smx * (1 - smx)
> else. rx=. rx , (j softmax y) * (0 - smx) end.
> end. end.
> rx
> )
>
> - Original Message -
> From: 'Jon Hough' via Programming
> To: Programming For

Re: [Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Pascal Jasmin' via Programming
) end. end. end. rx )

- Original Message -
From: 'Jon Hough' via Programming
To: Programming Forum
Sent: Monday, February 27, 2017 3:09 AM
Subject: [Jprogramming] Fast derivative of Softmax function

Given an array, we can calculate the softmax function
https://en.wikipedia.org

[Jprogramming] Fast derivative of Softmax function

2017-02-27 Thread 'Jon Hough' via Programming
Given an array, we can calculate the softmax function
https://en.wikipedia.org/wiki/Softmax_function
   a =: 0.5 0.6 0.23 0.66
   sm =: (] % +/)@:^   NB. softmax
   sm a
0.247399 0.273418 0.188859 0.290325
The (partial) derivative of softmax is a little more complicated: If the array is of length N, we n
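The J one-liner sm can be cross-checked with a short NumPy sketch (NumPy assumed; the input is the array from the post):

```python
import numpy as np

def softmax(x):
    """Softmax: exp(x) normalized to sum to 1 (J: (] % +/)@:^).
    Note: no max-subtraction here, to match the J verb exactly."""
    e = np.exp(x)
    return e / e.sum()

a = np.array([0.5, 0.6, 0.23, 0.66])
print(np.round(softmax(a), 6))  # [0.247399 0.273418 0.188859 0.290325]
```

The outputs sum to 1, which is what makes the derivative in the replies a full N-by-N Jacobian rather than an elementwise gradient.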