I assume fixing d. and D. is not a priority for new J releases.
Regards,
Jon
On Tue, 2/28/17, Henry Rich wrote:
Subject: Re: [Jprogramming] Fast derivative of Softmax function
To: programm...@jsoftware.com
Date: Tuesday, February 28, 2017, 8:47 AM
The Bugs page is festooned with anomalies in
Subject: Re: [Jprogramming] Fast derivative of Softmax function
To: programm...@jsoftware.com
Date: Tuesday, February 28, 2017, 1:05 AM
On Tue, 2/28/17, Raul Miller wrote:
Subject: Re: [Jprogramming] Fast derivative of Softmax function
To: "Programming forum"
Date: Tuesday, February 28, 2017, 12:24 AM
I was about to point out
the same thing.
a =: 0.5 0.6 0.23 0.66
sm=:(] % +/ )@:^ NB. softmax
Yes, thanks. This seems to be what I need!
On Mon, 2/27/17, Louis de Forcrand wrote:
Subject: Re: [Jprogramming] Fast derivative of Softmax function
To: programm...@jsoftware.com
Date: Monday, February 27, 2017, 10:05 PM
You probably know
That looks right (and is about 10x faster than my attempts).
--
Raul
On Mon, Feb 27, 2017 at 11:05 AM, 'Mike Day' via Programming wrote:
> It's not clear what you had in mind for the dyadic function "softmax" .
>
> Anyway, is this what you require?
>
> I=: =/~@:i.@#NB. Identity matrix (Kronecker delta)
It's not clear what you had in mind for the dyadic function "softmax" .
Anyway, is this what you require?
I=: =/~@:i.@#NB. Identity matrix (Kronecker delta)
pdsm =: (*"1 (-~I))@:sm NB. Partial derivative (matrix) of sm
pdsm a
  0.186192 _0.0676431 _0.0467234 _0.0718259
_0.0676431    0.19866 _0.0516374 _0.0793799
_0.0467234 _0.0516374   0.153191 _0.0548305
_0.0718259 _0.0793799 _0.0548305   0.206036
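Mike's pdsm is the softmax Jacobian, diag(s) - s*s^T. As an editorial cross-check (not part of the original thread, and in NumPy rather than J), the same computation on the thread's example vector:

```python
import numpy as np

# The example vector from the thread.
a = np.array([0.5, 0.6, 0.23, 0.66])

# Softmax: exp normalized by its sum (the thread's sm =: (] % +/)@:^).
s = np.exp(a) / np.exp(a).sum()

# Jacobian of softmax: entry (i,j) is s[i]*(delta_ij - s[j]),
# i.e. diag(s) - outer(s, s); this is what pdsm computes.
jac = np.diag(s) - np.outer(s, s)

print(s)       # ~ 0.247399 0.273418 0.188859 0.290325
print(jac[0])  # ~ 0.186192 -0.0676431 -0.0467234 -0.0718259
```

Note the Jacobian is symmetric and each row sums to zero, two quick sanity checks on any softmax-derivative code.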
>> dsoftmax=: 4 : 0
>> rx=. ''
>> for_i. x do.
>>   smx=. i softmax y
>>   for_j. i.#vals do.
>>     if. j_index = i_index do. rx=. rx , smx * (1 - smx)
>>     else. rx=. rx , (j softmax y) * (0 - smx) end.
>>   end.
>> end.
>> rx
>> )
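For readers not fluent in J's control words, the quoted dsoftmax loop can be sketched in Python. `vals` and the dyadic `softmax` are undefined in the snippet, so the loop bound (taken as the input length) and the component-selecting dyadic softmax are assumptions:

```python
import numpy as np

def softmax(x):
    e = np.exp(x)
    return e / e.sum()

def dsoftmax_loop(x):
    """Looped softmax Jacobian in the style of the quoted dsoftmax:
    diagonal entries s_i*(1 - s_i), off-diagonal entries s_j*(0 - s_i)."""
    s = softmax(x)   # assumes `i softmax y` selects component s_i
    n = len(x)       # assumes `vals` meant the input array
    rx = []
    for i in range(n):
        for j in range(n):
            if j == i:
                rx.append(s[i] * (1 - s[i]))
            else:
                rx.append(s[j] * (0 - s[i]))
    return np.array(rx).reshape(n, n)

x = np.array([0.5, 0.6, 0.23, 0.66])
print(dsoftmax_loop(x))  # agrees with the vectorized diag(s) - outer(s, s)
```

The explicit double loop does N*N scalar appends; avoiding that interpreted loop is presumably why the vectorized pdsm form is so much faster.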
- Original Message -
From: 'Jon Hough' via Programming
To: Programming Forum
Sent: Monday, February 27, 2017 3:09 AM
Subject: [Jprogramming] Fast derivative of Softmax function
Given an array, we can calculate the softmax function
https://en.wikipedia.org/wiki/Softmax_function
a =: 0.5 0.6 0.23 0.66
sm=:(] % +/ )@:^ NB. softmax
sm a
0.247399 0.273418 0.188859 0.290325
The (partial) derivative of softmax is a little more complicated:
If the array is of length N, we n
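The message is truncated here; it is heading toward the N x N matrix of partials, ds_i/da_j = s_i * (delta_ij - s_j). As an editorial sketch (not from the thread; function names are my own), that closed form can be verified against a finite-difference approximation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x)
    return e / e.sum()

def softmax_jacobian(x):
    # Closed form: J[i, j] = s[i] * (delta_ij - s[j]) = diag(s) - outer(s, s).
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)

def numeric_jacobian(f, x, h=1e-6):
    # Central differences, one column per perturbed input component.
    n = len(x)
    cols = []
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        cols.append((f(x + e) - f(x - e)) / (2 * h))
    return np.stack(cols, axis=1)

a = np.array([0.5, 0.6, 0.23, 0.66])
print(np.abs(softmax_jacobian(a) - numeric_jacobian(softmax, a)).max())  # tiny
```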