[
https://issues.apache.org/jira/browse/SYSTEMML-1965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16207814#comment-16207814
]
Mike Dusenberry commented on SYSTEMML-1965:
-------------------------------------------
[~niketanpansare] I would strongly prefer to keep the sizes in the {{forward}}
functions. The {{nn}} library is designed so that the {{init}} functions are
just convenience initialization functions that the user can choose to use or
not. If we move the sizes into those functions, we force the user to always use
the {{init}} functions and to remember that those sizes are associated with the
output of the {{forward}} functions. I think it would be better to improve the
constant-propagation portion of the engine instead.
> Refactor nn layers to move the computation in forward/backward function known at compile time to init function
> --------------------------------------------------------------------------------------------------------------
>
> Key: SYSTEMML-1965
> URL: https://issues.apache.org/jira/browse/SYSTEMML-1965
> Project: SystemML
> Issue Type: Bug
> Reporter: Niketan Pansare
>
> Ideally, we should move computation that is known at compile time into the
> init function rather than keeping it in the forward function. This reduces
> recompilation time and also avoids potentially unnecessary instructions. Here
> is an example snippet from our conv2d layer:
> {code}
> Hout = as.integer(floor((Hin + 2*padh - Hf)/strideh + 1))
> Wout = as.integer(floor((Win + 2*padw - Wf)/stridew + 1))
> {code}
> [~prithvi_r_s] [~dusenberrymw] [~reinwald] do you have any comments or
> concerns?
> [~dusenberrymw] do you have free cycles to take this over?
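For reference, the arithmetic in that snippet is the standard convolution output-size formula, which depends only on compile-time-known layer parameters. A minimal Python sketch of the same computation (the function name {{conv2d_out_dim}} is illustrative, not part of the {{nn}} library):

```python
import math

def conv2d_out_dim(in_dim, filter_dim, pad, stride):
    # Mirrors the DML snippet: floor((in + 2*pad - filter)/stride + 1)
    return int(math.floor((in_dim + 2 * pad - filter_dim) / stride + 1))

# Example: 28x28 input, 5x5 filter, pad 2, stride 1 -> output stays 28x28
Hout = conv2d_out_dim(28, 5, 2, 1)  # 28
Wout = conv2d_out_dim(28, 5, 2, 1)  # 28
```

Since every input here is a scalar constant for a fixed network architecture, the result is a candidate for either hoisting into {{init}} or for constant propagation in the engine, as discussed above.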
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)