[ https://issues.apache.org/jira/browse/SYSTEMML-1570?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15991142#comment-15991142 ]
Niketan Pansare commented on SYSTEMML-1570:
-------------------------------------------

Before closing this PR, please ensure that the performance benefits for CNNs (e.g., LeNet) due to the fused sel+ operators (such as relu_maxpooling and relu_maxpooling_backward) are preserved.

> Remove fused sel+ operator
> --------------------------
>
>                 Key: SYSTEMML-1570
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-1570
>             Project: SystemML
>          Issue Type: Task
>            Reporter: Matthias Boehm
>             Fix For: SystemML 1.0
>
>
> The fused operator sel+ (select positive values) is applied for patterns like (X>0)*X and max(X,0) in order to eliminate unnecessary intermediates. It stems from a time when max was sparse-unsafe and hence inefficient over sparse data. However, we now mark scalar operators as conditionally sparse-safe depending on the given scalar constant c, which applies for max if c<=0. Hence, this sel+ operator is completely useless and should be removed.
> Furthermore, we should also generalize the rewrites to rewrite the selection of negative values, (X<0)*X, to min(X,0).

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
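As a side note, the algebraic equivalences behind these rewrites can be checked with a small NumPy sketch (illustrative only; the names `X` and `rng` are not from SystemML's code base). It also shows why max(X, c) with c<=0 is sparse-safe: zeros map to zeros, so sparse zero entries never need to be materialized.

```python
import numpy as np

# Illustration of the rewrites discussed above: selecting positive values via
# (X > 0) * X is elementwise equivalent to max(X, 0), and selecting negative
# values via (X < 0) * X is equivalent to min(X, 0).
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5))

sel_pos = (X > 0) * X          # pattern form: materializes the intermediate (X > 0)
max_form = np.maximum(X, 0)    # rewritten form: a single scalar operation
assert np.array_equal(sel_pos, max_form)

sel_neg = (X < 0) * X          # generalized pattern for negative selection
min_form = np.minimum(X, 0)
assert np.array_equal(sel_neg, min_form)

# Sparse safety of max with a non-positive constant c: max(0, c) == 0 for c <= 0,
# so zero entries of a sparse X stay zero under the operation.
assert np.maximum(0.0, 0.0) == 0.0
```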