[ 
https://issues.apache.org/jira/browse/SPARK-14043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15206841#comment-15206841
 ] 

Eugene Morozov commented on SPARK-14043:
----------------------------------------

I looked at the Spark code regarding this issue, and I have a couple of ideas 
for how it could be fixed:
- introduce an Array64 (an int[][]-backed structure that allows arrays longer 
than Integer.MAX_VALUE elements) or a List; the downside is that it would 
require a lot of memory just to store those indices,
- represent the decision tree as a tree structure without nodeIds at all.
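To make the first idea concrete, here is a minimal sketch of what such an Array64 could look like: a long-indexed array of ints chunked into an int[][]. All names here are illustrative, not existing Spark API.

```java
// Sketch of an "Array64": a long-indexed int array backed by int[][] chunks.
// Names and chunk size are illustrative assumptions, not Spark code.
public class Array64 {
    private static final int CHUNK_BITS = 30;              // 2^30 ints per chunk
    private static final int CHUNK_SIZE = 1 << CHUNK_BITS;
    private static final long CHUNK_MASK = CHUNK_SIZE - 1;

    private final int[][] chunks;
    private final long length;

    public Array64(long length) {
        this.length = length;
        int numChunks = (int) ((length + CHUNK_SIZE - 1) >>> CHUNK_BITS);
        chunks = new int[numChunks][];
        for (int i = 0; i < numChunks; i++) {
            // Last chunk may be shorter than CHUNK_SIZE.
            long remaining = length - ((long) i << CHUNK_BITS);
            chunks[i] = new int[(int) Math.min(CHUNK_SIZE, remaining)];
        }
    }

    public int get(long index) {
        return chunks[(int) (index >>> CHUNK_BITS)][(int) (index & CHUNK_MASK)];
    }

    public void set(long index, int value) {
        chunks[(int) (index >>> CHUNK_BITS)][(int) (index & CHUNK_MASK)] = value;
    }

    public long length() {
        return length;
    }
}
```

The drawback mentioned above still applies: for deep trees, the index space grows exponentially with depth, so storing indices at all becomes the real cost, which is what motivates the second idea.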

> Remove restriction on maxDepth for decision trees
> -------------------------------------------------
>
>                 Key: SPARK-14043
>                 URL: https://issues.apache.org/jira/browse/SPARK-14043
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML
>            Reporter: Joseph K. Bradley
>            Priority: Minor
>
> We currently restrict decision trees (DecisionTree, GBT, RandomForest) to be 
> of maxDepth <= 30.  We should remove this restriction to support deep 
> (imbalanced) trees.
> Trees store an index for each node, where each index corresponds to a unique 
> position in a binary tree.  (I.e., the first index of row 0 is 1, the first 
> of row 1 is 2, the first of row 2 is 4, etc., IIRC)
> With some careful thought, we could probably avoid using indices altogether.
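The layout described above is the standard 1-indexed binary-heap numbering, where the first node of row r has index 2^r. A quick illustration (assumed names, not Spark code) shows why this caps maxDepth at 30 when node indices are stored as 32-bit ints: row 31 is the first row whose starting index no longer fits.

```java
// Shows why maxDepth <= 30: in the 1-indexed layout, the first node of
// row r has index 2^r, and 2^31 overflows a signed 32-bit int.
public class NodeIndexOverflow {
    // First index of row r: row 0 -> 1, row 1 -> 2, row 2 -> 4, ...
    static long firstIndexOfRow(int row) {
        return 1L << row;
    }

    public static void main(String[] args) {
        System.out.println(firstIndexOfRow(30));        // 1073741824, fits in int
        System.out.println(firstIndexOfRow(31));        // 2147483648, exceeds Integer.MAX_VALUE
        System.out.println((int) firstIndexOfRow(31));  // overflows to Integer.MIN_VALUE
    }
}
```

A depth-30 tree uses rows 0 through 30, whose indices reach at most 2^31 - 1 and still fit; one more level of depth would need row 31 and overflow.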



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
