Say n is the number of examples and m is the number of features. A naive implementation of a balanced binary decision tree is then O(m * n^2 * log n). I think scikit-learn's decision trees cache the sorted features, which reduces this to O(m * n log n). Then, multiply that O(m * n log n) by the number of decision trees in the forest.
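As a rough back-of-the-envelope sketch (my own illustration, not anything from the scikit-learn source), the operation-count estimate above could be written like this; the function name `forest_fit_cost` is just a placeholder:

```python
import math

def forest_fit_cost(n, m, n_estimators=100):
    """Rough operation-count estimate for fitting a random forest:
    O(n_estimators * m * n log n), assuming the sorted features are
    cached so each tree costs O(m * n log n) rather than O(m * n^2 log n)."""
    return n_estimators * m * n * math.log2(n)

# Doubling n a bit more than doubles the estimated cost, because of
# the log-linear factor:
ratio = forest_fit_cost(20000, 10) / forest_fit_cost(10000, 10)
print(round(ratio, 2))  # slightly above 2.0
```

Of course this ignores constants and the fact that individual trees in a forest are usually grown on bootstrap samples and feature subsets, so treat it as a scaling argument, not a runtime prediction.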
Best,
Sebastian

> On Dec 20, 2018, at 1:09 AM, lampahome <pahome.c...@mirlab.org> wrote:
>
> I do some benchmarking in my experiments, and I almost always use
> ensemble-based regressors.
>
> What is the time complexity if I use a random forest regressor? Assume I
> only set estimators=100 and leave the other parameters at their defaults.
>
> thx
> _______________________________________________
> scikit-learn mailing list
> scikit-learn@python.org
> https://mail.python.org/mailman/listinfo/scikit-learn