Dear mlpack community,

My name is Adarsh Santoria, and I am a sophomore at IIT Mandi, India. I am
writing to submit my proposal for the GSoC project on improving ensemble
trees with an XGBoost implementation. You can access the document, which
outlines my project plan and timeline in detail, through this link:
https://docs.google.com/document/d/1mQx5e7thE42zIlEPO2U5aUkk4sZfvDZxBWtTYgytrNY/edit?usp=sharing

XGBoost is a gradient-boosting algorithm that uses decision trees as base
learners; it is known for its high accuracy, interpretability, scalability,
built-in feature importance measures, and robustness to noisy or incomplete
data. Implementing XGBoost in mlpack is a natural next step for strengthening
the performance of its ensemble trees, which is why I believe it would be a
valuable contribution to the mlpack community.

In summary, my proposal includes the following:
● Implementing a random forest regressor and adding tests
● Parallelizing the decision tree, random forest, and XGBoost code with OpenMP
● Adding bindings for the decision tree, random forest, and XGBoost
● Adding an XGBoost classifier and regressor, along with several split methods
and loss functions (see the interface sketch after this list)
● Adding tutorials and sample code snippets
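
To give a concrete idea of what I have in mind, below is a minimal sketch of
how the proposed classifier might be used. The XGBoost class, its template
parameter, and its constructor arguments are all hypothetical (nothing here
exists in mlpack yet); the sketch simply mirrors the style of mlpack's
existing tree-based learners such as RandomForest.

    #include <mlpack.hpp>

    using namespace mlpack;

    int main()
    {
      // NOTE: the XGBoost class used here does not exist in mlpack yet;
      // this is only a hypothetical sketch of the interface I plan to
      // propose, modeled on mlpack's existing RandomForest API.
      arma::mat trainData(10, 1000, arma::fill::randu); // 10 dims, 1000 points.
      arma::Row<size_t> labels(1000);
      for (size_t i = 0; i < labels.n_elem; ++i)
        labels[i] = i % 2; // Dummy binary labels.

      // Proposed constructor: data, labels, number of classes, number of
      // boosting rounds (names and defaults are placeholders).
      XGBoost<> xgb(trainData, labels, 2, 50);

      // Proposed prediction interface, matching other mlpack classifiers.
      arma::mat testData(10, 100, arma::fill::randu);
      arma::Row<size_t> predictions;
      xgb.Classify(testData, predictions);
    }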

I believe that, with my skills and experience, I can make significant
contributions to mlpack and enhance the performance of its ensemble trees
through an XGBoost implementation.
Thank you for considering my proposal for the GSoC project.

Best regards,
Adarsh Santoria
GitHub: https://github.com/AdarshSantoria