• Stars: 6
• Rank: 2,527,443 (Top 50%)
• Language: R
• License: ISC License
• Created: about 9 years ago
• Updated: almost 9 years ago

Repository Details

I did my individual project (dissertation) on ensemble methods. I first carried out a background study of different ensemble methods and then implemented boosting, AdaBoost, bagging, and random forest techniques on top of underlying machine learning algorithms. I used boosting to improve the performance of weak learners such as decision stumps, implemented bagging for decision trees (for both regression and classification problems) and for a KNN classifier, and used random forests for classification trees. I also implemented a particular boosting algorithm, AdaBoost, on logistic regression using different threshold values.

I then plotted graphs such as the error rate as a function of the number of boosting, bagging, and random forest iterations, compared the results of bagging with those of boosting, and analysed the performance of each classifier before and after applying the ensemble methods. To estimate the performance of the ensemble techniques I used model evaluation methods such as cross-validation, MSE, PRSS, ROC curves, confusion matrices, and out-of-bag error estimation.
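
As a rough illustration of the workflow described above, here is a minimal R sketch using the adabag and randomForest packages on the built-in iris data: AdaBoost on decision stumps, bagging of classification trees, a random forest, an error-rate-versus-iterations plot, a confusion matrix, and an out-of-bag error estimate. The packages, data set, and parameter values are illustrative assumptions, not the code of this repository.

    ## Minimal sketch (not this repository's own code) of boosting, bagging,
    ## and a random forest, with the evaluation steps described above.
    library(adabag)        # boosting() and bagging() for classification trees
    library(randomForest)  # randomForest() with out-of-bag error estimation
    library(rpart)         # rpart.control() to force decision stumps

    set.seed(1)
    idx   <- sample(nrow(iris), 100)
    train <- iris[idx, ]
    test  <- iris[-idx, ]

    ## AdaBoost with decision stumps (maxdepth = 1) as the weak learner
    ada <- boosting(Species ~ ., data = train, mfinal = 50,
                    control = rpart.control(maxdepth = 1))

    ## Bagging of full classification trees
    bag <- bagging(Species ~ ., data = train, mfinal = 50)

    ## Random forest; err.rate stores the out-of-bag error after each tree
    rf <- randomForest(Species ~ ., data = train, ntree = 200)

    ## Test error as a function of the number of boosting/bagging iterations
    plot(errorevol(ada, test)$error, type = "l", col = "red",
         xlab = "Iterations", ylab = "Test error")
    lines(errorevol(bag, test)$error, col = "blue")
    legend("topright", c("AdaBoost", "Bagging"),
           col = c("red", "blue"), lty = 1)

    ## Confusion matrix and test error of the boosted model
    pred <- predict(ada, newdata = test)
    pred$confusion
    pred$error

    ## Out-of-bag error estimate from the random forest
    rf$err.rate[rf$ntree, "OOB"]

The two errorevol() curves give a direct way to compare how quickly bagging and boosting drive down the test error as iterations are added, which is the kind of comparison described above.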
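
The AdaBoost-on-logistic-regression experiment is not covered by those packages, so here is a small from-scratch sketch of discrete AdaBoost using a thresholded logistic regression as the weak learner. The simulated data, the 0.5 probability threshold, and the number of rounds are illustrative assumptions; the threshold is the value one would vary across runs.

    ## From-scratch sketch (not this repository's code): discrete AdaBoost
    ## with weighted logistic regression, thresholded into a {-1, +1} classifier.
    set.seed(1)
    n <- 200
    x <- matrix(rnorm(n * 2), ncol = 2)
    y <- ifelse(x[, 1] + x[, 2] + rnorm(n) > 0, 1, -1)  # labels in {-1, +1}
    dat <- data.frame(x1 = x[, 1], x2 = x[, 2],
                      y01 = ifelse(y == 1, 1, 0))       # 0/1 coding for glm()

    M         <- 20            # boosting rounds (illustrative)
    threshold <- 0.5           # probability cut-off for the weak learner
    w         <- rep(1 / n, n) # observation weights
    alpha     <- numeric(M)
    models    <- vector("list", M)

    for (m in 1:M) {
      ## Weighted logistic regression as the weak learner
      fit <- suppressWarnings(
        glm(y01 ~ x1 + x2, data = dat, family = binomial, weights = w))
      p    <- predict(fit, type = "response")
      pred <- ifelse(p > threshold, 1, -1)

      err <- sum(w * (pred != y)) / sum(w)      # weighted training error
      err <- min(max(err, 1e-10), 1 - 1e-10)    # keep away from 0 and 1
      alpha[m] <- 0.5 * log((1 - err) / err)    # weight of this learner

      ## Re-weight observations: misclassified points gain weight
      w <- w * exp(-alpha[m] * ifelse(pred == y, 1, -1))
      w <- w / sum(w)

      models[[m]] <- fit
    }

    ## Ensemble prediction: sign of the alpha-weighted vote
    scores <- Reduce(`+`, lapply(1:M, function(m) {
      p <- predict(models[[m]], newdata = dat, type = "response")
      alpha[m] * ifelse(p > threshold, 1, -1)
    }))
    mean(sign(scores) == y)  # training accuracy of the boosted ensemble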