H20-learnup
Built a model to predict wind turbine speed. Engineered features such as speed, direction, and a t-SNE embedding, used decision trees to combine the date and month columns, and used glmnet for feature selection. A bag of 10 XGBoost models achieved an RMSE (root mean squared error) of 0.17321 on the public leaderboard, but overfit on the private leaderboard despite good CV scores. A great learning experience overall; thanks to Analytics Vidhya for conducting this competition.
Rank: 12th / 109 participants
Team Name: Aegis
Team Members: Aman Kapoor, Anchal Gupta
redhat_kaggle
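The "bag of 10 XGBoost models" idea from the wind turbine entry above is just averaging the predictions of several models trained with different bootstrap resamples and seeds. A minimal sketch, using scikit-learn's GradientBoostingRegressor as a stand-in for xgboost and synthetic data (all names and data here are illustrative, not the author's actual pipeline):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

# Synthetic regression data standing in for the turbine features.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] * 2 + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)
X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

# Train a bag of 10 models on bootstrap resamples, varying the seed.
preds = []
for seed in range(10):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))  # bootstrap resample
    model = GradientBoostingRegressor(random_state=seed)
    model.fit(X_tr[idx], y_tr[idx])
    preds.append(model.predict(X_te))

# The bagged prediction is the simple average across the 10 models.
bagged = np.mean(preds, axis=0)
rmse = mean_squared_error(y_te, bagged) ** 0.5
print(f"bagged RMSE: {rmse:.3f}")
```

Averaging this way reduces variance, which is exactly why a good CV score can still overfit: the bag only averages out model noise, not leakage or a distribution shift between public and private test splits.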
Kaggle Project Red Hat
Smart--Recruits
Public Leaderboard: 0.684
sbs
Sequential backward selection (SBS) is an algorithm that starts with n features and evaluates the model's accuracy on subsets of d features, where n > d, reducing noise by removing the irrelevant features. (*Chapter 4*: dimensionality reduction techniques)
ultimate-studenthunt
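The sequential backward selection procedure described above can be sketched in a few lines: start from all n features and greedily drop the one whose removal hurts the score least, until d features remain. The scoring function here is a toy stand-in (any cross-validated model score would do in practice):

```python
from itertools import combinations

def sbs(features, score_fn, d):
    """Sequential backward selection: start with all n features and
    repeatedly drop the feature whose removal hurts the score least,
    until only d features remain."""
    current = list(features)
    while len(current) > d:
        best_subset, best_score = None, float("-inf")
        # Evaluate every subset obtained by dropping exactly one feature.
        for subset in combinations(current, len(current) - 1):
            s = score_fn(subset)
            if s > best_score:
                best_subset, best_score = subset, s
        current = list(best_subset)
    return current

# Toy score: pretend features "a" and "c" carry almost all the signal.
weights = {"a": 0.6, "b": 0.05, "c": 0.3, "d": 0.02}
score = lambda subset: sum(weights[f] for f in subset)

selected = sbs(["a", "b", "c", "d"], score, 2)
print(selected)  # → ['a', 'c']
```

Because each pass evaluates n candidate subsets, SBS costs O(n^2) model fits overall, which is why it is usually paired with a cheap scoring model.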
M5-Competion
Hi.
Knockoctober
MedCamp is an organisation that ran health camps for people with a poor work-life balance. Our aim was to predict the target variable from attributes such as health camp details, patient details, and data from health camps 1, 2, and 3. We did predictive modeling by extracting features from dates and other attributes such as LinkedIn shares, Facebook shares, age, income, and employer details. The final model comprised ~13 features with a blend of logistic regression, decision trees, and XGBoost, which gave an AUC of ~0.81 on the public leaderboard and ~0.75 on the private leaderboard.
Final standing: 27th out of 286 participants
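The blend of logistic regression, decision trees, and XGBoost described above amounts to averaging each model's predicted probabilities and scoring the blend by AUC. A minimal sketch on synthetic data, with scikit-learn's GradientBoostingClassifier standing in for XGBoost (the data and model settings are illustrative, not the team's actual configuration):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Synthetic binary-classification data with ~13 features, as in the write-up.
X, y = make_classification(n_samples=600, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = X[:450], X[450:], y[:450], y[450:]

models = [
    LogisticRegression(max_iter=1000),
    DecisionTreeClassifier(max_depth=5, random_state=0),
    GradientBoostingClassifier(random_state=0),  # stand-in for XGBoost
]

# Blend = simple average of each model's positive-class probability.
probs = [m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1] for m in models]
blend = np.mean(probs, axis=0)

auc = roc_auc_score(y_te, blend)
print(f"blended AUC: {auc:.3f}")
```

Averaging probabilities works well for AUC because the blend only needs a good ranking of examples, not calibrated probabilities; weighted averages or rank-averaging are common refinements.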