R/GradientBoostingMachine.R
setGradientBoostingMachine.Rd
Create settings for a gradient boosting machine model using the gbm_xgboost implementation
The number of trees to build
The number of CPU threads to use (typically the number of available cores)
Training stops if performance does not improve after earlyStopRound additional trees (this helps prevent overfitting)
Maximum depth of each tree - larger values lead to slower model training
Minimum sum of instance weights in a child node - larger values are more conservative
The boosting learning rate
Controls the weight of the positive class in the loss function - useful for imbalanced classes
L2 regularization on weights - larger is more conservative
L1 regularization on weights - larger is more conservative
An optional seed used when training the final model (for reproducibility)
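A minimal usage sketch follows, based only on the parameter descriptions above. The argument names (ntrees, nthread, earlyStopRound, maxDepth, minChildWeight, learnRate, scalePosWeight, lambda, alpha, seed) and the package providing setGradientBoostingMachine are assumptions inferred from these descriptions and from the xgboost conventions they mirror; check the package help for the exact signature and defaults.

# Sketch: create gradient boosting machine settings
# (argument names below are assumed from the parameter descriptions
#  and may differ from the actual function signature)
gbmSettings <- setGradientBoostingMachine(
  ntrees = 300,          # number of trees to build
  nthread = 4,           # number of CPU threads to use
  earlyStopRound = 25,   # stop if no improvement after this many trees
  maxDepth = 6,          # maximum depth of each tree
  minChildWeight = 1,    # minimum sum of instance weights in a child node
  learnRate = 0.1,       # boosting learning rate
  scalePosWeight = 1,    # weight of the positive class in the loss
  lambda = 1,            # L2 regularization on weights
  alpha = 0,             # L1 regularization on weights
  seed = 42              # seed used when training the final model
)

The returned settings object is typically passed on to the package's model-fitting routine rather than used directly; several of the arguments (for example ntrees, maxDepth, learnRate) may also accept vectors of candidate values if the package performs a hyperparameter grid search, though that behaviour is not stated in this documentation.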