Create the settings for an AdaBoost model that uses a Python scikit-learn DecisionTreeClassifier as the base estimator

```
setAdaBoost(
  nEstimators = list(10, 50, 200),
  learningRate = list(1, 0.5, 0.1),
  algorithm = list("SAMME.R"),
  seed = sample(1e+06, 1)
)
```

## Arguments

- nEstimators
(list) The maximum number of estimators at which boosting is terminated. In the case of a perfect fit, the learning procedure is stopped early.

- learningRate
(list) Weight applied to each classifier at each boosting iteration. A higher learning rate increases the contribution of each classifier. There is a trade-off between the learningRate and nEstimators parameters.

- algorithm
(list) If "SAMME.R", use the SAMME.R real boosting algorithm; the base estimator must support the calculation of class probabilities. If "SAMME", use the SAMME discrete boosting algorithm. The SAMME.R algorithm typically converges faster than SAMME, achieving a lower test error with fewer boosting iterations.

- seed
(integer) The seed for the random number generator used when fitting the model.
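Because each argument accepts a list of candidate values, the hyperparameter search evaluates the cross-product of those lists. A minimal Python sketch of how such a grid expands (the snake_case names mirror scikit-learn's AdaBoostClassifier parameters, which these settings correspond to; the exact search logic used internally is an assumption):

```python
from itertools import product

# Candidate values, matching the defaults shown in the usage block above
n_estimators = [10, 50, 200]
learning_rate = [1, 0.5, 0.1]
algorithm = ["SAMME.R"]

# The tuning grid is the cross-product of all candidate lists:
# 3 x 3 x 1 = 9 candidate models to evaluate
grid = [
    {"n_estimators": n, "learning_rate": lr, "algorithm": a}
    for n, lr, a in product(n_estimators, learning_rate, algorithm)
]
```

Supplying longer lists therefore grows the number of fitted models multiplicatively, which is worth keeping in mind for expensive settings such as nEstimators.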

## Examples

```
if (FALSE) {
model.adaBoost <- setAdaBoost(
  nEstimators = list(10, 50, 200),
  learningRate = list(1, 0.5, 0.1),
  algorithm = list("SAMME.R"),
  seed = sample(1000000, 1)
)
}
```
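The trade-off between learningRate and nEstimators noted above can be illustrated with the SAMME stage-weight formula: each boosting iteration's classifier receives a weight scaled by the learning rate, so a lower rate shrinks every stage's contribution and typically requires more estimators to reach comparable ensemble strength. A minimal sketch (pure Python, for illustration only; not the package's internal code):

```python
import math

def samme_stage_weight(err, n_classes, learning_rate):
    """SAMME weight for one boosting stage with weighted error `err`.

    The learning rate scales the stage's contribution linearly, which is
    why a smaller learningRate usually needs a larger nEstimators.
    """
    return learning_rate * (math.log((1 - err) / err) + math.log(n_classes - 1))

# Same stage error, two learning rates: the low-rate stage contributes
# one tenth as much to the final ensemble.
w_full = samme_stage_weight(0.3, 2, 1.0)
w_low = samme_stage_weight(0.3, 2, 0.1)
```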