Creates settings for the Estimator, which takes a model and trains it.
setEstimator(
learningRate = "auto",
weightDecay = 0,
batchSize = 512,
epochs = 30,
device = "cpu",
optimizer = torch$optim$AdamW,
scheduler = list(fun = torch$optim$lr_scheduler$ReduceLROnPlateau, params =
list(patience = 1)),
criterion = torch$nn$BCEWithLogitsLoss,
earlyStopping = list(useEarlyStopping = TRUE, params = list(patience = 4)),
metric = "auc",
accumulationSteps = NULL,
seed = NULL
)
learningRate: what learning rate to use
weightDecay: how much weight decay to use
batchSize: what batch size to use
epochs: how many epochs to train for
device: what device to train on; can be a string or a function that evaluates to the device at runtime
optimizer: which optimizer to use
scheduler: which learning rate scheduler to use
criterion: which loss function to use
earlyStopping: whether early stopping should be used, which stops training if your metric is not improving
metric: either `auc`, `loss`, or a custom metric to use. This is the metric used by the scheduler and early stopping. A custom metric needs to be a list with a function `fun`, a `mode` (either `min` or `max`), and a `name`; `fun` must be a function that takes predictions and labels and outputs a score.
accumulationSteps: how many steps to accumulate gradients over before updating the weights; can also be a function that is evaluated at runtime
seed: seed used to initialize the weights of the model
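As a sketch, here is what a call combining a runtime device function and a custom metric could look like. This assumes the surrounding package setup where `torch` is a reticulate handle to PyTorch; the metric function shown is a hypothetical accuracy-style score, not a helper provided by the package.

```r
# Hedged example: assumes `torch` is available via reticulate, as in the
# usage block above. The custom metric below is hypothetical.
estimatorSettings <- setEstimator(
  learningRate = 3e-4,
  weightDecay = 1e-5,
  batchSize = 256,
  epochs = 20,
  # a function evaluated at runtime, so the same settings work on
  # machines with or without a GPU
  device = function() {
    if (torch$cuda$is_available()) "cuda" else "cpu"
  },
  # custom metric: a list with `fun`, `mode`, and `name`
  metric = list(
    fun = function(prediction, labels) {
      # any function of predictions and labels that returns a score
      mean((prediction > 0.5) == labels)
    },
    mode = "max",  # higher is better for this score
    name = "accuracy"
  )
)
```

Because `metric` drives both the scheduler and early stopping, `mode` must match the direction of improvement: `max` for scores like AUC, `min` for losses.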