createControl creates a Cyclops control object for use with fitCyclopsModel.
createControl(
maxIterations = 1000,
tolerance = 1e-06,
convergenceType = "gradient",
cvType = "auto",
fold = 10,
lowerLimit = 0.01,
upperLimit = 20,
gridSteps = 10,
cvRepetitions = 1,
minCVData = 100,
noiseLevel = "silent",
threads = 1,
seed = NULL,
resetCoefficients = FALSE,
startingVariance = -1,
useKKTSwindle = FALSE,
tuneSwindle = 10,
selectorType = "auto",
initialBound = 2,
maxBoundCount = 5,
algorithm = "ccd"
)

maxIterations      Integer: maximum number of iterations Cyclops will attempt before returning a failed-to-converge error
tolerance          Numeric: maximum relative change in the convergence criterion between successive iterations needed to declare convergence
convergenceType    String: name of the convergence criterion to employ (described in more detail below)
cvType             String: name of the cross-validation search.
                   Option "auto" selects an auto-search following BBR.
                   Option "grid" selects a grid-search cross-validation (see the sketch after this argument list).
fold               Numeric: number of random folds to employ in cross-validation
lowerLimit         Numeric: lower prior-variance limit for the grid-search
upperLimit         Numeric: upper prior-variance limit for the grid-search
gridSteps          Numeric: number of steps in the grid-search
cvRepetitions      Numeric: number of repetitions of X-fold cross-validation
minCVData          Numeric: minimum number of data points required for cross-validation
noiseLevel         String: level of Cyclops screen output ("silent", "quiet", "noisy")
threads            Numeric: number of CPU threads to employ in cross-validation; default = 1 (auto = -1)
seed               Numeric: random number generator seed. A NULL value sets the seed via Sys.time().
resetCoefficients  Logical: reset all coefficients to 0 between model fits under cross-validation
startingVariance   Numeric: starting variance for auto-search cross-validation; default = -1 (use an estimate based on the data)
useKKTSwindle      Logical: use the Karush-Kuhn-Tucker conditions to limit the search
tuneSwindle        Numeric: size multiplier for the active set
selectorType       String: name of the exchangeable sampling unit.
                   Option "byPid" selects entire strata.
                   Option "byRow" selects single rows.
                   If set to "auto", "byRow" will be used for all models except conditional models where
                   the average number of rows per stratum is smaller than the number of strata.
initialBound       Numeric: starting trust-region size
maxBoundCount      Numeric: maximum number of attempts to decrease the initial trust-region size
algorithm          String: name of the fitting algorithm to employ; default is "ccd"
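When cvType = "grid", the lowerLimit, upperLimit, and gridSteps arguments define the prior-variance grid that is searched. A minimal sketch of a grid-search control; the specific values are illustrative, not recommendations:

# Illustrative grid-search configuration; values chosen only for demonstration
gridControl <- createControl(cvType = "grid",
                             lowerLimit = 0.01,   # smallest prior variance to evaluate
                             upperLimit = 20,     # largest prior variance to evaluate
                             gridSteps = 10,      # number of grid points between the limits
                             fold = 10,           # 10-fold cross-validation
                             cvRepetitions = 2,   # repeat the random fold assignment twice
                             noiseLevel = "quiet")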
Todo: Describe convergence types
A Cyclops control object of class inheriting from "cyclopsControl" for use with fitCyclopsModel.
# Generate some simulated data:
sim <- simulateCyclopsData(nstrata = 1, nrows = 1000, ncovars = 2, eCovarsPerRow = 0.5,
model = "poisson")
#> Sparseness = 75.6 %
cyclopsData <- convertToCyclopsData(sim$outcomes, sim$covariates, modelType = "pr",
addIntercept = TRUE)
#> Sorting covariates by covariateId and rowId
# Define the prior and control objects to use cross-validation for finding the
# optimal hyperparameter:
prior <- createPrior("laplace", exclude = 0, useCrossValidation = TRUE)
control <- createControl(cvType = "auto", noiseLevel = "quiet")
# Fit the model
fit <- fitCyclopsModel(cyclopsData, prior = prior, control = control)
#> Using cross-validation selector type byRow
#> Performing 10-fold cross-validation [seed = 1698789247] with data partitions of sizes 100 100 100 100 100 100 100 100 100 100
#> Using 1 thread(s)
#> Starting var = 0.244 (default)
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #1 Rep #1 pred log like = 103.184
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #2 Rep #1 pred log like = 130.249
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #3 Rep #1 pred log like = 160.028
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #4 Rep #1 pred log like = 151.655
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #5 Rep #1 pred log like = 175.354
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #6 Rep #1 pred log like = 161.803
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #7 Rep #1 pred log like = 130.623
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #8 Rep #1 pred log like = 137.887
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #9 Rep #1 pred log like = 173.734
#> Running at Laplace(2.86299) None Grid-point #1 at 0.244 Fold #10 Rep #1 pred log like = 190.477
#> AvgPred = 151.499 with stdev = 24.8369
#> Completed at 0.244
#> Next point at 2.44 with value 0 and continue = 1
#> search[ 0.244 ] = 151.499(24.8369)
#>
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #1 Rep #1 pred log like = 103.166
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #2 Rep #1 pred log like = 130.248
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #3 Rep #1 pred log like = 160.03
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #4 Rep #1 pred log like = 151.647
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #5 Rep #1 pred log like = 175.309
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #6 Rep #1 pred log like = 161.801
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #7 Rep #1 pred log like = 130.578
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #8 Rep #1 pred log like = 137.837
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #9 Rep #1 pred log like = 173.718
#> Running at Laplace(0.905357) None Grid-point #2 at 2.44 Fold #10 Rep #1 pred log like = 190.449
#> AvgPred = 151.478 with stdev = 24.8369
#> Completed at 2.44
#> Next point at 0.0244 with value 0 and continue = 1
#> search[ 0.244 ] = 151.499(24.8369)
#> search[ 2.44 ] = 151.478(24.8369)
#>
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #1 Rep #1 pred log like = 103.24
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #2 Rep #1 pred log like = 130.205
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #3 Rep #1 pred log like = 160.006
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #4 Rep #1 pred log like = 151.671
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #5 Rep #1 pred log like = 175.49
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #6 Rep #1 pred log like = 161.794
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #7 Rep #1 pred log like = 130.755
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #8 Rep #1 pred log like = 138.031
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #9 Rep #1 pred log like = 173.692
#> Running at Laplace(9.05357) None Grid-point #3 at 0.0244 Fold #10 Rep #1 pred log like = 190.557
#> AvgPred = 151.544 with stdev = 24.8316
#> Completed at 0.0244
#> Next point at 0.00244 with value 0 and continue = 1
#> search[ 0.0244 ] = 151.544(24.8316)
#> search[ 0.244 ] = 151.499(24.8369)
#> search[ 2.44 ] = 151.478(24.8369)
#>
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #1 Rep #1 pred log like = 103.378
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #2 Rep #1 pred log like = 130.133
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #3 Rep #1 pred log like = 159.943
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #4 Rep #1 pred log like = 151.618
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #5 Rep #1 pred log like = 175.676
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #6 Rep #1 pred log like = 161.739
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #7 Rep #1 pred log like = 131.127
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #8 Rep #1 pred log like = 137.975
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #9 Rep #1 pred log like = 173.61
#> Running at Laplace(28.6299) None Grid-point #4 at 0.00244 Fold #10 Rep #1 pred log like = 190.596
#> AvgPred = 151.579 with stdev = 24.7953
#> Completed at 0.00244
#> Next point at 0.000244 with value 0 and continue = 1
#> search[ 0.00244 ] = 151.579(24.7953)
#> search[ 0.0244 ] = 151.544(24.8316)
#> search[ 0.244 ] = 151.499(24.8369)
#> search[ 2.44 ] = 151.478(24.8369)
#>
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #1 Rep #1 pred log like = 103.378
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #2 Rep #1 pred log like = 130.133
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #3 Rep #1 pred log like = 159.943
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #4 Rep #1 pred log like = 151.618
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #5 Rep #1 pred log like = 175.676
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #6 Rep #1 pred log like = 161.739
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #7 Rep #1 pred log like = 131.206
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #8 Rep #1 pred log like = 137.975
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #9 Rep #1 pred log like = 173.61
#> Running at Laplace(90.5357) None Grid-point #5 at 0.000244 Fold #10 Rep #1 pred log like = 190.596
#> AvgPred = 151.587 with stdev = 24.7888
#> Completed at 0.000244
#> Next point at 2.44e-05 with value 0 and continue = 1
#> search[ 0.000244 ] = 151.587(24.7888)
#> search[ 0.00244 ] = 151.579(24.7953)
#> search[ 0.0244 ] = 151.544(24.8316)
#> search[ 0.244 ] = 151.499(24.8369)
#> search[ 2.44 ] = 151.478(24.8369)
#>
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #1 Rep #1 pred log like = 103.378
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #2 Rep #1 pred log like = 130.133
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #3 Rep #1 pred log like = 159.943
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #4 Rep #1 pred log like = 151.618
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #5 Rep #1 pred log like = 175.676
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #6 Rep #1 pred log like = 161.739
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #7 Rep #1 pred log like = 131.206
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #8 Rep #1 pred log like = 137.975
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #9 Rep #1 pred log like = 173.61
#> Running at Laplace(286.299) None Grid-point #6 at 2.44e-05 Fold #10 Rep #1 pred log like = 190.596
#> AvgPred = 151.587 with stdev = 24.7888
#> Completed at 2.44e-05
#> Next point at 1.62865e-05 with value 151.591 and continue = 0
#> search[ 2.44e-05 ] = 151.587(24.7888)
#> search[ 0.000244 ] = 151.587(24.7888)
#> search[ 0.00244 ] = 151.579(24.7953)
#> search[ 0.0244 ] = 151.544(24.8316)
#> search[ 0.244 ] = 151.499(24.8369)
#> search[ 2.44 ] = 151.478(24.8369)
#>
#>
#> Maximum predicted log likelihood (151.591) estimated at:
#> 1.62865e-05 (variance)
#> 350.43 (lambda)
#>
#> Fitting model at optimal hyperparameter
#> Using prior: Laplace(350.43) None
# Find out what the optimal hyperparameter was:
getHyperParameter(fit)
#> [1] 1.62865e-05
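# Illustrative check: for the Laplace prior, the reported lambda corresponds to the
# selected variance via lambda = sqrt(2 / variance), which should reproduce the
# "350.43 (lambda)" value printed above
sqrt(2 / getHyperParameter(fit))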
# Extract the current log-likelihood and coefficients
logLik(fit)
#> 'log Lik.' -1858.348 (df=3)
coef(fit)
#> (Intercept) 1 2
#> -4.261846 0.000000 0.000000
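# Illustrative: this is a Poisson model, so exponentiating the coefficients gives
# rate (ratio) estimates; the Laplace prior has shrunk both covariates to exactly zero,
# and exp(-4.261846) is roughly 0.0141 for the baseline rate
exp(coef(fit))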
# We can only retrieve the confidence interval for unregularized coefficients:
confint(fit, c(0))
#> Using 1 thread(s)
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#>
#> Warning: BLR gradient is ill-conditioned
#> Enforcing convergence!
#> covariate 2.5 % 97.5 % evaluations
#> (Intercept) 0 -4.295195 -4.228855 24
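Because the folds are partitioned at random (note the time-based seed reported in the output above), repeated runs can select slightly different hyperparameters. A minimal sketch of a reproducible fit, reusing the cyclopsData and prior objects from the example above; the specific seed value is arbitrary:

# Fix the random number generator seed so the cross-validation fold assignment,
# and hence the selected hyperparameter, is repeatable across runs
reproducibleControl <- createControl(cvType = "auto", noiseLevel = "quiet",
                                     seed = 123, threads = 1)
refit <- fitCyclopsModel(cyclopsData, prior = prior, control = reproducibleControl)
getHyperParameter(refit)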