This page contains information about the cv_MI method, which combines Multiple Imputation with Cross-validation for the validation of logistic prediction models. This cross-validation method is based on the papers of Mertens BJ and Miles A. The cv_MI method is implemented in the function psfmi_perform. An explanation and examples of how to use the method can be found below.
With this method, imputation is implemented as part of the cross-validation procedure. Within each cross-validation fold, imputation is done once. By repeating this process over multiple imputation runs, multiply imputed training and test sets are generated. Model performance is evaluated on the training sets and tested on the corresponding test sets. The method can be combined with backward selection in the training set, followed by testing the performance in the test set. The method can only be applied when the outcome data is complete, and the original data, which contains the missing values, has to be included. A schematic code sketch is given after the figure below.
How this method works is visualized in the Figure below.
Figure 2.1: Schematic overview of the cv_MI method
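The procedure can be summarized in a few lines of code. The sketch below only illustrates the cv_MI idea and is not the psfmi implementation; the function name cv_mi_sketch, the use of mice for the single within-fold imputation and the AUC as performance measure are assumptions made for this example.
library(mice)
library(pROC)
# Illustrative sketch of cv_MI: within each fold the incomplete training and
# test parts are imputed once, a logistic model is fitted on the training part
# and evaluated on the test part; this is repeated over nimp_cv imputation runs.
cv_mi_sketch <- function(data_orig, formula, outcome, folds = 5, nimp_cv = 3) {
  auc_runs <- numeric(0)
  for (run in seq_len(nimp_cv)) {                       # imputation runs
    fold_id <- sample(rep(seq_len(folds), length.out = nrow(data_orig)))
    for (k in seq_len(folds)) {                         # cross-validation folds
      train <- data_orig[fold_id != k, ]
      test  <- data_orig[fold_id == k, ]
      train_imp <- complete(mice(train, m = 1, printFlag = FALSE))  # impute once
      test_imp  <- complete(mice(test,  m = 1, printFlag = FALSE))  # impute once
      fit  <- glm(formula, data = train_imp, family = binomial)
      pred <- predict(fit, newdata = test_imp, type = "response")
      auc_runs <- c(auc_runs,
                    as.numeric(roc(test_imp[[outcome]], pred, quiet = TRUE)$auc))
    }
  }
  mean(auc_runs)   # test performance pooled over folds and imputation runs
}
# e.g. cv_mi_sketch(lbp_orig, Chronic ~ Pain + Tampascale, outcome = "Chronic")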
To run the cv_MI method use:
library(psfmi)
pool_lr <- psfmi_lr(data = lbpmilr, formula = Chronic ~ rcs(Pain, 3) + JobDemands + Tampascale +
                      factor(Satisfaction) + Smoking + factor(Satisfaction) * rcs(Pain, 3),
                    p.crit = 0.05, direction = "BW", nimp = 5, impvar = "Impnr",
                    method = "D1")
## Removed at Step 1 is - JobDemands
## Removed at Step 2 is - Smoking
## Removed at Step 3 is - rcs(Pain,3)*factor(Satisfaction)
## Removed at Step 4 is - Tampascale
##
## Selection correctly terminated,
## No more variables removed from the model
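The psfmi_lr call pools the logistic regression model over the 5 imputed datasets (identified by the Impnr variable) and performs backward selection with the pooled D1 (multivariate Wald) test. As background, the sketch below shows plain Rubin's rules pooling of the coefficients of a simplified model in the lbpmilr data; it is not the D1 procedure used by psfmi_lr, and the predictors are chosen only for illustration.
library(psfmi)
# Hedged sketch: Rubin's rules pooling of coefficients across the imputed
# datasets in lbpmilr (long format, imputation indicator Impnr).
fits  <- lapply(split(lbpmilr, lbpmilr$Impnr), function(d)
  glm(Chronic ~ Pain + Tampascale + JobDemands, data = d, family = binomial))
coefs <- sapply(fits, coef)                        # p x m matrix of estimates
vars  <- sapply(fits, function(f) diag(vcov(f)))   # within-imputation variances
qbar  <- rowMeans(coefs)                           # pooled estimates
ubar  <- rowMeans(vars)                            # average within-imputation variance
b     <- apply(coefs, 1, var)                      # between-imputation variance
total <- ubar + (1 + 1/ncol(coefs)) * b            # total variance (Rubin's rules)
round(cbind(estimate = qbar, se = sqrt(total)), 3)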
set.seed(200)
res_cv <- psfmi_perform(pool_lr, val_method = "cv_MI", data_orig = lbp_orig, folds = 5,
                        p.crit = 1, BW = FALSE, nimp_cv = 3, miceImp = miceImp, printFlag = FALSE)
##
## Imp run 1
##
## fold 1
##
## fold 2
##
## fold 3
##
## fold 4
##
## fold 5
##
## Imp run 2
##
## fold 1
##
## fold 2
##
## fold 3
##
## fold 4
##
## fold 5
##
## Imp run 3
##
## fold 1
##
## fold 2
##
## fold 3
##
## fold 4
##
## fold 5
res_cv
## $pool_stats
## Train Test
## AUC 0.8819489 0.8288000
## Scaled Brier 0.4607886 0.3352870
## R2 0.5521409 0.4561184
##
## $LP_val
## (Intercept) lp_test
## 0.0686546 0.9879449
##
## $auc_test
## 95% Low AUC 95% Up
## AUC (logit) 0.7458 0.8288 0.8887
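The difference between the pooled performance in the training and in the test folds gives an impression of the optimism of the apparent performance. A minimal sketch, assuming that res_cv$pool_stats is the matrix printed above (rows AUC, Scaled Brier and R2; columns Train and Test):
# Hedged sketch: assumes res_cv$pool_stats is the printed matrix with
# columns "Train" and "Test".
perf     <- res_cv$pool_stats
optimism <- perf[, "Train"] - perf[, "Test"]   # apparent minus cross-validated
round(optimism, 3)
# for the AUC this is roughly 0.882 - 0.829 = 0.053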
To run the cv_MI method including backward selection (BW), use:
library(psfmi)
pool_lr <- psfmi_lr(data = lbpmilr, formula = Chronic ~ rcs(Pain, 3) + JobDemands + Tampascale +
                      factor(Satisfaction) + Smoking + factor(Satisfaction) * rcs(Pain, 3),
                    p.crit = 0.05, direction = "BW", nimp = 5, impvar = "Impnr",
                    method = "D1")
## Removed at Step 1 is - JobDemands
## Removed at Step 2 is - Smoking
## Removed at Step 3 is - rcs(Pain,3)*factor(Satisfaction)
## Removed at Step 4 is - Tampascale
##
## Selection correctly terminated,
## No more variables removed from the model
set.seed(200)
res_cv <- psfmi_perform(pool_lr, val_method = "cv_MI", data_orig = lbp_orig, folds = 5,
                        p.crit = 0.05, BW = TRUE, nimp_cv = 3, miceImp = miceImp, printFlag = FALSE)
##
## Imp run 1
##
## fold 1
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 2
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 3
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 4
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 5
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## Imp run 2
##
## fold 1
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 2
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 3
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 4
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 5
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## Imp run 3
##
## fold 1
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 2
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 3
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 4
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
##
## fold 5
## Removed at Step 1 is - factor(Satisfaction)
##
## Selection correctly terminated,
## No more variables removed from the model
res_cv
## $pool_stats
## Train Test
## AUC 0.8514601 0.8247000
## Scaled Brier 0.3930935 0.3182324
## R2 0.4708196 0.4378683
##
## $LP_val
## (Intercept) lp_test
## 0.002082868 1.145024428
##
## $auc_test
## 95% Low AUC 95% Up
## AUC (logit) 0.7447 0.8247 0.8835
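The $LP_val element contains the calibration intercept and slope of the linear predictor in the test folds. A slope near 1, as in both runs above (0.99 and 1.15), indicates good agreement between predicted and observed risks; a slope clearly below 1 would point to overfitting and a need for shrinkage. A hedged sketch of extracting these values, assuming the printed structure above:
# Hedged sketch: assumes res_cv$LP_val is a named vector holding the
# calibration intercept ("(Intercept)") and slope ("lp_test").
cal <- res_cv$LP_val
round(cal, 3)
cal_slope <- unname(cal["lp_test"])   # values near 1 suggest little shrinkage is needed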